I am looking to construct web pages in a way that search engines will crawl them best. I understand the basic tagging and on-page SEO keyword material. I am looking for something more in the way the code is constructed, or possible directives that can be given from the server, or more advanced programming, so that pages can be crawled as well as possible. Any links, references, guidelines, articles, and subjects on what I am looking for would be very helpful and much appreciated.
I don't think there is any single method or trick to get a website crawled faster. It depends on how important your website is and whether it has unique content. Don't use too much code or JavaScript in your pages; search engines may not crawl script-heavy content well.
I think a "perfect way" to code web pages for SEO doesn't exist. But there are a lot of little tricks that can help your web pages be indexed better, like reducing the weight of the page (delete everything that is unused in your CSS, compress CSS and JS files, etc.), putting your JS calls at the end of your pages (just before the closing </body> tag), and organizing your content so the most important parts come first in your code.
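For the compression part, here is a minimal sketch of server-side gzip, assuming an Apache server with mod_deflate enabled (the filename and the exact MIME type list are just illustrative):

```apache
# .htaccess sketch: compress text-based assets before sending them out.
# Assumes mod_deflate is available and enabled on the server.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Smaller responses mean the crawler spends less of its time budget per page, which is the same reason minifying the CSS and JS files themselves helps.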
What is the advantage of having the JS just before the closing body tag? I have heard this many times before, but I never retained what the exact advantage is.
By putting the JS calls just before the closing body tag, you let everything else in your web page (that is, the content) load first. A plain script tag blocks HTML parsing while it downloads and executes, so moving scripts to the bottom means the content is available sooner, and the crawl is a little faster.
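As a minimal sketch (the file names are placeholders): the stylesheet stays in the head so the page renders styled, while the script sits just before the closing body tag so the content above it is parsed first. On modern browsers, a `defer` attribute on a script in the head achieves a similar effect.

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- CSS stays in the head so the page renders with styles -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <h1>Main content first</h1>
  <p>Crawlers and users get this content before any script runs.</p>

  <!-- Scripts last: they don't block parsing of the content above -->
  <script src="app.js"></script>
</body>
</html>
```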
You should cover the basic on-page elements, including the title, meta keywords, meta description, an h1 heading for the main content, search engine submission, and a few more basic SEO tasks.
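In markup, those basic on-page elements look roughly like this (all the text values are placeholder examples):

```html
<head>
  <title>Primary Keyword - Site Name</title>
  <meta name="description" content="A short, unique summary of this page.">
  <meta name="keywords" content="keyword one, keyword two">
</head>
<body>
  <h1>Main Heading Containing the Primary Keyword</h1>
</body>
```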
Sure, of course! I wouldn't have given advice that I hadn't tried myself. Moreover, I'm applying this on all my web pages, so I know what I'm talking about.