When Googlebot crawls my site http://www.lastvj.com it stops at the index page. This started happening about a month ago; before that, it was crawling regularly and my site was showing up in search results. I have a sitemap and a robots.txt. What am I doing wrong?
Correct your robots.txt file:

User-agent: *
Disallow:
Sitemap: [absolute URL path to XML sitemap]
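If you want to sanity-check a robots.txt like that before uploading it, Python's standard `urllib.robotparser` module can parse it locally and tell you whether a given crawler is allowed. A minimal sketch (the example.com sitemap URL is a stand-in, not the poster's actual sitemap):

```python
from urllib.robotparser import RobotFileParser

# The rules from the post above, with a placeholder sitemap URL.
robots_txt = """User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
# parse() accepts the file contents as a list of lines,
# so no network request is needed.
parser.parse(robots_txt.splitlines())

# An empty "Disallow:" under "User-agent: *" allows everything,
# so Googlebot should be permitted to fetch any page.
print(parser.can_fetch("Googlebot", "http://www.example.com/"))  # True
```

If this prints False for a URL you expect to be crawled, the robots.txt itself is the problem rather than anything on Google's side.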
Thanks. I just did some work on the site the other day and must have uploaded the old robots.txt file by accident. Part of the problem is that Googlebot doesn't even bother looking at the robots.txt. Any tips?
It's still a good idea regardless, especially since having one (even if you don't use it) will cut down on the number of "false positives" in your server's error logs. The same goes for a sitemap.xml file and a favicon.ico file.
I have uploaded a screen cap of my crawl stats. It shows hits from Google but no successful hits on the robots.txt file. http://www.lastvj.com/crawlstats.jpg
Is there a chance that it has been sandboxed? I mean, absolutely no pages show up in Google. I was getting a lot of traffic from Google before May 12th, and then it just stopped.