I have a small WordPress blog that I hadn't checked for a while, and now I see that its Google traffic has dropped to zero. I can still find it in the listings if I use the site:zzzzz.com search method, though. So I checked my Google Webmaster Tools, and the crawler is getting 403 errors. I checked robots.txt and yes... it has been set to deny all. I'm 99% sure that someone has hacked into it somehow, possibly because I had some nice rankings for a few phrases that made money.

Anyway, I have been spending hours trying to fix this, but no matter what I do, "Fetch as Googlebot" in Webmaster Tools still returns the same 403 error. There was no robots.txt file on my site originally, but I found out that WordPress generates a virtual robots.txt. I tried updating to the latest WordPress version. I tried disabling all plugins. Neither fixed it. I tried installing the XML Sitemaps plugin, and that seemed to work... it produced the following virtual robots.txt:

---------------------
User-agent: *
Disallow:

Sitemap: http://www.zzzzzzzz.net/sitemap.xml.gz
---------------------

I even tried making my own robots.txt and uploading it, but the crawler still gets a 403 when trying to visit my site. I'm getting really frustrated here... hours and hours of searching for solutions and I've got nothing. Please help.

EDIT: Also, I'm getting a red X error mark when trying to add the new sitemap in Webmaster Tools, and my old sitemap (which was my RSS feed) is now X'd as well. From the looks of it I have been getting zero traffic from Google for 2-3 months, so it's not a temporary thing as far as I can tell.
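For what it's worth, you can confirm that the virtual robots.txt the plugin produced really does allow crawling by feeding those exact rules to Python's stdlib `urllib.robotparser`. A minimal sketch (the zzzzzzzz.net domain is just the placeholder from the post; an empty `Disallow:` means nothing is blocked):

```python
from urllib.robotparser import RobotFileParser

# The virtual robots.txt produced by the XML Sitemaps plugin
# (domain is the placeholder from the post above).
rules = """\
User-agent: *
Disallow:

Sitemap: http://www.zzzzzzzz.net/sitemap.xml.gz
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An empty "Disallow:" under "User-agent: *" permits everything,
# so Googlebot should be allowed to fetch any URL on the site.
print(parser.can_fetch("Googlebot", "http://www.zzzzzzzz.net/"))          # True
print(parser.can_fetch("Googlebot", "http://www.zzzzzzzz.net/any/post"))  # True
```

If this prints True but "Fetch as Googlebot" still fails, the 403 is coming from the server itself rather than from the robots.txt rules.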
I didn't even have a robots.txt until this happened. That's why I think someone may have exploited WordPress to change my virtual robots.txt. And yes, after I thought I had fixed it with a clean robots.txt, I used the Google Labs tool to check it, and it still gives an error and can't crawl my site.
Well, WordPress is vulnerable to hackers. Check all of your main files: index.php, .htaccess, etc. Look for anything that is obviously making your blog go haywire. I can't help you much more with this one; try getting help from a professional. It may cost something, but it should be someone you can trust with access to your websites, and have them check everything over.
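When checking .htaccess, the things that most commonly cause a 403 for crawlers are Deny rules, user-agent matching via BrowserMatch/SetEnvIf, and RewriteRules that end in an [F] (forbidden) flag. A rough scanning sketch in Python; the patterns and the sample file content here are illustrative, not taken from the poster's actual site:

```python
import re

# Directives that commonly produce 403s for crawlers when they appear
# in a tampered .htaccess (illustrative list, not exhaustive).
SUSPICIOUS = [
    r"(?i)^\s*deny\s+from",                     # Apache 2.2 access control
    r"(?i)require\s+all\s+denied",              # Apache 2.4 equivalent
    r"(?i)(browsermatch|setenvif).*googlebot",  # user-agent-based blocking
    r"(?i)rewriterule.*\[[^\]]*F[^\]]*\]",      # rewrite straight to 403
]

def find_suspicious_lines(htaccess_text):
    """Return (line_number, line) pairs matching any suspicious pattern."""
    hits = []
    for n, line in enumerate(htaccess_text.splitlines(), start=1):
        if any(re.search(p, line) for p in SUSPICIOUS):
            hits.append((n, line.strip()))
    return hits

# Example: the kind of fragment a hacker might have slipped in.
sample = """\
RewriteEngine On
SetEnvIfNoCase User-Agent "Googlebot" bad_bot
Deny from env=bad_bot
"""
for n, line in find_suspicious_lines(sample):
    print(f"line {n}: {line}")
```

A hit isn't proof of a hack on its own (some security plugins add similar lines on purpose), but any line that singles out Googlebot deserves a close look.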
OK, thanks anyway. My files all look clean, and I've had to deal with WordPress hacks before, so I have an idea where to look for these things... but this time it has me very frustrated. Is it possible that the Google Labs "Fetch as Googlebot" tool is reading a cached version of my robots.txt or something like that?
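One pattern worth ruling out before blaming a cache: a site that loads fine in a browser but 403s for "Fetch as Googlebot" is the classic signature of user-agent-based blocking on the server. Purely as a toy illustration (the local server, port, and handler below are made up for the demo, not anything on the poster's site), here is how that asymmetry looks:

```python
import threading
import urllib.request
from urllib.error import HTTPError
from http.server import BaseHTTPRequestHandler, HTTPServer

class UABlockingHandler(BaseHTTPRequestHandler):
    """Toy server mimicking a hacked site: 403 for Googlebot, 200 otherwise."""
    def do_GET(self):
        if "Googlebot" in self.headers.get("User-Agent", ""):
            self.send_response(403)
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # silence request logging
        pass

# Bind to an ephemeral port and serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), UABlockingHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def status_for(user_agent):
    """Fetch the toy site with the given User-Agent; return the HTTP status."""
    req = urllib.request.Request(f"http://127.0.0.1:{port}/",
                                 headers={"User-Agent": user_agent})
    try:
        return urllib.request.urlopen(req).status
    except HTTPError as e:
        return e.code

print(status_for("Mozilla/5.0"))    # 200: a normal browser gets through
print(status_for("Googlebot/2.1"))  # 403: the crawler is refused

server.shutdown()
```

If fetching your real site with a Googlebot User-Agent string reproduces the 403 while a browser User-Agent doesn't, the block is live on the server (likely .htaccess or a plugin), not a stale cache on Google's side.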