I keep getting a message in Google Webmaster Tools saying that my robots.txt is blocking all the links on my website. My robots.txt looks like this:

User-agent: Googlebot-Image
Disallow: /

User-agent: *
Allow: /

I don't have "Discourage search engines" checked in the dashboard, and I checked the pages for noindex/nofollow. I even tried deactivating my only plugin, which is a Google Sitemap plugin. Any ideas?
I don't understand what you're trying to do; this may help you: http://yoast.com/example-robots-txt-wordpress/
User-agent: Googlebot-Image with Disallow: / stops Google from including your images in image search. Unless the content is NSFW, that's not really a sound idea.
Nigel
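To see what those rules actually do, here is a small sketch using Python's standard-library robots.txt parser against the file posted above. Note that Python's parser is a simplification and Google's real matching rules can differ in edge cases, but for these two simple groups they agree:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt from the original post
rules = """\
User-agent: Googlebot-Image
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The image crawler matches the first group and is blocked from everything
print(rp.can_fetch("Googlebot-Image", "https://example.com/photo.jpg"))  # False

# Every other crawler (including regular Googlebot) falls through to the
# "User-agent: *" group and is allowed
print(rp.can_fetch("Googlebot", "https://example.com/some-page/"))  # True
```

So this file only keeps images out of Google Image Search; it does not block regular crawling, which is consistent with the Webmaster Tools warning being stale rather than caused by these rules.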
Thank you for the help. I think I've got it figured out now. It seems it was more of a propagation issue on Google's side or something; I tried resubmitting today and there were no errors. It might have been because I was originally blocking search engines while the site was in development.
There are WordPress plugins that dynamically generate a robots.txt when it is requested. I would strongly advise against using them because of the extra load they put on the WordPress engine; create an actual robots.txt file instead.
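For reference, a minimal static robots.txt you could drop into the site root would look something like this, assuming you want everything crawlable once the site is live (the Sitemap URL is a placeholder for your own):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked; a static file like this is served directly by the web server without ever invoking PHP or WordPress.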