Dear DP members, I am having a problem with the WordPress robots.txt file. If I choose the WordPress default setting of "Allow Robots" under Settings, the Webmaster Tools dashboard shows that Googlebot and the other search engine robots are disallowed from indexing my blog. Uploading my own custom robots.txt file to the server solves that. The catch is that I am using the Platinum SEO Pack and have chosen customized settings for what should and should not be indexed; if I upload a custom robots.txt file to my server, it will allow the search bots to index everything. How do I fix the problem with the WordPress default robots.txt file? Why does it show search engines as disallowed when I have actually checked the allow box? My blog is Environment About.
Your robots.txt file is allowing access to all robots right now. What is the problem?

User-agent: *
Allow: /
Sitemap: http://www.environmentabout.com/sitemap.xml.gz
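For comparison, if WordPress were set the other way (blocking search engines in the privacy setting), the virtual robots.txt it generates would typically read like the following, which is what would produce a "disallowed" status in Webmaster Tools. This is a generic illustration, not something fetched from your site:

User-agent: *
Disallow: /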
The robots.txt file that you are seeing right now is the custom one that I uploaded to my server. If I do not use that file, the default WordPress robots.txt shows "Disallow" in Google Webmaster Tools even when I have enabled the allow option in Settings. The problem is that if I keep this custom robots.txt file, the search bots will index everything on my website, and the benefit of the Platinum SEO Pack will be lost, since I have customized its settings for which locations are indexed, which are not indexed, and which are nofollow.
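A custom robots.txt does not have to allow everything, by the way. You can keep the general Allow rule and add Disallow lines for the areas you do not want crawled. A rough sketch; the Disallow paths here are placeholders, and you would substitute the locations you actually configured in the plugin:

User-agent: *
# Placeholder paths - replace with the locations you want excluded
Disallow: /wp-admin/
Disallow: /feed/
Allow: /
Sitemap: http://www.environmentabout.com/sitemap.xml.gz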
There may be a delay in what you are seeing in Google Webmaster Tools, so do not assume it reflects what is actually in your robots.txt file right now. Googlebot has to refetch the file first; after that it should appear correctly in GWT.
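In the meantime, you can check what a crawler would see by parsing the live file yourself instead of trusting the GWT cache. A minimal sketch using only the Python standard library, assuming the robots.txt URL shown earlier in the thread:

import urllib.robotparser

# Fetch and parse the live robots.txt, then ask whether Googlebot may crawl the homepage
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.environmentabout.com/robots.txt")
rp.read()
print(rp.can_fetch("Googlebot", "http://www.environmentabout.com/"))  # True means allowed

If this prints True while GWT still reports "disallowed", the discrepancy is most likely just the stale cached copy.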
Finally, I created my own customized robots.txt file and uploaded it to my server. That resolved the problem.