Hello, I added a sitemap and a robots.txt file to my WordPress blog, but Google Webmaster Tools shows me this message: "Some important page is blocked by robots.txt." When I check the blocked URL, it sends me to the homepage. My second issue: the Sitemaps section shows this:

Sitemap: /sitemapindex.xml — Status: X (error) — URLs in web index: (none)

Also, there is no data available under "Keywords" and "Links to your site" in GWT. Maybe Google has not recently re-read the sitemap and/or robots.txt of my website? Any advice is welcome; you can check the link in my sig. Thanks, masterb.
You don't need to add a robots.txt to your WordPress site, as WordPress generates one on the fly. You will need a plugin if you want to exclude any pages. If you use a sitemap plugin, it will update the sitemap whenever your pages change, and it will also update robots.txt with the location of the sitemap. The plugin I use is Google XML Sitemaps. If you manually created a sitemap, I would remove it and try this plugin. Let me know how it goes.
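For reference, once a sitemap plugin has done its job, a typical WordPress robots.txt looks roughly like this (a sketch only: the domain is a placeholder, and your plugin's disallow rules and sitemap filename may differ; /sitemapindex.xml matches the path from the first post):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemapindex.xml
```

The key line for GWT is the Sitemap one; if a leftover manual robots.txt has a broad rule like "Disallow: /" instead, that would explain the "important page is blocked" warning on the homepage.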
Well, Google does not want you to block it from any page that has content on it, so you can ignore the message if everything is set up correctly. Alternatively, delete the pages you have blocked so that a 404 ("this page does not exist") is returned when someone opens them, and then remove those entries from the robots.txt file.
Robots.txt comes with WordPress; you do not need to create it separately. Use the XML sitemap plugin if you created your sitemap manually.