Robots.txt and sitemap.xml conflict

Discussion in 'WordPress' started by 87Lakers, Apr 28, 2009.

  1. #1
    I am using the Robots-Meta plugin as well as a robots.txt file along with the XML-Sitemaps plugin. Now Google is showing WARNINGS next to each site along with this message:

    "When we tested a sample of the URLs from your Sitemap, we found that the site's robots.txt file was blocking access to some of the URLs. If you don't intend to block some of the URLs contained in the Sitemap, please use our robots.txt analysis tool to verify that the URLs you submitted in your Sitemap are accessible by Googlebot. All accessible URLs will still be submitted."

    So do I ignore it? Or do I delete all the robots.txt files and just go with the Robots-Meta plugin?
     
    87Lakers, Apr 28, 2009 IP
  2. myp

    myp Well-Known Member

    #2
    A robots.txt file tells search-engine bots what they may and may not crawl. In your case it is telling them not to index certain pages. Your sitemap contains some of those same pages, and since submitting a sitemap to Google essentially means you want the pages in it indexed, Google is just letting you know that the robots.txt is stopping some of them from being indexed. Look at your robots.txt: if you want the disallowed pages kept out of the index, leave it as it is; otherwise change it.

    For more info on robots.txt check out: http://www.robotstxt.org/robotstxt.html
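    To make the conflict concrete, here is a hypothetical robots.txt of the kind many WordPress setups generate (example.com, the rules, and the sitemap path are placeholders, not the poster's actual file):

    ```
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /category/

    Sitemap: http://example.com/sitemap.xml
    ```

    If the XML sitemap then lists URLs under /category/, Googlebot is told to index them by the sitemap but forbidden to fetch them by the Disallow rule, which is exactly what triggers the warning.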
     
    myp, Apr 28, 2009 IP
  3. hmansfield

    hmansfield Guest

    #3
    Do you need both the plugin and the robots.txt file? Since they both essentially do the same thing, I would have to assume one is conflicting with the other.


    I think one or the other (the Robots Meta plugin or the robots.txt file) will suffice; Google will not forget you :)
     
    hmansfield, Apr 28, 2009 IP
  4. denharsh

    denharsh Peon

    #4
    That's perfectly fine; just make sure your Robots Meta plugin is not blocking anything that should not be marked as "Disallow".
    You can easily check by following the instructions that come with that warning message.
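    You can also check locally which URLs a given set of rules blocks for Googlebot. A minimal sketch using Python's standard-library robots.txt parser; the rules and URLs are hypothetical examples, not taken from the poster's site:

    ```python
    # Check whether specific URLs are fetchable under a robots.txt
    # using the standard-library parser.
    from urllib.robotparser import RobotFileParser

    # Hypothetical rules mirroring a common WordPress setup.
    rules = """
    User-agent: *
    Disallow: /category/
    """.splitlines()

    parser = RobotFileParser()
    parser.parse(rules)

    # A category URL is blocked; an ordinary post URL is allowed.
    print(parser.can_fetch("Googlebot", "http://example.com/category/news/"))
    print(parser.can_fetch("Googlebot", "http://example.com/my-post/"))
    ```

    Any URL for which can_fetch returns False should not appear in the sitemap you submit, or Google will keep showing the warning.
    
    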
     
    denharsh, Apr 28, 2009 IP
  5. 87Lakers

    87Lakers Active Member

    #5
    Thanks guys - you were all correct! The robots.txt was written by someone who organized their site with tags, so it was set to block categories, which I use. I deleted that Disallow line from the robots.txt and Google has already dropped all the errors!
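    For anyone hitting the same warning, the change amounted to something like this (a hedged reconstruction; the original file wasn't posted, and the rule shown is a guess at the one removed):

    ```
    # Before: categories blocked, but category URLs listed in sitemap.xml
    User-agent: *
    Disallow: /category/

    # After: the category rule removed, so the sitemap and robots.txt agree
    User-agent: *
    ```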
     
    87Lakers, Apr 28, 2009 IP