Webmaster Central Sitemap Errors - Please help!

Discussion in 'Google Sitemaps' started by majestic12, Apr 6, 2009.

  1. #1
    Please can someone help me..


    I have submitted about 5 sites to Webmaster Central tools and added sitemaps, but 2 sites keep failing the sitemap validation..

    The message keeps saying:

    Network unreachable: robots.txt unreachable
    We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.

    These are my robots.txt and sitemap files..

    [ www. removed to keep these links in the post dead ]

    conservatorywindowblinds.co.uk/robots.txt
    conservatorywindowblinds.co.uk/sitemap.xml
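
    One quick check for this kind of error is to request the robots.txt file directly and look at the HTTP status line; a rough sketch using curl (curl assumes http:// when no scheme is given):

     # fetch only the response headers for robots.txt;
     # a healthy file should answer "HTTP/1.1 200 OK", while a 403 or a
     # timeout matches Google's "unable to download it" message
     curl -I conservatorywindowblinds.co.uk/robots.txt
     Code (text):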

    I cannot for the life of me see that these are any different to the sites that did pass sitemap validation..

    I would be really grateful if someone could tell me what I have done wrong, as this is driving me crazy..

    Kindest regards, David
     
    majestic12, Apr 6, 2009 IP
  2. #2
    You can validate your sitemap yourself and check the errors.
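
    One way to do that locally is with xmllint and a downloaded copy of the schema; a rough sketch, assuming xmllint is installed and the schema is saved next to the sitemap as sitemap.xsd (that filename is just a placeholder):

     # grab the schema to validate against (the sitemaps.org 0.9 one is shown;
     # swap in whichever schema your checker expects)
     curl -O http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd
     # validate the sitemap; --noout suppresses the document dump so only
     # validation messages are printed
     xmllint --noout --schema sitemap.xsd ./sitemap.xml
     Code (text):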

    If I do this I get:
    
    ./sitemap.xml:1: element urlset: Schemas validity error : Element '{http://www.sitemaps.org/schemas/sitemap/0.9}urlset': No matching global declaration available for the validation root.
    ./sitemap.xml fails to validate
    
    Code (markup):
    So, change the last xmlns attribute on the line at the top that starts with <urlset to the Google schema (and point xsi:schemaLocation at the same schema); then it validates:
    
     <urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.google.com/schemas/sitemap/0.84 http://www.google.com/schemas/sitemap/0.84/sitemap.xsd" xmlns="http://www.google.com/schemas/sitemap/0.84">
    
    Code (markup):
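
    For comparison, a minimal sitemap that validates against the current sitemaps.org 0.9 schema looks roughly like this (the URL and date are placeholders):

     <?xml version="1.0" encoding="UTF-8"?>
     <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
       <url>
         <loc>http://www.example.com/</loc>
         <lastmod>2009-04-08</lastmod>
       </url>
     </urlset>
     Code (markup):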
     
    norbert, Apr 8, 2009 IP
  3. #3
    Thanks Norbert


    Luckily I have solved my problem. After talking to my host "3 times", they discovered the permissions on the robots.txt file were set wrong; they changed them to 777 and everything was fixed..
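
    For anyone who finds this thread later: 777 does work, but it grants more access than a web server needs just to read robots.txt; something along these lines, run on the server in the site root (path assumed), is normally enough:

     # 644 = owner can read and write, everyone else (including the web server) can read;
     # 777 additionally grants write and execute to everyone, which robots.txt does not need
     chmod 644 robots.txt
     Code (text):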


    Many thanks for your input...
     
    majestic12, Apr 8, 2009 IP