hi everyone... quick question, hopefully a quick fix. i'm wondering why google webmaster is giving me an X error when i submit my sitemap.xml file. here's the error i'm getting:

"URL restricted by robots.txt. We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit."

so it's saying the sitemap.xml is restricted by robots.txt, and so is my index page. here's the other thing gw says:

"Googlebot is blocked from http://www.downtownlalaw.com/"

my robots.txt file has:

User-agent: *
Allow: /

...so what's the issue? i'm allowing googlebot to view all my pages.
perhaps you should regenerate your sitemap.xml with another similar plugin. try it, that's worked for me before.
modify my sitemap? there's nothing wrong with my sitemap. i validated it here http://www.validome.org/google/ and it passed.
If so, try changing the contents of your robots.txt, or delete your robots.txt until your sitemap is accepted by them.
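For example, the classic allow-everything robots.txt that sticks to the original standard is just an empty Disallow (the Sitemap line is optional, and the URL here is the one from this thread):

```
User-agent: *
Disallow:

Sitemap: http://www.downtownlalaw.com/sitemap.xml
```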
i deleted the robots.txt file from the main root of my server and then re-submitted the sitemap.xml link and still got that stupid X. i don't get it... why is this happening??
does someone know what's up with google webmasters or if i can even contact a google rep? i can't figure this out... robots.txt and sitemap.xml are all fine. not sure why it's giving me that X
Your robots.txt is wrong. There is no "Allow" instruction in the original robots.txt standard. Check http://www.robotstxt.org/robotstxt.html
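If you want to sanity-check what a crawler is allowed to fetch, Python's standard-library urllib.robotparser can parse a robots.txt and answer can_fetch queries. A minimal sketch, assuming the allow-everything rules written with an empty Disallow, and using the site URL from this thread:

```python
from urllib.robotparser import RobotFileParser

# Allow-all robots.txt using only directives from the original standard:
# an empty Disallow means nothing is blocked.
robots_txt = """User-agent: *
Disallow:
"""

rfp = RobotFileParser()
rfp.parse(robots_txt.splitlines())

# Both the index page and the sitemap should be fetchable by Googlebot.
print(rfp.can_fetch("Googlebot", "http://www.downtownlalaw.com/"))            # → True
print(rfp.can_fetch("Googlebot", "http://www.downtownlalaw.com/sitemap.xml")) # → True
```

Note that this only checks the rules as a standards-compliant parser reads them; the webmaster-tools report can also lag behind changes to the live file.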
thanks guys...it worked...google just took some time to update the sitemap i guess...thanks for your help everyone
I guess it's a code compatibility issue. Try regenerating it so that it matches the plugins or the platform you're using.