Hi there, I really need help with this small problem:

1 - I have a blog at: www.something.com
2 - I have created a subdomain: cars.something.com
3 - I use WordPress on both blogs, plus a plugin to create the sitemap
4 - The sitemap works just fine on www.something.com
5 - The sitemap for the subdomain gives an error in Google saying it is blocked by robots.txt, even though there wasn't any "robots.txt" in the subdomain's folder (cars.something.com). So after that I added a "robots.txt" copied from the main domain (www.something.com). Still the same stupid error.

Any ideas? Help? Anything appreciated. I'm totally lost here...
Random thought, but have you verified the contents of your robots.txt against Google's robots.txt validator? We had an issue recently where a minor change in the robots file caused problems with our portfolio group. Verify the content of the robots file first, then try a trial version of a sitemap generator, or XML-Sitemaps online. Good luck.
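If you want to sanity-check the rules locally before re-submitting to Google, Python's standard-library robot parser can tell you what a crawler would conclude from your file. This is just a quick sketch; the robots.txt contents and the subdomain URLs below are placeholder examples matching this thread, not your actual file:

```python
from urllib.robotparser import RobotFileParser

# Paste the exact contents of cars.something.com/robots.txt here.
# This example uses a typical WordPress-style rule set.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Would Googlebot be allowed to fetch the sitemap?
print(rp.can_fetch("Googlebot", "http://cars.something.com/sitemap.xml"))   # True
# And a path under the disallowed directory?
print(rp.can_fetch("Googlebot", "http://cars.something.com/wp-admin/page")) # False
```

If `can_fetch` returns False for your sitemap URL, the robots file (or a stray `Disallow: /`) is the culprit; if it returns True, the blockage is more likely a stale cached copy of robots.txt on Google's side.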
Just go to http://www.robotstxt.org/ and see how a robots.txt file is created. It explains how to allow or block a bot / crawler / spider from any page of your site. Hope you find that useful.
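For example, a minimal robots.txt that allows crawling and also advertises the sitemap location might look like this (the `/wp-admin/` rule and the sitemap path are typical WordPress defaults used here for illustration; adjust them to your setup):

```
User-agent: *
Disallow: /wp-admin/

Sitemap: http://cars.something.com/sitemap.xml
```

One thing worth noting: robots.txt is per-host, so the file for cars.something.com must sit at the root of the subdomain itself, not in a folder of www.something.com. Simply copying the main domain's file over is fine content-wise, but any absolute URLs inside it (like the Sitemap line) should point at the subdomain.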