Hi, I have a strange question. Does Google cache robots.txt and only update the cached version from time to time?

Reason: I've built a website. I developed it online, which is why the site started out with a robots.txt that disallowed indexing of the website completely. Now I have trouble getting the website into the index. In Webmaster Tools I can see that Google downloaded my robots.txt at least once every 24 hours, but I keep getting errors telling me the pages in my sitemap are blocked by my robots.txt. This insanity has been going on for four days now... I don't get it. None of my pages are in the blocked folders:

Disallow: /401
Disallow: /403
Disallow: /404
Disallow: /405
Disallow: /about
Disallow: /contact
Disallow: /css
Disallow: /feeds
Disallow: /img
Disallow: /inc
Disallow: /sys
Disallow: /*.css$
Disallow: /*.html$
Disallow: /*.pdf$
Disallow: /*.txt$
Disallow: /*.xml$

I developed in .php. Any ideas or similar experiences? I'm curious.
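In case it helps, here's a quick sanity check I put together: a rough sketch in Python that translates the Disallow rules above into regexes, assuming Google's documented wildcard semantics (* matches any sequence of characters, a trailing $ anchors the rule at the end of the URL path, everything else is a prefix match). The test paths at the bottom are just made-up examples, not my real URLs:

```python
import re

# Rough sketch: convert each Disallow rule into a regex using Google's
# wildcard semantics ("*" = any sequence of characters, trailing "$" =
# end of the URL path, otherwise prefix match), then test sample paths.

DISALLOW_RULES = [
    "/401", "/403", "/404", "/405", "/about", "/contact", "/css",
    "/feeds", "/img", "/inc", "/sys",
    "/*.css$", "/*.html$", "/*.pdf$", "/*.txt$", "/*.xml$",
]

def rule_to_regex(rule):
    anchored = rule.endswith("$")
    body = rule[:-1] if anchored else rule
    # Escape regex metacharacters, then restore "*" as "match anything".
    pattern = re.escape(body).replace(r"\*", ".*")
    return re.compile("^" + pattern + ("$" if anchored else ""))

COMPILED = [rule_to_regex(r) for r in DISALLOW_RULES]

def is_blocked(path):
    return any(rx.match(path) for rx in COMPILED)

# Test paths are made up -- substitute real URLs from the sitemap.
for path in ["/index.php", "/products.php", "/about", "/sitemap.xml"]:
    print(path, "->", "blocked" if is_blocked(path) else "allowed")
```

(As far as I know, Python's built-in urllib.robotparser only does plain prefix matching and ignores the * and $ extensions, which is why I translated the rules by hand.) Running this, the .php paths come out allowed, but /sitemap.xml itself trips the /*.xml$ rule, so that's one thing I'm double-checking on my end.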