Hi, I have started writing a robots.txt checker. You simply enter a URL and it fetches the site's robots.txt file and presents its contents so you can review them easily. You can select each spider/bot mentioned in the file and see the following:

- Pages Disallowed
- Pages Allowed
- Sitemaps
- Allowed Visit Times
- Crawl Rates
- Page Request Rate

I have loads of other ideas for things I'll be adding, so expect changes soon.
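For anyone curious about the parsing side, here's a rough sketch of the kind of line-based parsing a checker like this needs. This is a simplified illustration, not the tool's actual code; `parse_robots`, the `DIRECTIVES` set, and the example URL are placeholders I've made up for this post.

```python
from collections import defaultdict
from urllib.request import urlopen

# Directives to report on, including the nonstandard ones
# (Visit-time, Crawl-delay, Request-rate) that only some bots honour.
DIRECTIVES = {"disallow", "allow", "sitemap",
              "visit-time", "crawl-delay", "request-rate"}

def parse_robots(site):
    """Return {user_agent: {directive: [values]}} for a site's robots.txt."""
    text = urlopen(site.rstrip("/") + "/robots.txt").read().decode("utf-8", "replace")
    rules = defaultdict(lambda: defaultdict(list))
    agents = []       # user-agents the current record applies to
    in_rules = False  # True once the current record has seen a rule line
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if in_rules:                     # a new record starts here
                agents, in_rules = [], False
            agents.append(value)
        elif field in DIRECTIVES:
            in_rules = True
            if field == "sitemap":           # Sitemap lines apply site-wide
                rules["*"]["sitemap"].append(value)
            else:
                for agent in agents or ["*"]:
                    rules[agent][field].append(value)
    return rules

# Example: show what each bot is allowed or disallowed to crawl.
for agent, directives in parse_robots("https://www.example.com").items():
    print(agent, dict(directives))
```

One detail worth noting: Sitemap lines aren't tied to any user-agent record, which is why the sketch files them under `*` rather than under the current bot.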