Hello, I'm developing a free tool to create robots.txt files. Features: 1.- 300+ robots in the database. You can pick as many as you want to create specific restrictions for any one of them, or for all. 2.- A bot search to look up info on the 300+ bots in the database, so you can compare against any unknown bot you may see in your logs. 3.- You can set a crawl delay (how often your site may be crawled, for example once every 60 seconds), etc. I need BETA testers. I am a developer, and this is not the first tool I have given away for free on DP. Screenshots: main page to create robots.txt, bots database, bot search. The file will be freeware forever. The website is not finished yet (sorry), but you can download the tool and test it. Please give me feedback so I know about any bugs, etc. Dld: http://robots-txt.info/Robots.txt-Generator.rar
What is the purpose of this? Do you set the sitemap URL in the box and generate a .txt file to upload to your site's server?
Yes. You need to upload robots.txt to the base URL on your server (the root of your domain, not inside a folder). This lets you control which bots or spiders are allowed to index which parts of your site. You can also control the crawl rate. This can be done for any of the 300+ bots in the database; in fact, you can search for a specific bot that you detect in your logs.
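For reference, a generated robots.txt might look something like this (the bot names and paths below are just illustrative examples, not actual output from the tool; note that support for the Crawl-delay directive varies by crawler):

```
# Block one specific bot entirely (example bot name)
User-agent: SomeBadBot
Disallow: /

# Restrict a named crawler and ask it to wait 60 seconds between requests
User-agent: ExampleBot
Crawl-delay: 60
Disallow: /private/

# Default rule for all other bots: allow everything
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

The file must sit at the domain root so crawlers can fetch it at https://example.com/robots.txt.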
Thanks, just downloaded it, but one question: is this version fully bug-fixed? And should robots.txt be placed in www, public_html, or the root?