I have a subdomain whose robots.txt file looks like this:

    User-agent: *
    Crawl-delay: 30       # Current Standard way to delay bot hits
    Request-rate: 1/30    # Extended Standard to set bot hit rate
    Visit-time: 1500-1700 # Extended Standard to set bot hit time range in UTC
    Disallow: /javascripts/
    Disallow: /stylesheets/
    Disallow: /images/

followed by a sitemap URL at the end. Can anyone explain Crawl-delay, Request-rate, and Visit-time to me?
robots.txt is one of the best ways to tell crawlers how to treat your website, and Crawl-delay, Request-rate, and Visit-time are directives that do exactly that:

Crawl-delay: how many seconds the crawler should wait after each successful request before making the next one. This is the most widely supported of the three.

Request-rate: the pages-per-seconds ratio at which the site may be crawled. 1/30 means 1 page every 30 seconds.

Visit-time: the window of hours during which you want your pages to be crawled. For example, 1500-1700 means pages will only be crawled/indexed between 03:00 PM and 05:00 PM GMT.

Request-rate and Visit-time are non-standard extensions (as the comments in your file note), so only some crawlers honour them.
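If you want to see how these directives look from a crawler's side, here is a minimal Python sketch (Python 3.6+) using the standard library's urllib.robotparser; it understands Crawl-delay and Request-rate, but the Visit-time extension is simply ignored. The example.com URLs are placeholders:

    import time
    from urllib.robotparser import RobotFileParser

    ROBOTS_TXT = """\
    User-agent: *
    Crawl-delay: 30
    Request-rate: 1/30
    Visit-time: 1500-1700
    Disallow: /javascripts/
    Disallow: /stylesheets/
    Disallow: /images/
    """

    rp = RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())
    rp.modified()  # record a fetch time so can_fetch()/crawl_delay() will answer

    print(rp.crawl_delay("*"))    # 30 -> wait 30 seconds between requests
    print(rp.request_rate("*"))   # RequestRate(requests=1, seconds=30)
    print(rp.can_fetch("*", "https://example.com/images/logo.png"))  # False (disallowed)
    print(rp.can_fetch("*", "https://example.com/about"))            # True

    # A polite crawler sleeps for the advertised delay between fetches.
    delay = rp.crawl_delay("*") or 0
    time.sleep(delay)

In a real crawler you would call rp.set_url("https://yoursubdomain.example.com/robots.txt") and rp.read() instead of parsing a string, and enforce the Visit-time window yourself since the module does not parse it.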