We want to create a robots.txt file for our site so that FAST Enterprise Search crawls it accordingly: some locations should be crawled and others excluded. Which robots.txt directives does FAST support? In particular, does it support 'Allow', 'Request-rate', 'Crawl-delay', and 'Visit-time'?
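For reference, these directives (other than the standard `Disallow`) are non-standard extensions to the original robots.txt convention, and support varies by crawler — which is exactly what we need to confirm for FAST. A file using all of them would look something like this (paths are placeholders for illustration):

```
User-agent: *
# Standard directive: block this subtree
Disallow: /private/
# Non-standard extension: explicitly permit a subtree (not honored by all crawlers)
Allow: /public/
# Non-standard: wait 10 seconds between requests
Crawl-delay: 10
# Non-standard: at most 1 page per 5 seconds
Request-rate: 1/5
# Non-standard: only visit between 02:00 and 06:00 (typically interpreted as UTC)
Visit-time: 0200-0600
```

If FAST ignores the non-standard lines, the crawl schedule and rate would presumably need to be configured on the FAST side instead.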