Hello, I have a script that tracks downloads. The URL is in this format: http://www.site.com/dl.php?var1=string&var2=string What is the correct directive to stop bots from spidering that file using robots.txt? Will "Disallow: /dl.php" be enough? Thanks
>> Will "Disallow: /dl.php" be enough ? Yes. Disallow rules are prefix matches against the URL path, so "Disallow: /dl.php" (without the quotes) blocks any URL whose path begins with that string, including every query-string variation like /dl.php?var1=string&var2=string. Keep in mind robots.txt is advisory: well-behaved crawlers honor it, but misbehaving bots can ignore it.
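For reference, a minimal robots.txt covering this case might look like the following (the file goes in the site root, e.g. http://www.site.com/robots.txt):

```
# Apply to all crawlers
User-agent: *
# Block dl.php and every query-string variation of it
Disallow: /dl.php
```

The `User-agent: *` line makes the rule apply to every compliant crawler; you could instead name a specific bot (e.g. `User-agent: Googlebot`) if you only wanted to restrict one.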