I have three URL patterns:

1) www.mysite.com/index.php?id=(.*)&go=(.*)
2) www.mysite.com/index.php?go=(.*)&id=(.*)
3) www.mysite.com/index.php?id=(.*)&go=(.*)&start=(.*)&offset=(.*)

I want to allow bots to crawl the first two, but not the links that follow them, so I have to disallow the third pattern. Basically, the first page should be allowed and the second page (the one with start and offset) disallowed. How can I do this in robots.txt?

My robots.txt is:

User-agent: Googlebot
Disallow: /index*start

User-agent: *
Disallow: /index*start

This blocks the third pattern. But how can I write an Allow directive here so that bots can still crawl the first two cases?

Regards,
Kiran
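
Update: here is a sketch of what I have in mind, assuming Googlebot honours Allow directives together with wildcard Disallow rules, and that the longer (more specific) rule wins. I am not sure other bots treat the precedence the same way, so this is only a guess:

# assumption: the wildcard Disallow is longer than the Allow lines,
# so it should win only for URLs that contain start=
User-agent: Googlebot
Allow: /index.php?id=
Allow: /index.php?go=
Disallow: /index.php?*start=

User-agent: *
Allow: /index.php?id=
Allow: /index.php?go=
Disallow: /index.php?*start=

Is this the right way to express it, or should the Allow lines look different?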