My website is receiving links from other sites that append a referral parameter pointing back to them, for example http://mysite.com/?ref=www.agothu.com. These URLs are getting indexed by search engines and are causing unnecessary crawler traffic. How can I block URLs with such parameters from crawler access using robots.txt? To block a folder I used Disallow: /folder/ but I don't know how to block URLs with parameters. Can someone help?
Using robots.txt the right way will help here. You may find your answer at this link, check it: http://eadvertisements.blogspot.in/2012/03/removeblock-unwanted-pages-using.html Thanks.
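As a rough sketch of the idea, assuming you want to keep crawlers away from any URL carrying the ref parameter: the wildcard syntax below is honored by the major crawlers (Googlebot, Bingbot), though it is not part of the original robots.txt standard, so smaller bots may ignore it.

User-agent: *
# Block any URL whose query string contains ref= (e.g. /?ref=www.agothu.com)
Disallow: /*?ref=

If you want to block every URL that has any query string at all, you could instead use Disallow: /*? but be careful, as that also blocks legitimate parameterized pages such as search or pagination URLs.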