What kind of rule should I use in my robots.txt file in this situation? For example, take these two URLs:
1. http://forums.digitalpoint.com/showthread.php?t=1111474&some-dummy-text-here-in-url
2. http://forums.digitalpoint.com/showthread.php?t=1111474
Both URLs return the same page, which means they are duplicate content. Google has indexed around 2000 URLs like URL 1 and URL 2, and I now want to stop URLs like URL 1 from being indexed while keeping URL 2. Is it possible to write a rule that does that?
That is exactly how you do it. If you want to see what Google has to say about it, follow this link: http://www.google.com/support/webmasters/bin/answer.py?answer=40360&hl=en I have used this method to get rid of over 700 pages of so-called duplicate content. Check out the link; it explains the process pretty well.
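For what it's worth, here is a rough sketch of the kind of rule that should cover your example. It assumes Googlebot's wildcard extension (* matching any sequence of characters), which is a Google extension rather than part of the original robots.txt standard, so other crawlers may ignore it; test the pattern in Google's webmaster tools before relying on it:

User-agent: Googlebot
# Block any showthread.php URL whose query string contains an "&",
# i.e. anything with extra text after the thread id (your URL 1).
# A plain thread URL like /showthread.php?t=1111474 has no "&",
# so it is not matched and stays crawlable (your URL 2).
Disallow: /showthread.php?*&

The idea is that every unwanted variation carries something extra separated by an ampersand, so matching on the "&" catches URL 1 without touching URL 2. Keep in mind that robots.txt only stops crawling; for the 2000 or so pages already indexed you may also need the removal process described on the Google page linked above to actually get them dropped.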