My website is dynamic, and two or more pages exist with the same content but different URLs. How do I solve this issue? Should I block each and every individual URL with robots.txt, or is there another way?
All these URLs point to the same page, so when I refresh the content, it is refreshed on all the duplicate pages.
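For reference, blocking individual URLs in robots.txt would look like the sketch below (the paths are hypothetical examples of duplicate URLs). Note that Disallow only stops crawlers from fetching those URLs; it doesn't tell search engines which version is the original, and the replies below cover better options.

```
# robots.txt sketch -- paths are hypothetical examples of duplicate URLs
User-agent: *
Disallow: /page.php?session=
Disallow: /page.php?ref=
```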
Hi, I don't know what the problem is. I have it as well on one of my sites. Regarding robots.txt: I had changed my description, and that is why I had the problems, but then I added the links to robots.txt. It took more than 3 weeks until the update took effect, at least in my case. I still have 2 "Not found" items.
You have several options available, but before I go there, just ignore most of the garbage that's been written here:

* Fresh content won't help your current situation.
* There is no such thing as a penalty for duplicate content.

Option 1: As already mentioned, use a 301 redirect on the URLs you don't want, pointing each duplicate URL to the preferred URL (see the redirect sketch below).

Option 2: Use Google Webmaster Tools and set a preferred domain (this is only useful if the problem is between the www and non-www versions of your site).

Option 3: Use the rel=canonical element (see the HTML snippet below). You can read about how to use it on the Google blog and in the webmaster guidelines.

Good luck, I hope you sort it out soon.
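A minimal sketch of Option 1, assuming an Apache server with mod_rewrite enabled (the poster's stack isn't stated, and every domain and path here is a hypothetical placeholder). The first rule consolidates a single duplicate URL; the second handles the www vs. non-www case mentioned in Option 2:

```
# .htaccess sketch -- assumes Apache with mod_rewrite; URLs are hypothetical
RewriteEngine On

# Send one duplicate URL to the preferred URL with a permanent (301) redirect
RewriteRule ^old-page\.php$ http://www.example.com/preferred-page [R=301,L]

# Redirect the non-www host to the www host site-wide
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

For Option 3, the rel=canonical element goes in the head of every duplicate page and names the one URL you want search engines to index (again, the URL is a placeholder):

```
<head>
  <!-- Tells search engines which URL is the original; duplicates consolidate to it -->
  <link rel="canonical" href="http://www.example.com/preferred-page">
</head>
```

With rel=canonical the duplicate pages stay accessible to visitors, which is one reason it's often preferred over blocking them in robots.txt.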