Well, I had this duplicate content problem on my website, and I think I've now fixed it by adding robots meta tags. How do I tell Google to re-index all my pages?
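For context, the kind of robots meta tag I added is the standard noindex directive in each duplicate page's head (a generic sketch, not my exact markup):

```html
<!-- placed inside the <head> of each duplicate page -->
<meta name="robots" content="noindex, follow">
```

The noindex part asks search engines to drop the page from their index, while follow still lets them crawl the links on it.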
Usually, a few backlinks should solve that. If that doesn't happen, it might require a QDF (Query Deserves Freshness) push, since content freshness can be worth trying. Sign up for Google's Webmaster console and block the URLs from being crawled, or, if they're already indexed, get them removed using the URL removal tool. Then submit a revised XML sitemap and combine it with a few backlinks pointing to the pages (at least one to each). That should help.
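To illustrate the sitemap step, a minimal revised XML sitemap listing only the canonical pages might look like this (the example.com URLs and the lastmod date are placeholders, not your actual pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per canonical page you want indexed -->
  <url>
    <loc>http://www.example.com/page-one</loc>
    <lastmod>2009-05-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/page-two</loc>
  </url>
</urlset>
```

Resubmit it through the Webmaster console so Google recrawls the canonical URLs rather than the duplicates.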
Well, my website is a forum, so it has way too many pages indexed, and yeah... I can resubmit my sitemap, but I don't think that will remove all the duplicate URLs that Google has already indexed.
You could have phrased your question that way to begin with. Your best bet is Google's URL removal tool: sign in to your Webmaster console, identify the pages with duplicate content, and get them removed from the index. Sign in at http://services.google.com:8882/urlconsole/controller and get rolling.