Is there an easy way to see what pages are seen as duplicates? For example a tool we can goto, type in our URL and it crawls the page to return duplicates? Thanks!
Is this what you're looking for? http://www.copyscape.com/ Cheers, AllanJames (the StartBusinessMentor)
I think the best you can do is take a phrase from the text and paste it into any search engine, say Google. You'll find pages that have it in their code. It is different, however, with image and video files. That one - "copyscape.com" - does not work!
I mean something where I can just enter a domain and it will check the subpages too. For example, if you can reach the same page through two different URLs because URL rewriting is broken.
I don't think such a tool currently exists. I agree with fcolor's strategy of just searching for a phrase from the text. Putting the phrase in quotes to force an exact-match search may help too.
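If you're comfortable with a little scripting, you can roll a rough version of this yourself: fetch each page, strip the markup, and hash the normalized text, then report any URLs whose hashes collide. Here's a minimal sketch in Python (the function names and the tag-stripping regex are just illustrative, not any real tool's API; a real crawler would also discover the subpage URLs itself):

```python
import hashlib
import re
from collections import defaultdict


def normalize(html: str) -> str:
    """Strip tags and collapse whitespace so trivial markup or
    spacing differences don't hide duplicate text."""
    text = re.sub(r"<[^>]+>", " ", html)
    return " ".join(text.split()).lower()


def find_duplicates(pages):
    """pages: iterable of (url, html) pairs.
    Returns groups of URLs whose normalized content is identical."""
    groups = defaultdict(list)
    for url, html in pages:
        digest = hashlib.sha256(normalize(html).encode()).hexdigest()
        groups[digest].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]


# Example: the trailing-slash URL serves the same content,
# so it gets flagged as a duplicate of /a.
pages = [
    ("http://example.com/a", "<p>Hello World</p>"),
    ("http://example.com/a/", "<p>Hello  world</p>"),
    ("http://example.com/b", "<p>Something else</p>"),
]
print(find_duplicates(pages))
# -> [['http://example.com/a', 'http://example.com/a/']]
```

It won't catch near-duplicates (pages that differ by a date stamp or a sidebar), but for the broken-URL-rewriting case, where two URLs return byte-identical content, an exact hash match is usually enough.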