Duplicating content is considered harmful by search engines, especially Google, since all their efforts go toward giving users a unique and efficient experience. My question is whether it is technically feasible to automate content comparison for each website against all the rest, i.e. against the other seven million websites on the planet. I think the likely ways copied content gets found are: 1. user reporting (I have some experience with this in the case of Google), and 2. comparison of the pages that appear in the top SERPs. Looking forward to some clarity on this issue.
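For what it's worth, a pairwise comparison of every page against every other page isn't actually needed. Search engines are generally believed to use fingerprinting techniques such as shingling, where each page is reduced to a small set of hashes, so candidate duplicates can be found with index lookups instead of millions of comparisons. Here is a minimal sketch of the shingling idea in Python; the function names and the parameters are my own illustration, not anything Google has published:

```python
import hashlib

def shingles(text, k=5):
    """Split text into overlapping k-word "shingles" (word n-grams)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def fingerprint(text, k=5, keep=8):
    """Reduce a page to a small set of shingle hashes.

    Keeping only the numerically smallest hashes (a MinHash-style
    sketch) means two pages sharing most of their text will share
    most of their fingerprint, without storing the full text.
    """
    hashes = sorted(
        int(hashlib.md5(s.encode()).hexdigest(), 16)
        for s in shingles(text, k)
    )
    return set(hashes[:keep])

def similarity(fp_a, fp_b):
    """Jaccard similarity between two fingerprints (0.0 to 1.0)."""
    if not fp_a or not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# Toy demo: a near-copy scores high, an unrelated page scores low.
original = "the quick brown fox jumps over the lazy dog near the river bank"
copied   = "the quick brown fox jumps over the lazy dog near the river bend"
other    = "completely different article about search engine optimization tips"

print(similarity(fingerprint(original), fingerprint(copied)))  # high: almost all shingles shared
print(similarity(fingerprint(original), fingerprint(other)))   # low: barely any shingles in common
```

With fingerprints stored in an index keyed by hash value, finding candidate duplicates for a newly crawled page becomes a handful of lookups rather than a comparison against every other site.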
^^ Yeah, Copyscape is good at catching content duplication. There is no point in using duplicate (copied) content anyway.
Copyscape is very popular, but it is quite strict in its testing. An easy test is to copy the content and search for it in Google. Just make sure you put the text within quotes, so Google looks for the exact phrase.
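If you want to script that quick check, the query just needs the snippet wrapped in double quotes and URL-encoded. A minimal sketch in Python (the snippet text here is a made-up example):

```python
from urllib.parse import quote_plus

def exact_match_url(snippet):
    """Build a Google search URL that looks for the exact phrase.

    Wrapping the text in double quotes tells Google to match the
    words in that exact order, which is what exposes copied content.
    """
    return "https://www.google.com/search?q=" + quote_plus(f'"{snippet}"')

print(exact_match_url("an unusually distinctive sentence from your page"))
```

Picking an unusual sentence rather than a generic one makes the test much more reliable, since common phrases will match thousands of unrelated pages.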