In my experience I've never noticed a major change, but obviously the page gets ranked lower for having duplicate content. If a site has a lot of pages with duped content, I can only imagine it would have an effect on the overall site's ranking.
Good point JoPlumber - that's probably why we see Google updating its algorithms almost daily. You bring up another good point: the process of actually identifying duplicate content. This can be quite difficult when each page is given a unique identity according to the group of keywords, or keyphrase, it is associated with. I think the only way for an original document to rank higher than a duplicate is if the two are virtually identical and are hosted on similarly ranked websites in similar industries. If the person duplicating your content adds a significant amount of text, the identity of the page changes slightly, and Google's algorithms will handle it differently. Unfortunately, if a powerhouse website decides to nab content from a small niche player, that site's overall presence in its industry/category will be taken into account as well, and may affect how the duplicate content is treated.
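To make that point concrete, here's a rough Python sketch of the kind of shingle-based similarity check search engines are often assumed to use for near-duplicate detection (the actual algorithms are proprietary, so this is purely illustrative). It shows why adding a significant amount of text to a copied page lowers its measured overlap with the original:

```python
def shingles(text, k=3):
    """Break text into overlapping word k-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: 0 = disjoint, 1 = identical."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical page texts, just for demonstration
original = "how to fix a leaking tap in five easy steps at home"
copied = "how to fix a leaking tap in five easy steps at home"
padded = copied + " plus extra commentary added to change the page identity a little"

print(jaccard(shingles(original), shingles(copied)))  # exact copy scores 1.0
print(jaccard(shingles(original), shingles(padded)))  # added text pulls the score below 1.0
```

An exact copy scores 1.0, while the padded version scores noticeably lower even though it contains every word of the original - which is roughly why a lightly modified copy can be treated as a different page.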
Thanks for all your comments, guys. I think I'll just try to write as much unique content as possible, which is what I've been doing the whole time I've been working on websites, so I guess no change lol - just wanted to explore new ideas.