I've seen duplicate content land pages in the supplemental results, and I've also seen pages with poor linking land in supplemental.
The same goes for using the same text in the meta description tag across pages. Sometimes it's better not to use the tag at all.
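If it helps, here's a rough sketch of the kind of script you could use to spot pages that share the same description. Python, standard library only, and the URLs are placeholders:

# Rough sketch, standard library only; the URLs below are placeholders.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaDescriptionParser(HTMLParser):
    # Collects the content of the <meta name="description"> tag, if any.
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = (attrs.get("content") or "").strip()

def fetch_description(url):
    parser = MetaDescriptionParser()
    with urlopen(url) as response:
        parser.feed(response.read().decode("utf-8", errors="replace"))
    return parser.description

urls = [
    "http://www.example.com/page1.html",  # placeholder URLs
    "http://www.example.com/page2.html",
]

pages_by_description = defaultdict(list)  # description text -> pages using it
for url in urls:
    description = fetch_description(url)
    if description:
        pages_by_description[description].append(url)

for description, pages in pages_by_description.items():
    if len(pages) > 1:
        print("Shared description:", description)
        print("  used on:", ", ".join(pages))

Any description the script reports as shared by more than one page is a candidate for a rewrite, or for dropping the tag and letting Google pick its own snippet.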
I have one site that is mostly in supplemental. It has to be due to a lack of inbound links to those pages, as the content is original and useful for its intended audience... most links into the site point at only a few pages, which lead to the relevant sections through lists of links within each section. When I launched it, it originally did well, instantly attracted a lot of scraper sites, and then disappeared into supplemental hell... it continues to stay in supplemental, and as a result most of the scraper sites now leave it alone. I'm just hoping the scrapers that linked to it and then de-linked didn't impose any artificial link penalties on the site, since that was beyond my control. It's one reason I'm not a fan of sites that simply scrape content.
I totally agree with you on this. I have a site that is about 75% supplemental results. I did make a big improvement, though, because everything but the homepage was supplemental before. The way I got my pages out was to make sure each page had a unique meta description tag and a unique title. Also, each page used a template with repeated field labels that looked sort of like: Name:, City:, Date:, etc. I took the template labels out and just left the field values in. Would Google consider something like that duplicate content? I took it out to be safe, but it's something I've wondered about.
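For what it's worth, here's a hedged sketch of the two changes being described, with made-up field names and a made-up layout: a unique title and meta description built from each record's own data, and the repeated template labels dropped so only the page-specific values remain.

# Made-up field names and layout, just to illustrate the idea.
records = [
    {"name": "Joe's Diner", "city": "Springfield", "date": "2007-03-14"},
    {"name": "Blue Cafe", "city": "Shelbyville", "date": "2007-04-02"},
]

def render_page(record):
    title = f"{record['name']} in {record['city']}"  # unique per page
    description = (
        f"Details for {record['name']}, {record['city']}, "
        f"updated {record['date']}."
    )
    # Values only in the body; the repeated labels (Name:, City:, Date:)
    # were the part every page had in common, so they are dropped here.
    body = "<br>".join(str(value) for value in record.values())
    return (
        "<html><head>"
        f"<title>{title}</title>"
        f'<meta name="description" content="{description}">'
        f"</head><body>{body}</body></html>"
    )

for record in records:
    print(render_page(record))

The point of the sketch is just that the boilerplate labels are identical on every page, while the field values are not, so stripping the labels raises the share of each page that is unique.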