I know Google can flag text content as duplicate, e.g. the same news story posted in multiple places. But what about a collection of links? I ask because, let's say you use 10 social bookmarking websites, and you leave out descriptions to avoid duplicate content. But across those 10 different social bookmarking sites, the same 10 links appear. Do YOU believe Google would consider that duplicate content, or do they need more than just links? Clay
Probably not. Google will not penalize your content or links that easily, because millions of websites get backlinks from different sites with the same tags and anchor text. Sites like Google itself, Yahoo, Joomla, CNet, or MSN rarely get descriptive anchor text or tags like "best search engine", "free email provider", or "open source CMS"; instead they mostly get just the domain name as the anchor or tag.
Not duplicate content in my experience. However, I would suggest you vary the anchor text between the different properties; it would work to your benefit. Best of luck.
Yeah, duplicate content always causes trouble for sites, so you should try to get links from within relevant, original content on the page where you want to place the link.