Anyone have suggestions on how to deal with syndicated, near-duplicate content that I want indexed? This isn't for the sake of spam. Assume I have two sites: one covers restaurants in Los Angeles, the other covers Hollywood. They overlap; consider the Los Angeles site as having everything.

Because there is so much content (a dynamic index), I can't dedupe (automatically or otherwise) and direct searches for the Hollywood stuff to that domain, so both the Hollywood and LA copies are being indexed. At the same time I can't redirect, because I want people who come into the LA domain to stay there, even when they find a restaurant that happens to be in Hollywood.

What I had in mind was using sitemap priority to say that everything on Hollywood.com is higher priority than everything on the LA site. The engines would crawl both, but I'd have told them the Hollywood copies are the more important ones as far as they're concerned. Would this work? And since it's only two overlapping domains, should I even care - is the duplication really that big of a deal? Any other suggestions? Thanks!
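Edit: to make the priority idea concrete, here's a rough sketch of what I was picturing (the URLs are made up, and each domain would of course get its own sitemap file, since as I understand it <priority> is only a relative hint within a single site's sitemap, not something engines compare across domains):

# Sketch: one sitemap per domain, with the Hollywood copy of an
# overlapping page marked higher priority than the LA copy.
# Hypothetical URLs for illustration only.
sitemaps = {
    "hollywood-sitemap.xml": [
        ("https://www.hollywood.com/restaurants/musso-and-frank", "0.9"),
    ],
    "la-sitemap.xml": [
        ("https://www.la.com/restaurants/musso-and-frank", "0.4"),
    ],
}

for filename, entries in sitemaps.items():
    with open(filename, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for loc, priority in entries:
            f.write("  <url>\n")
            f.write(f"    <loc>{loc}</loc>\n")
            f.write(f"    <priority>{priority}</priority>\n")
            f.write("  </url>\n")
        f.write("</urlset>\n")

If priority really is only relative within one domain, then this obviously wouldn't signal anything across the two sites, which is part of why I'm asking.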
Generally, duplicate content does get tracked, and while it isn't penalized outright, the site carrying the duplicates may end up with lower PR. I've also heard that Google is working on algorithms that give preference to the original source, so there's a real prospect that sites made up mostly of syndicated content will suffer. There is a way out, though: editing or rearranging the content so it reads as original seems to work (at least it has for me). Along the same lines, outfitting your Hollywood pages with a custom blurb that makes each page sufficiently different could also do the job. Hope this helps.