Okay, first of all, I'm pretty sure I know how this works: when a search engine finds content duplicated on another site, the site's rank goes way down, right? My problem: I'm about to open a video game cheats site. Video game cheats are pretty much the same across all the other sites. What will happen to my SEO? Is there any way I can let search engines know that I have cheats for a certain game without having them index the actual cheats? Any help before I open the site would be awesome. Thanks.
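For example, could I put something like this on each individual cheat page? (Just guessing at the tag here, I haven't tested it on a live site.)

```html
<!-- Guess: "noindex" keeps the cheat page itself out of the index, while
     "follow" lets crawlers still follow links through it from my indexed
     game listing pages. -->
<meta name="robots" content="noindex, follow">
```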
You'll need to provide other unique content and page structure on your website to avoid duplicate content. Overall, cheats are probably worded and displayed differently on each site; I'm sure that with proper formatting you can make yours unique.
Then don't take my opinion on the subject, since you apparently know so much about what's properly considered duplicate content in the first place... or do you?
Okay, you're right, I'm the "newb" in the matter. Sorry about that; I just wanted to be 100% sure that my new site's search engine standing wouldn't be tainted.
Duplicate content isn't as simple as matching a "small phrase," since finding the same 4-5 words in a row on different web pages is quite likely.
Run it through Copyscape Premium (there are free services, but they're not that good); it will tell you whether it's a dup or not... and a dup is a dup...
Duplicate content isn't just about the actual words on your page; you need to take your page structure into account too. Two or three matching words are inevitable, so search engines look at big chunks of content.
You need other content that changes on a page-by-page basis. For instance, under every cheat, include links to and summaries of other related cheats from the same game manufacturer, and that sort of thing: things that are generated dynamically per page. Simply having a different menu system than other sites, or a paragraph of copyright text in your footer, isn't going to cut it. Google is working on and improving a boilerplate-detection algorithm that ignores elements appearing sitewide (menu systems, footer copyright notices, etc.), so for scoring it looks at just the page content.
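To illustrate the idea above, here's a minimal sketch of generating a unique "related cheats by the same manufacturer" block for each page. All the data, names, and functions here are invented for the example; a real site would pull this from its own database.

```python
# Hypothetical sketch: per-page dynamic content, so every cheat page
# carries text that differs from other sites (and from this site's own
# boilerplate). Data and identifiers are made-up examples.

CHEATS = {
    "god-mode":   {"game": "Space Blaster", "maker": "MegaGames",
                   "summary": "Enter UP, UP, DOWN at the title screen."},
    "all-levels": {"game": "Space Blaster", "maker": "MegaGames",
                   "summary": "Hold L+R while the game loads."},
    "big-head":   {"game": "Kart Rally",    "maker": "MegaGames",
                   "summary": "Finish first on every cup."},
}

def related_cheats(slug: str) -> list[str]:
    """Short summaries of other cheats from the same manufacturer."""
    maker = CHEATS[slug]["maker"]
    return [
        f"{c['game']}: {c['summary']}"
        for s, c in CHEATS.items()
        if s != slug and c["maker"] == maker
    ]

def render_page(slug: str) -> str:
    """Build the page body: the cheat plus its unique related-cheats block."""
    cheat = CHEATS[slug]
    lines = [
        f"<h1>{cheat['game']} cheat</h1>",
        f"<p>{cheat['summary']}</p>",
        "<h2>Related cheats</h2>",
    ]
    lines += [f"<li>{r}</li>" for r in related_cheats(slug)]
    return "\n".join(lines)
```

Because the related-cheats list is computed from the page being viewed, no two pages end up with the same supporting text, which is the kind of per-page variation the post above is describing.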