I have been writing a lot of articles and submitting them to various directories. I also did some spinning and such. But recently I read an article from a well-respected individual that said Google does indeed not slap you for duplicate content. Does anyone have any other input on this? Thanks much.
Here's the deal. Google will not rank two pages on the same website with the same content; they'll only rank one. This doesn't apply to article marketing. (Think of a press release: once it gets syndicated, the exact same content is all over the place, but there's no slap.) If you submit the same article without spinning it to 300 article networks, Google will index them all at first. Then, over time, they will filter out the ones that come from smaller article networks that don't have much PageRank or link juice. All of your higher-quality links will stay. Spinning and rewriting your articles can minimize this some, but if you're submitting lots of articles there's really not much of a need. As long as you're submitting good stuff, you'll be OK.
I wish I could agree that there is a duplicate content penalty, but I no longer believe it's completely true. An article of mine that people keep stealing and posting verbatim has shown me that the copies do indeed get indexed, some quite well. (It's how I find them.) Thankfully, my article has been online long enough that so far Google consistently ranks it higher than the stolen copies. I suppose one could argue that the duplicate content penalty is imposed by placing the copies further down in the SERPs - not good enough as far as I'm concerned. It seems to me that if the Google bot can figure out mine's the original, it should also truly penalize the duplicates by not indexing them.

Another example: I was researching something in the heavy equipment industry for a client, and I must have seen the same article well indexed on at least 20 different sites. Google always talks about enhancing the search experience. How having all top 10 results in the SERPs point to the exact same article on 10 different sites is helpful escapes me, but apparently Google thinks it is. Obviously no penalties there; all of the copies were indexed, just like they are with my article.
What YMC said. Google's spiders may be programmed to ignore pages that have duplicate content (is there 100% certainty either way?), but it isn't foolproof. To this day, I haven't found one updated quote from one of Google's engineers, Matt Cutts, saying either way. When in doubt, just don't do it. Period. Me, personally, I couldn't care less about search engines. (It's no guarantee you will rank high anyway.) I'll just market the page until it bleeds. That's where the real traffic is. Link, please.
I have written articles and put them on EzineArticles and places like that. They always get copied and pasted onto other sites, and Google indexes those copies, because I find them in Google. But if I take a sentence from the article and put it in Google in quotes, the original place the article came from will always come up first, usually with 3 listings and "Show more results from ezinearticles.com" (or whatever site). So the copies get indexed, but they don't have the kind of juice the original page has.
I thought that if they found out you had duplicate content you suffered an additional penalty, like starting to slide back in the rankings.
Google does not slap you for duplicate content, because by the time you're writing content you should already know the guidelines for what is right and wrong in publishing it.
The duplicate content danger is within your own site, like duplicate titles. Duplicate content with another site? No problem.
The answer is obvious to anyone with even 3 neurons. You are a search engine. Your 'product' is your search results. What would you like to show your user today? (1) A page of links to the exact same article on 10 different websites, or (2) 10 different articles, all of which suit the search terms. The fact anyone even has to ASK this question shows how unprepared people nowadays are to THINK.
contentboss, you are thinking like someone who actually uses the search engines as opposed to the people who run them. It's been my experience that the search engines are so intent on indexing the entire web that they don't mind a bit if that means serving a "page of links to the same article on 10 different websites." I see it over and over again in my research - particularly for clients working in highly specialized fields with smaller audiences.
Add 1 neuron. When there isn't anything else, then yes, they will show you a page of identical stuff, because it's better than a blank page. Otherwise - and they are all upfront about this - search engines all want diversity in their results. I don't understand why people have such a hard time understanding this. Go to Google. Search for credit cards. Show me the same article on page 1. Acne? Dog training? Cheap flights? Not a frikkin chance. Sure, you can find it for very long tails, such as extracts from song lyrics, but not for anything that can be used to make money, which, I assume, is why everyone is here. duplicate content penalty explained
Ah, but the niches you list are among the most competitive on the web. I would think the body of written words on topics like those is immense. It would seem to me the topics where money can be made are exactly the ones with so much duplicate crap floating around. I would think if someone started a site in those niches that actually had unique and useful information, they could make money. BTW, the people working in the niche I am referring to sell equipment for hundreds of thousands of dollars - they certainly think it's something they can make money from; why else would so many sites be using the same news articles?

I read your article. I wish the people who keep stealing my article actually did get penalized. When my original article is continually displayed in the results only a few positions higher in the top 10 than the copies, I see very little sign of any penalties. It's an article about pets, so it is in one of the high-competition niches. When someone posted the article in its entirety on Yahoo Answers and the <big giant pet food company everyone has heard of>.com website pulled the Answers page onto their site, their site briefly outranked mine. So, they posted a copy of a copy and actually outranked the original because of the strength of their website. My little dinky pet site didn't have a chance against an international powerhouse like them. Fortunately, when Yahoo pulled the offending piece, it automatically came down off the pet food site too.

So, you can quote Cutts or make fun all you want, but at the end of the day I've seen no penalties for the thieves until I find them and send DMCA notices to them or their webhosts. I've lost count of how many hours I've lost chasing these jerks. And while Yahoo and MySpace have been quite responsive to my requests, each and every time someone posts my article I have to send a separate request.
I bet I've already spent 2-3 weeks of work time chasing the thieves just to protect my copyright and site. Maybe your experience has been different. But, from where I'm sitting, the only penalty for duplicate content is when a site visitor leaves because the site has nothing new to say, or when the true owner requests the content be removed. And if the offending content is in China or India, you're pretty much screwed and can only hope your site maintains a stronger presence on the web and continues to outrank the thief. Google's hunger for new content created this mess, and only Google can stop it. It seems like when the whole splog thing started they might actually have moved to do something. But, again, from everything I've seen, DMCA notices and other legal threats are the only true penalty.

When I talk to writing clients about using duplicate content, strangely enough, most have heard of this duplicate content penalty thing with Google. Few seem to understand the whole copyright infringement aspect of things and how prospective clients might lose trust in them. They know the myth better than the reality. Go figure.
I've only seen it a few times in X amount of years, but, like YMC, I have seen the same content ranked high. I couldn't believe the shit, but there it was. It rarely happens, but.... The last time was when I saw the second piece near the bottom of page two. There was no mistake, no "you just thought it was the same piece." It does happen. I would think that Google does indeed ignore duplicate content. They would have to, for the obvious reason. But it's obvious that they have glitches in their programming. I don't know why so many people even discuss it, or worry about it anyway. When in doubt, just don't do it. Period. What's so freakin' hard??? But many savvy webmasters will go ahead and put in duplicate content, like using syndicated articles. It could be a damn good piece for their readers to enjoy, thus getting them to come back time after time. That is FAR more valuable than overrated organic traffic. Also, if the webmaster of the original piece deletes the page, lets his site slide downhill, or whatever, it will drop from Google's database. (This is one reason why spiders come into our sites all the time.) The person who has that same piece, and who is also collecting a buttload of backlinks to it, will rank higher and higher, while at the same time getting more readers from the websites that have the link.
Someone PLEASE post a link to a Google search for a topic the average Joe can use to make some money that also has a page full of dupes. Please. Oh... you can't. And btw YMC, you just repeated my point about your client with the hundred-thou product. Google is happy to display dupes when there isn't anything else because... (for the umpteenth time) it's better than a blank page, i.e. IT IMPROVES THE FRIKKIN USER EXPERIENCE. You guys really don't get it, do you? Google wants the best product it can show. Deduping generally improves the product because it introduces diversity. Diversity is what users want. Users come back to Google. Google makes money. It's possible to show exceptions, because this isn't a perfect world, in case you hadn't noticed. But generally, search engines dedupe nowadays because it improves the user experience. Anyone starting to notice a theme running through this? User experience. No? Ah well, never mind. Keep posting your dupes, kiddies. You make life easier for the rest of us. Sheesh. It's like trying to explain the rules of cricket to my goldfish Billy.
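For anyone wondering what "deduping" means mechanically: one standard way to catch near-identical pages is to compare overlapping word shingles with Jaccard similarity. This is a toy sketch of that idea in Python - the function names and the 0.9 threshold are my own invention for illustration, and nobody outside Google knows what their real pipeline does:

```python
def shingles(text, k=4):
    """Break text into overlapping k-word shingles (sets of short phrases)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def dedupe(results, threshold=0.9):
    """Keep only the first (highest-ranked) copy of near-identical pages."""
    kept = []
    kept_shingles = []
    for page in results:
        s = shingles(page)
        # Keep the page only if it isn't too similar to anything already kept.
        if all(jaccard(s, prev) < threshold for prev in kept_shingles):
            kept.append(page)
            kept_shingles.append(s)
    return kept
```

An exact copy shares every shingle with the original (similarity 1.0), so it gets filtered; a genuinely different article on the same topic shares few 4-word phrases and survives. That's consistent with what posters here report: the original, strongest copy stays in the results and weaker duplicates get filtered over time.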
People seeing the rare occasional duplicate piece doesn't mean a "page full of dupes." It means a duplicate has gotten in through the cracks. Google's computer system is not infallible. Shit happens. YMC, I don't know... maybe next time you should post a link to the pages that have the duplicates.
Well said. Google's system is now so huge and woolly that probably not even THEY know how it works anymore. All WE can do is look at the results as they come in and draw conclusions. For some considerable time now, anything attracting an AdWords bid of more than a couple of cents has been deduped (to the best of their abilities). This is actually a good way to gauge how hard Google is working on a term - no AdWords means no competition, which means probably not a lot of diversity for it, which in turn means you're likely to see dupes. I remember some luddite on 'the other forum' trying to 'prove' that he had a great term that showed a page full of links to his article. Problem was, the term got 2 searches a month, no one was bidding on it in AdWords, and there was no competition. It was, like, 4 words or something, and because the first 2 words were 'internet marketing' he thought he'd somehow disproven the dupe content reality. Look, he said, 30 million competing pages! If you put the term in quotes, there were 11. That's eleven pages, not eleven million. This is newbie stuff, kids; get with the program. Edit - btw, it's not just Google: "Bing removes pages with duplicated content from search results. Although duplicated pages aren't always removed from the index, only those with the highest ranking are included in results."