If I write an article and post it on my website, then submit that article to ezinearticles, will there be any issues with duplicate content? I honestly didn't even think about it and placed it on both. Is this OK? Will Google stomp on me for this?
Well, I believe it will be an issue with both ezinearticles and Google, because Google does not crawl duplicate content, and that will be discouraging for your website traffic. Secondly, submitting it to ezinearticles means the article is publicly accessible, and there is a possibility that people will republish it on their own blogs/websites.
I see. Well, luckily it had not been approved yet, so I moved it to a pending state until I get further clarification. Anyone else?
All you need to do is a quick rewrite of the article and then submit it. Change the headline and rewrite the first few sentences of each paragraph. This can all be done in 15 or 20 minutes max. Don't just change the order of the paragraphs; actually change verbs, adjectives, etc., and entire sentences. After that you are good to go.
I have heard that Google will favor the article on ezinearticles, even if you post the article on your own site first. They basically trust ezinearticles more and it has more authority. So the article on your site may not get indexed. authoritydomains' advice is probably a good way to get around the problem.
If it's using your site feed and working like a social network, then I don't think it will affect your site; you will not get any penalty for it!
Go with authoritydomains.com's advice. @prakash86: No, it's not true that Google will "not crawl duplicate contents" (sic). It will crawl, but it may decide to list only one of the two sites, or rank one significantly higher than the other. It will try to list the first source higher, but that may not work due to the higher trust (or authority) the second site has. I'm sure you meant that.
I believe I have answered this question here: http://forums.digitalpoint.com/showthread.php?t=1009643 I directly quoted from Google that they choose which version to rank higher. Also, someone here suggested that some duplicated items may still be indexed so long as they give credit, much like Wikipedia.
So long as they give credit? That's an interesting thought. How many humans are actually part of the process....or is that also automated?
I think it is more human than automated. Google uses both. As we can remember Wikipedia is a conglomeration of information from different sources. I don't think it is duplicate content on a per word basis though. Still, it is an interesting theory worth trying out.
If both articles are the same with zero changes, then it's possible that one of them will lose out in Google. Possibly yours, because ezine is a big, popular site, and I'd guess more so than your site.
Ezine Articles would not publish an article which has already been published elsewhere, in your case it is on your website. With duplicate content, there is no penalty as most people think. All Google does is ascertain the original source of content (based on factors) and shows it for relevant searches. If you publish the article on the website, make sure that the article is cached before you start syndicating it. While syndicating make sure to link back to the original source of the article which would be your website.
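To illustrate the link-back suggestion above, a syndicated copy might end with a short attribution pointing at the original article on your site (the URL and title here are made up for the example):

```html
<!-- Attribution footer on the syndicated copy; domain and title are hypothetical -->
<p>This article was originally published at
   <a href="http://www.example.com/my-article.html">My Article Title</a>.</p>
```

A plain link like this helps Google associate the syndicated copy with its original source.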
I don't see how Google can penalise your site for duplicate content on another site. I have no control over whether someone nicks my articles and sticks them on another site, so why should I be penalised for it? Duplicate articles within my own domain - yes, penalise me. But not if someone has pinched one of my articles and stuck it on a website that I have no control over.
I suggest you only submit your articles to article directories, after google indexes your articles. That way, your site will get all the credit for the new content.
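One quick way to check whether Google has already indexed the article before syndicating it is a `site:` search (the domain and title here are placeholders):

```
site:example.com "exact title of your article"
```

If the page shows up in the results, it has been crawled and indexed, and it should be safer to start submitting the article to directories.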
OK, I think I'll try waiting for Google to index the article, then submit it with some minor changes and a reference to the original article. Thanks for all your input.
I asked ezinearticles about having the article on my site as well as submitting it to them for publication. They said no problem, and it was fine to do. Here is Google's take on duplicate content: "During our crawling and when serving search results, we try hard to index and show pages with distinct information. This filtering means, for instance, that if your site has articles in "regular" and "printer" versions and neither set is blocked in robots.txt or via a noindex meta tag, we'll choose one version to list. In the rare cases in which we perceive that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. However, we prefer to focus on filtering rather than ranking adjustments ... so in the vast majority of cases, the worst thing that'll befall webmasters is to see the "less desired" version of a page shown in our index." Zeek
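For anyone wondering what the robots.txt or noindex blocking mentioned in that quote looks like in practice, here is a minimal sketch, assuming the printer-friendly pages live under a /print/ directory (the path is an assumption):

```
# robots.txt - keep printer-friendly copies out of the index
User-agent: *
Disallow: /print/
```

Or, alternatively, on each printer-friendly page itself:

```html
<meta name="robots" content="noindex">
```

Either approach tells Google which version of the page you want indexed, instead of leaving the choice to their filtering.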