Yes, there is still some duplicate content, but it's a lot better. They end up with 10x the number of links.
You can do that if you change it enough, but with that amount of work you might as well write a new article with new keywords and new anchor text. You can target a lot more keywords that way.
Exactly, so you effectively have 50 duplicated articles out there, right? I just wanted to point that out, as I didn't really understand the meaning of your post - are you saying duplicate content is bad?
Duplicate content is fine as long as it's on other people's websites. Publish unique content on your own site and you are fine.
Duplicate articles (on other sites) are bad only in the sense that they won't benefit the company you're posting them for. So the more unique articles you post, the better. If, for example, you post one article to 500 directories, you'll see a surge in backlinks, and then a significant amount (but not all) will drop off. So the more uniques you submit, the more backlinks you keep in the long run.
So, why do you submit 10 unique articles to 50 article directories each? Why not just 10 unique articles to 10 article directories? Because, as you said, "Duplicate articles ... won't benefit the company your [sic] posting them for." I don't agree with the last statement - I think the additional articles - albeit duplicates - can benefit your customers. Apparently, you think it benefits them also or why would you do it? I like your process - 10 unique articles sprinkled across 500 article directories - I am just confused when you say "dup is bad".
Submitting to article directories is part of my link-building campaigns, and I don't specifically expect traffic from it... what I expect is backlinks. Expecting traffic would be like hoping people browse web directories, find the needle in the haystack, and visit your website: the chances are very slim if the effort is at the level of 5 or 10 articles. Overall, it depends on the quantity and quality of the articles you post and the industry you're trying to break into. Best regards
So can someone clear up a related question for me please? If I were a newb looking to hire someone to write an article for my site and then manually submit it to the article sites, what is better for my site with a view to generating backlinks long term? (Assuming the article is well written, of quality, and related to the target site.) Also, given that I don't have the biggest budget, what's the best bang for the buck as a starting point for this method?
a) 1 article submitted to the main 20 article sites by PR
b) 1 article submitted to a mix of 150 article sites
c) 1 article submitted to over 700+ article sites
d) Multiple (3+) articles to just the major top-PR article sites, i.e. 5-6
e) I'm way off - some other approach altogether
You know what I'm getting at: 1 article to heaps of sites, or a few different articles to fewer sites? I guess it's similar to the whole question of the value of duplicate content and backlinks. Would love to get some answers, guys - it would be a huge help. Thx
Duplicate content is not applicable to article submissions; it's a whole different subject. The only time it might be is if you publish the same article on your own site as on hundreds of other sites. Even then, if your site is trusted it will outrank the others anyway. Having the same article on hundreds of other websites is not what duplicate content is about.
Like I said before, you lose a certain number of links because the search engines see them as duplicate content, but you don't lose all of them; you keep a percentage. Submitting 500 unique articles, one to each of the 500 directories, would be overkill. The more uniques you submit the better, but I've found 10 to be a good middle ground.
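The trade-off described here can be put as a back-of-the-envelope calculation. This is a minimal sketch, assuming (purely for illustration) that a fixed fraction of duplicate submissions keeps a counted backlink; the 30% retention rate is a made-up number, not a measured figure from the thread.

```python
def surviving_links(unique_articles, directories_per_article, retention=0.3):
    """Rough model: each unique article goes to a batch of directories,
    but only a fraction of the duplicate copies keeps a counted backlink.
    The default 30% retention is an illustrative guess."""
    submissions = unique_articles * directories_per_article
    return round(submissions * retention)

# 10 unique articles, each spread across 50 directories (500 submissions)
print(surviving_links(10, 50))  # 150
```

The point of the model is only that more unique articles raise the retained total, which is why writing 10 uniques beats blasting 1 article everywhere.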
I beg to disagree. The search engines don't make an exception for duplicate content. They don't see it on an article directory and think, "oh, this is okay, this is an article". While publishing the same article to hundreds of article directories won't hurt your own site, the search engines still see the duplicate copies on the other directories and devalue the links within the article (or remove them completely).
The duplicate content filter simply means that if Google sees lots of pages that are the same it will filter some of them from the search results for a particular keyword. It has nothing to do with backlinks.
I have never heard that duplicate content had any bearing on backlink value in the engines. Can you point us to some references, or possibly an example where you have seen this? It would be most helpful. The reason I ask: many of my backlinks are from scraper sites. Based on your theory, those backlinks should not be showing up.
I'm gonna say the answer to this question is a variation of d). The reason is that I side with Search-Placement. I had 126 links from an article I submitted earlier last month; now I have 46. Who knows how many I'll have in a month. The problem with article submissions is that they are most often republished by sites that are one step up from scrapers. As a result, most of your republished articles never become quality backlinks... in fact, most of the time they end up in the supplemental index. However, I still believe they have the ability to drive traffic. The reason I say it's a variation of d) is that I wouldn't submit based on PageRank. I would submit based on Yahoo domain backlinks. The more Yahoo domain backlinks an article submission site has, the more its articles have been republished. The more articles you get republished, the more traffic and the greater the potential that a few of those republished articles might even turn into viable links. Here is a list of 80 article submission sites that already have the backlink (and PR) stats attached to them.
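The selection rule described above is easy to sketch: rank the candidate directories by domain backlinks rather than PR. The site names and numbers below are entirely made up for illustration; in practice you'd plug in the stats from a list like the one mentioned.

```python
# Hypothetical data: three article directories with PR and
# Yahoo-style domain backlink counts (all figures invented).
sites = [
    {"name": "directory-a.example", "pr": 6, "domain_backlinks": 1200},
    {"name": "directory-b.example", "pr": 4, "domain_backlinks": 9800},
    {"name": "directory-c.example", "pr": 7, "domain_backlinks": 450},
]

# Sort by domain backlinks (a rough proxy for how widely the
# site's articles get republished), highest first.
by_backlinks = sorted(sites, key=lambda s: s["domain_backlinks"], reverse=True)

for s in by_backlinks:
    print(s["name"], s["domain_backlinks"])
```

Note that a PR-based sort would pick directory-c first, while the backlink-based sort puts directory-b on top, which is exactly the disagreement with option a).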
One thing people forget is that when you submit to an article site, you are giving everybody a free article to publish on whatever blackhat scraper AdSense p0rn website they want, as long as they keep your link in place. If you think submitting to 5 sites is better than 200 sites, that's your call - just remember it might also appear on 5,000 scraper sites inside a year.
That's the whole idea. But if 1% of those scraped pages acquires some PageRank, then you'll have 50 pretty good links.
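The arithmetic behind that estimate, as a quick sketch: the 5,000 copies come from the previous post, and the 1% is the poster's own guess, not a measured figure.

```python
scraped_copies = 5000          # scraper-site copies (figure from the post above)
# Assume 1% of those copies ever acquire some PageRank (the poster's guess).
useful_links = scraped_copies // 100

print(useful_links)  # 50
```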
That's my point - any link can be good, even if it's got the same content as 5,000 other pages. Imagine if Yahoo picks up your article - is that a bad link just because it's got duplicate content on it?