I am wondering about the duplicate content filter... If I grab an article from an article website that has only just been published, I am likely to be one of the first to get it indexed on my site. I suppose there is even a small chance that I could have it indexed first. Say the article on my site got indexed 2nd or 3rd, does the page get less of a battering from the duplicate content filter than if it was the 1000th? Or, if you are not indexed first, do you get the same treatment regardless?
I don't know if they care who gets indexed first. I have also heard the site with higher PR comes out on top, but honestly I don't know. All my content is original, so I don't have to worry about it. If I were you, I'd try to avoid the dup content penalty by putting two articles side by side on the same page, or by putting only excerpts of the articles (several per page) with a link to the original location, or something like that.
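For what it's worth, here's a rough sketch of the excerpt idea. The function name and the "read more" wording are just my own choices, not anything official — the point is that only a fragment of the article ends up duplicated, with a link back to the source:

```python
# Hypothetical sketch: trim an article down to a short excerpt with a
# "read more" link back to the original, so only a fragment of the
# text is duplicated on your page.
def make_excerpt(article_text, source_url, max_words=50):
    words = article_text.split()
    excerpt = " ".join(words[:max_words])
    if len(words) > max_words:
        excerpt += "..."  # signal that the text was cut off
    return f'{excerpt} <a href="{source_url}">Read the full article</a>'
```

You'd then list several of these excerpts per page, each pointing at its original location.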
I've seen EzineArticles ranked in the top 3 for airfare keywords with second-hand duplicate articles. Yes, in Google. So there are clearly other factors at play with duplicate content.
As long as you incorporate other text on the webpage, duplicate content shouldn't be that much of a problem, as far as I am aware.
Yeah, as long as you add enough other text on the page you should be fine, but you need to add enough to offset the dupe content. I have seen people add RSS feeds and other random text to make the page look unique.
User comments are also a good idea, as long as you can get visitors to actually post them. I never really managed that...
To avoid problems, you can add an RSS feed, and also a small related-articles section under the articles...
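In case the RSS bit sounds mysterious: a feed is just an XML file listing recent items. A rough sketch of pulling titles and links out of a standard RSS 2.0 feed (function name is my own, and you'd still need to fetch the feed XML yourself), which you could then render as extra, unique text under the article:

```python
# Rough sketch, assuming a standard RSS 2.0 feed: extract item titles
# and links so they can be shown as extra content on the page.
import xml.etree.ElementTree as ET

def feed_items(rss_xml, limit=5):
    root = ET.fromstring(rss_xml)
    items = []
    # RSS 2.0 nests <item> elements inside a single <channel>
    for item in root.findall("./channel/item")[:limit]:
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        items.append((title, link))
    return items
```

The same loop works for a related-articles box if you keep your own articles in a feed.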
That's a good idea, although knowing myself, I can see me not getting round to implementing it! I also like the idea of a related-articles bit; now that, I think, I could manage! I'm not particularly well up on the whole RSS thingy... Thanks for the ideas guys, I'll be looking to do something about them shortly.