Hey, I have a question. If I mix 60% original and 40% non-original content on my site, how does Google rate that? I mean, for example, 3 original articles and 1 non-original one.
Theoretically, none of the non-original content will get indexed. Spend 10 minutes making those articles unique and that solves all your problems: Google will index all your pages and will like you far more for it. They will see you as a unique site and rank you higher as a result...
There is also the issue of keyword stuffing and duplicate content, so make sure your content differs from the original articles.
It appears that Google has taken a much stronger stance on duplicate content than it used to. Since the beginning of the year, I've actually seen a few sites de-indexed due to what I believe was too much duplicate content. To be safe, I recommend making your site 100% unique.
Okay, say I have a competitor who ranks higher than me and I want to outrank him. If I copy his content and put it on a page of mine that was created earlier than his, how will Google determine that the content is his?
Google can tell when the content in question first appeared on the internet. They probably archive it, similar to what archive.org does.
Google doesn't just use which copy of a piece of content they indexed first to determine which is the original and which are duplicates. That would be an unfair way to distinguish originals, since different sites have different crawl rates. If a site that gets crawled once per month publishes a piece of content, and another site that is crawled several times per day copies and republishes it, the site that gets crawled once per month would likely never get credit for its own content.

While the first-indexed date/time might be one factor they look at, they also consider other, likely more important signals, such as which version is most linked to by other sites and/or has a higher PR.

One way to help Google figure out which is the original and which are duplicates is to include a link in the article to itself. If others are scraping your site or getting your posts/articles/content via RSS feeds, having links to the original inside the copies increases the odds that the duplicates will link back to the original, and Google can use this to determine the original. Matt Cutts has suggested this on numerous occasions as a way to help Google tell original from duplicate content, especially with syndicated content like RSS feeds.
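If it helps, here's a minimal sketch of automating that self-link trick. Everything in it is hypothetical (the Post type and the with_source_link helper are made up for illustration, not any particular CMS's or feed library's API); the idea is just to append an attribution link pointing at the canonical URL before the body goes out in your feed, so scrapers that republish the feed verbatim end up linking back to you:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    body_html: str
    canonical_url: str  # the permalink on your own site

def with_source_link(post: Post) -> str:
    """Append a self-referential source link to the post body.

    Copies that republish the feed verbatim will then carry a link
    back to the original, which is the signal described above.
    """
    attribution = (
        f'<p>Originally published at '
        f'<a href="{post.canonical_url}">{post.canonical_url}</a>.</p>'
    )
    return post.body_html + "\n" + attribution

if __name__ == "__main__":
    post = Post(
        title="Example article",
        body_html="<p>Article text goes here.</p>",
        canonical_url="https://example.com/example-article",
    )
    # Hand this to whatever builds your RSS items instead of the raw body.
    print(with_source_link(post))
```

You'd hook this in wherever your feed items get built; the exact hook depends on your CMS or feed generator.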
The non-original content, if detected, will find its way into the supplementary results. Try spending some time rewriting the non-original parts so they look and sound more original.
That is not a good way to optimize your site. If you copy your competitor's website content, Google may give you a good ranking at first, but after 2 or 3 months your site will be in trouble.
I write an article and then publish the same piece in several places, sometimes just changing the title and adding a new introduction paragraph. All of them show up in Google's SERPs; I have a few keywords that bring up a front page dominated by my articles in different guises, despite the content being duplicated and therefore no longer original. I don't pinch other people's content, though. People who do that are thieves and should be banned from doing anything online... but I digress.