Content mix: what does Google think about it?

Discussion in 'Search Engine Optimization' started by frozzy, Jan 30, 2010.

  1. #1
    Hey,
    I have a question.

    If I mix 60% original and 40% non-original content on my site, how does Google rate that?

    I mean, for example, three articles of original content and one that is not original.
     
    frozzy, Jan 30, 2010 IP
  2. christoph

    christoph Active Member

    #2
    In theory, all the non-original content will simply not get indexed.

    Spend ten minutes making those articles unique and that solves all your problems: Google will index all your pages and will like you far more for it. They will see you as a unique site and rank you higher as a result...
     
    christoph, Jan 30, 2010 IP
  3. peterson82

    peterson82 Peon

    #3
    There are really two issues here: keyword stuffing and duplicate content. So make your content differ from the original articles.
     
    peterson82, Jan 30, 2010 IP
  4. frozzy

    frozzy Peon

    #4
    Actually, I asked what Google thinks about this and how it rates it.
    What effects could it have?
     
    frozzy, Feb 3, 2010 IP
  5. selectsplat

    selectsplat Well-Known Member

    #5
    It appears that Google has taken a much stronger stance on duplicate content than it used to. Since the beginning of the year, I've actually seen a few sites de-indexed due to what I believe was too much duplicate content. To be safe, I recommend making your site 100% unique.
     
    selectsplat, Feb 3, 2010 IP
  6. frozzy

    frozzy Peon

    #6
    Okay, for example: I have a competitor who ranks higher than me, and I want to rank higher than him.
    I copy his content and put it on a page that was created earlier than his.

    How will Google determine that this is his content?
     
    frozzy, Feb 3, 2010 IP
  7. selectsplat

    selectsplat Well-Known Member

    #7
    Google can tell when the content in question first appeared on the internet. They probably archive it, similar to what archive.org does.
     
    selectsplat, Feb 3, 2010 IP
  8. Lex350

    Lex350 Notable Member

    #8
    This is bad. You could make things worse for your site if Google finds out about this.
     
    Lex350, Feb 3, 2010 IP
  9. Canonical

    Canonical Well-Known Member

    #9
    Google doesn't just use which copy of a piece of content they indexed first to determine which is the original and which are duplicates. That would be an unfair way to distinguish originals, since different sites have different crawl rates.

    If a site that gets crawled once per month publishes a piece of content, and another site that is crawled several times per day copies and republishes it, then the site crawled once per month would likely never get credit for its own content.

    While first-indexed date/time might be one factor they look at, they are also looking at other, likely more important signals, such as which version is most linked to by other sites and/or has a higher PR.

    One way to help Google figure out which is the original and which are duplicates is to include a link in the article to itself. If others are scraping your site or getting your posts/articles/content via RSS feeds, then having links to the original inside the copies increases the odds that the duplicates will link back to the original. Google can use this to determine the original. Matt Cutts has suggested this on numerous occasions as a way to help Google tell original from duplicate content, especially with syndicated content like RSS feeds...
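    The self-linking idea above can be sketched in a few lines: append an attribution link pointing at the article's own canonical URL before syndicating it, so copies that republish the feed carry a backlink to the original. The function name, attribution wording, and URL below are illustrative assumptions, not anything specified in this thread.

```python
# Hypothetical sketch of the self-referencing link technique: before an
# article goes out via RSS, append a link back to its canonical URL so
# scrapers that republish the feed also republish the backlink.
# All names and URLs here are made up for illustration.

def add_source_link(article_html: str, canonical_url: str) -> str:
    """Return the article body with a self-referencing source link appended."""
    attribution = (
        f'<p>Originally published at '
        f'<a href="{canonical_url}">{canonical_url}</a>.</p>'
    )
    return article_html + "\n" + attribution

# Example: the syndicated copy now links back to the original page.
item = add_source_link(
    "<p>My article text.</p>",
    "https://example.com/posts/my-article",
)
```

    A scraper that blindly republishes `item` would then include a link to the original URL, which is exactly the signal the post above says search engines can use.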
     
    Canonical, Feb 3, 2010 IP
  10. Traffic-Bug

    Traffic-Bug Active Member

    #10
    The non-original content, if detected, will find its way into the supplemental results. Try spending some time rewriting the non-original parts to look and sound more original.
     
    Traffic-Bug, Feb 3, 2010 IP
  11. seo555

    seo555 Peon

    #11
    This is not a good way to optimize your site. If you copy your competitor's content, Google may give you good rankings at first, but after two or three months your site will be in trouble.
     
    seo555, Feb 3, 2010 IP
  12. 1associate

    1associate Peon

    #12
    I write an article and then publish the same article in several places, sometimes just changing the title and adding a new introduction paragraph. All of them show up in Google SERPs; I have a few keywords for which the front page is dominated by my articles in different guises. This is despite the content being duplicated and therefore no longer original.

    I don't pinch other people's content - people who do that are thieves and should be banned from doing anything online... but I digress.
     
    1associate, Feb 4, 2010 IP