Popular Blogging Platforms Suffer Search Engine Penalties

Discussion in 'Blogging' started by Caydel, Oct 20, 2006.

  1. #1
    Hello - I was just thinking about some issues in popular blogging platforms, and wrote a decent blog post on it. If anyone is interested, you can read it at:

    http://www.infohatter.com/blog/popular-blogging-platforms-may-suffer-search-engine-penalties/

    If you like it, feel free to Digg it at: http://digg.com/tech_news/Popular_Blogging_Platforms_Suffer_Search_Engine_Penalties

     
    Caydel, Oct 20, 2006 IP
  2. ahkip

    ahkip Prominent Member

    #2
    So all ten locations of that post have been indexed by Google?
     
    ahkip, Oct 20, 2006 IP
  3. Caydel

    Caydel Peon

    #3
    Yes, they have all been indexed from what I can tell. It's not the indexing that I am worried about - it's whether Google will rank the post as highly as it would have if only one copy of the post had been indexed on my site.
     
    Caydel, Oct 20, 2006 IP
  4. abdussamad

    abdussamad Active Member

    #4
    I don't think it works that way. The content is not really duplicate, because the search engines are looking at the whole web page. So though your post shows up in multiple places, each page also includes a lot of other posts that make it different from the other pages.
     
    abdussamad, Oct 22, 2006 IP
  5. Caydel

    Caydel Peon

    #5
    Respectfully, I have to disagree. A post of 200-300 words available in multiple places will definitely trip duplicate-content filters.
     
    Caydel, Oct 22, 2006 IP
  6. Phynder

    Phynder Well-Known Member

    #6
    Okay - I gotta ask. I keep hearing about the "duplicate content penalty" - I understand what duplicate content is, but what would the penalty actually be? Why would Google care if you have duplicated your content on your own site? Doesn't it make more sense that duplicate content across other sites would create a penalty?
     
    Phynder, Oct 22, 2006 IP
  7. abdussamad

    abdussamad Active Member

    #7
    OK, I'm no expert, but I have a few thoughts on this matter. Firstly, how do you know that a search engine can identify a post as a post? What algorithm could it use to do that? It can't identify it by the CSS or its position in the source code, because those change according to the software and theme you use. Also, if it is simply comparing content, then forum quotes and blockquotes would also trigger duplicate content penalties. That's just too high a false positive rate. I think this is one of the reasons microformats are being developed, so that search engines can easily identify blog posts and other elements (microformats.org/wiki/Main_Page).

    Also, consider the obvious problem: how could Google compare every block of text on every webpage with every block of text on every other webpage, on the same site or a different one? How many words should it compare? It sounds to me like that would be extremely computationally intensive and impractical.
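    For what it's worth, pairwise comparison isn't the only option. A common technique in the literature is "shingling": hash every run of k consecutive words in a document, then compare the resulting sets. Two pages that share a large block of text (like the same blog post) will share many shingle hashes even if their sidebars and surrounding posts differ. This is a minimal sketch of the idea in Python - Google's actual internals are unpublished, and all the example text and names below are made up for illustration:

    ```python
    # Sketch of shingle-based near-duplicate detection (not Google's actual
    # algorithm). Each document is reduced to a set of hashed k-word runs
    # ("shingles"); overlap between sets approximates textual overlap.
    import hashlib

    def shingles(text, k=4):
        """Return the set of hashed k-word shingles for a block of text."""
        words = text.lower().split()
        return {
            hashlib.md5(" ".join(words[i:i + k]).encode()).hexdigest()
            for i in range(len(words) - k + 1)
        }

    def resemblance(a, b, k=4):
        """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
        sa, sb = shingles(a, k), shingles(b, k)
        if not sa or not sb:
            return 0.0
        return len(sa & sb) / len(sa | sb)

    # Hypothetical example: the same post appears on two pages with
    # different surrounding "theme" text, plus one unrelated page.
    post = "popular blogging platforms may suffer search engine penalties"
    page_a = "sidebar links archives " + post + " recent comments here"
    page_b = "different widgets today " + post + " another footer block"
    unrelated = "a completely different article about cooking pasta at home"

    # Pages sharing the post score much higher than the unrelated page.
    print(resemblance(page_a, page_b))
    print(resemblance(page_a, unrelated))
    ```

    Because the shingle sets can themselves be compressed into small fixed-size sketches and indexed, a search engine wouldn't need to compare every page against every other page directly - so "computationally impractical" may be less of a barrier than it first appears.
    
    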
     
    abdussamad, Oct 22, 2006 IP
  8. Phynder

    Phynder Well-Known Member

    #8
    abdussamad - you are right on target, in my mind. This duplicate content issue seems to be a lot of concern over nothing.
     
    Phynder, Oct 22, 2006 IP