What level of similarity between two documents is considered unique? Is there any software to test this?

Discussion in 'Copywriting' started by Big Zee, Nov 29, 2010.

  1. #1
    Hi,

    I am considering getting a number of articles re-written by professional copywriters.

    What I want to know is if there is any easy way to measure the difference between the original article and the new re-written one - i.e. is there software that can easily compare the two documents (bearing in mind that they will not be online at this stage)?

    If there is a method of doing this, what should I tell the copywriters is the minimum standard of difference I will accept for a re-written article? I.e. 30%, 40%, 50%, etc.
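
    (For anyone wanting a quick do-it-yourself check before buying software: Python's standard-library difflib can score two texts locally. This is a minimal sketch, not any particular product's method; the sample strings are placeholders for the real article files.)

    ```python
    # Score how similar a rewrite is to the original article.
    # Standard library only; works on local text, nothing goes online.
    import difflib

    def similarity_percent(text_a: str, text_b: str) -> float:
        """Return word-level similarity of two texts as a percentage (0-100)."""
        matcher = difflib.SequenceMatcher(None, text_a.split(), text_b.split())
        return matcher.ratio() * 100

    original = "The quick brown fox jumps over the lazy dog."
    rewrite = "A fast brown fox leaps over a sleepy dog."

    print(f"Similarity: {similarity_percent(original, rewrite):.1f}%")
    ```

    A "40% similarity" threshold would then mean rejecting any rewrite where this score comes back above 40.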
     
    Big Zee, Nov 29, 2010 IP
  2. etienne

    etienne Member

    Messages:
    211
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    28
    #2
    This is actually very simple. Search for DupeFree Pro on Google and you will find it. As for the second question, it depends on your needs: around 75-80% would be more than fine.
     
    etienne, Nov 29, 2010 IP
    vishals likes this.
  3. YMC

    YMC Well-Known Member

    Messages:
    2,787
    Likes Received:
    404
    Best Answers:
    4
    Trophy Points:
    190
    #3
    If you're willing to pay for a "professional copywriter", why not have them write original material for you? My guess is that a truly professional copywriter wouldn't take on a rewrite job for a new client - with no trust built, there's no way to be sure the content hasn't been stolen. It also sounds like you need a content writer, not a copywriter.
     
    YMC, Nov 29, 2010 IP
  4. Perry Rose

    Perry Rose Peon

    Messages:
    3,799
    Likes Received:
    94
    Best Answers:
    0
    Trophy Points:
    0
    #4
    Yeah, that is what I was thinking.

    I'd rather go through a root canal with a half-drunk med student behind the drill than rewrite an article.

    Ok, maybe not that.

    Google knows there are God knows how many articles out there that resemble one another (hell, even some of Matt Cutts' pages from his blog wouldn't be indexed!). That can't be helped. So I wouldn't worry too much about "30%, 40%, 50%, etc."

    Since Google's spiders have been programmed to focus more on backlinks and the age of a site, I'd work on getting backlinks to those specific pages.

    Besides, it isn't a 100% certainty that the spiders even ignore pages that have duplicate content.

    I have seen sites that had their pages on page one of a search, and there was some duplicate content on their pages.

    But, when in doubt, just don't do it.

    Yes, there is a very easy way. ... Look at them with your own eyes!

    The best software for this is your brain. ... Use it.
     
    Perry Rose, Nov 29, 2010 IP
  5. nifty87

    nifty87 Member

    Messages:
    37
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    41
    #5
    DupeFree Pro is the solution to your problem.
     
    nifty87, Dec 17, 2010 IP
  6. dmester

    dmester Peon

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #6
    Definitely DupeFree Pro... and target at least 70% original.
     
    dmester, Jan 2, 2011 IP
  7. rseobeginner

    rseobeginner Peon

    Messages:
    208
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #7
    A good copywriter is also a good original writer, I think. In the world of blogging it's very hard to find a truly original article for a keyword, because in reality we are all expressing the same idea.
     
    rseobeginner, Jan 2, 2011 IP
  8. contentboss

    contentboss Peon

    Messages:
    3,241
    Likes Received:
    54
    Best Answers:
    0
    Trophy Points:
    0
    #8
    This is a common misconception. There is a vast difference between gathering the data for an index, and organising it for searching.

    A page containing nothing but 'beer beer beer beer' will get spidered - after all, it's a page.

    It just won't rank for anything anyone will search for.

    Most search engines operate an internal exclusion list, but you have to be pretty bad to get on it. They also try to follow the 'robots.txt' directives, again with varying success, as anyone who has ever seen their 'cgi-bin' contents listed by Google will understand.
     
    contentboss, Jan 3, 2011 IP
  9. konnessi

    konnessi Peon

    Messages:
    6
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #9
    I think it depends on subject as well. For academic papers the minimum is 20% original content. If you're trying to re-write something you can also try playing with paragraph order, sentence structure etc.
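
    (One caveat on reordering: a straight diff rewards shuffled paragraphs even when the phrasing is unchanged, so a phrase-overlap measure is a fairer test of a rewrite. A minimal sketch, using Jaccard similarity over word 3-grams; the sample strings are placeholders:)

    ```python
    # Jaccard similarity over word 3-grams (shingles): insensitive to
    # paragraph/sentence order, but sensitive to reused phrasing.
    # Standard library only.

    def shingles(text: str, n: int = 3) -> set:
        """Return the set of n-word shingles in the text."""
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard(text_a: str, text_b: str, n: int = 3) -> float:
        """Overlap of shingle sets: 0.0 (no shared phrases) to 1.0 (identical)."""
        a, b = shingles(text_a, n), shingles(text_b, n)
        if not a and not b:
            return 1.0
        return len(a & b) / len(a | b)

    original = "the cat sat on the mat and watched the rain"
    reordered = "and watched the rain the cat sat on the mat"

    print(f"Shingle overlap: {jaccard(original, reordered):.2f}")
    ```

    A rewrite that merely reorders sentences still scores high here, while one with genuinely new phrasing scores low.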
     
    konnessi, Jan 4, 2011 IP
  10. dotmu

    dotmu Greenhorn

    Messages:
    73
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    16
    #10
    DupeFree Pro is the answer!
     
    dotmu, Jan 4, 2011 IP