Hi, I am considering getting a number of articles rewritten by professional copywriters. What I want to know is whether there is any easy way to measure the difference between the original article and the new rewritten one, i.e. is there software that can easily compare the two documents (bearing in mind that they will not be online at this stage)? If there is a method of doing this, what should I tell the copywriters is the minimum standard of difference I will accept for a rewritten article? I.e. 30%, 40%, 50%, etc.?
This is actually very simple to find out. Search Google for Dupe Free Pro and you will find a tool for it. As for the second question, it depends on your needs: around 75-80% unique would be more than fine.
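If you want a quick offline check without any third-party tool, Python's standard library can compute a rough similarity percentage between two texts. This is a minimal sketch, not a plagiarism detector; the sample strings are made up for illustration:

```python
import difflib

# Hypothetical original and rewritten versions of a sentence
original = "The quick brown fox jumps over the lazy dog."
rewritten = "A swift brown fox leaps over a sleepy dog."

# ratio() returns a similarity score between 0.0 (nothing shared)
# and 1.0 (identical), based on the longest matching blocks
ratio = difflib.SequenceMatcher(None, original, rewritten).ratio()
print(f"Similarity: {ratio:.0%}")
```

A rewrite target of "at most 50% similar" would then mean asking for a ratio below 0.5, though character-level similarity is only a crude proxy for how a search engine judges duplicate content.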
If you're willing to pay for a "professional copywriter", why not have them write original material for you? My guess is that a truly professional copywriter wouldn't agree to take on a rewrite job for a new client; with no trust built up, there's no way to be sure the content wasn't stolen. Additionally, it sounds like you need a content writer, not a copywriter.
Yeah, that is what I was thinking. I'd rather go through a root canal with a half-drunk med student behind the drill than rewrite an article. OK, maybe not quite that. Google knows there are God-knows-how-many articles out there that resemble one another (hell, even some of Matt Cutts' pages from his own blog wouldn't be indexed!). That can't be helped, so I wouldn't worry too much about "30%, 40%, 50% etc." Since Google's spiders have been programmed to focus more on backlinks and the age of a site, I'd work on getting backlinks to those specific pages. Besides, it isn't a 100% certainty that the spiders even ignore pages that have duplicate content. I have seen sites with some duplicate content on their pages rank on page one of a search. But when in doubt, just don't do it. And yes, there is a very easy way: look at them with your own eyes! The best software for this is your brain. Use it.
A good copywriter is also a good original writer, I think. In the blogging world it's very hard to find a truly original article for a keyword, because in reality we are all expressing the same idea.
This is a common misconception. There is a vast difference between gathering the data for an index, and organising it for searching. A page containing nothing but 'beer beer beer beer' will get spidered - after all, it's a page. It just won't rank for anything anyone will search for. Most search engines operate an internal exclusion list, but you have to be pretty bad to get on it. They also try to follow the 'robots.txt' directives, but once again, with varying success. As anyone who has ever seen their 'cgi-bin' contents listed by Google will understand.
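For anyone unfamiliar with the 'robots.txt' directives mentioned above, this is a minimal sketch of what such a file looks like; the path is illustrative, matching the 'cgi-bin' example in the post:

```
# Ask all crawlers to skip the scripts directory
User-agent: *
Disallow: /cgi-bin/
```

As the post notes, compliance is voluntary: well-behaved spiders honour these rules, but the file is a request, not access control, which is why 'cgi-bin' contents sometimes end up indexed anyway.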
I think it depends on the subject as well. For academic papers the minimum is around 20% original content. If you're trying to rewrite something, you can also try playing with paragraph order, sentence structure, etc.