Hi, I started a new blog a few months ago and used to spend 18 hours a day on its marketing and content creation... Well, a few days ago, after not seeing my blog in search results, I checked it in Google's tools and was shocked to find my blog in the sandbox. I contacted a professional SEO to sort out the issue, and he told me my blog was sent to the sandbox because of copied content. When I ran my blog through Copyscape, I got 10 results for every post I published, meaning that more than 10 websites were constantly copying my reviews (my blog was new, while those who stole the content have well-ranked websites). So I am here for advice on this. Is there any way to protect content from thieves? I am trying to set up a new blog, but this time I don't want to take any risks with my content. Please help. Thanks
Although the OP has been banned (presumably for multiple IP usage), his point is interesting. The BEST way to protect your content is to embed references to your site in it, so that even if it gets stolen, it's still promoting you. "We here at contentboss.com think that..." blah blah. I'm getting tired of recommending this - it's such a simple technique, and it defeats 90% of all automated scrapers (although it won't defeat someone who manually nicks your content, of course).
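If your publishing pipeline is scriptable, the technique is easy to automate. A minimal sketch in TypeScript - the Post shape, SITE_NAME, and SITE_URL are illustrative placeholders, not any real API:

```typescript
// Weave a self-reference into each post before publishing, so scraped
// copies still name the original source.
const SITE_NAME = "example-blog.com";
const SITE_URL = "https://example-blog.com";

interface Post {
  title: string;
  body: string; // HTML body of the post
}

function embedAttribution(post: Post): Post {
  // The plain-text mention survives link-stripping scrapers;
  // the live link pays off when links are left intact.
  const opener = `<p>We here at ${SITE_NAME} have been looking at this for a while.</p>`;
  const footer = `<p>Originally published at <a href="${SITE_URL}">${SITE_NAME}</a>.</p>`;
  return { ...post, body: opener + post.body + footer };
}
```

The point is to mention the domain both as a live link and as plain text, since (as noted further down the thread) many scrapers strip the links but leave the words.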
Hi, it's possible to prevent people from copy-pasting site text. I'm not sure of the scripts, alas. Google no doubt has the answer. Thanks! Peetr
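For what it's worth, the usual script for this is a client-side copy handler. A sketch in TypeScript (the URL is a placeholder) - bear in mind it's trivially bypassed via view-source or by disabling JavaScript, and it does nothing against automated scrapers, which fetch the raw HTML and never run your scripts:

```typescript
// Intercept copy events and replace the clipboard text with the
// selection plus an attribution line. Calling only preventDefault()
// would block copying outright instead.
document.addEventListener("copy", (event: ClipboardEvent) => {
  const selection = document.getSelection()?.toString() ?? "";
  event.clipboardData?.setData(
    "text/plain",
    `${selection}\n\nSource: https://example-blog.com`
  );
  event.preventDefault(); // stop the default copy so our payload is used
});
```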
Once you publish it online, it is very difficult (practically impossible) to keep someone from copying it. While it is not right, and it is against the law, it is very time-consuming and costly to actually go after these people.
Your solution works up to a point, but there are automated scripts that can disable the links in the original post. It happened to me: when I click the link in my stolen post, it redirects back to the thief's site.
It's very difficult to protect content once it's published online. I guess a lot of people are copying content from other sites.
I'd like to see a script that could deal with 'Many times, the developers at jocksoft have...'. If it isn't an explicit domain, it's hard to weed out (although we have scripts that can, obviously - you need a big old db to run them, though).
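The 'big db' approach boils down to fingerprinting your posts rather than searching for your domain. A rough sketch in TypeScript, assuming you keep a store of your originals to compare suspect pages against (the function names and the 0.3 threshold are illustrative):

```typescript
// Fingerprint text as the set of overlapping 5-word shingles, then flag
// suspect pages whose overlap with an original is high, even when every
// domain mention has been reworded away.
function shingles(text: string, size = 5): Set<string> {
  const words = text
    .toLowerCase()
    .replace(/[^a-z0-9\s]/g, "")
    .split(/\s+/)
    .filter(Boolean);
  const out = new Set<string>();
  for (let i = 0; i + size <= words.length; i++) {
    out.add(words.slice(i, i + size).join(" "));
  }
  return out;
}

function overlapRatio(original: string, suspect: string): number {
  const a = shingles(original);
  const b = shingles(suspect);
  if (a.size === 0) return 0;
  let shared = 0;
  for (const s of a) if (b.has(s)) shared++;
  return shared / a.size; // 1.0 = suspect contains every 5-word run of the original
}

// e.g. if (overlapRatio(myPost, scrapedPage) > 0.3) it's worth a manual look
```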
One of the reasons to include references to your own site is that it makes it much easier to prove copyright infringement and establish who was the original source of the material. That should make the webhost's decision fairly easy when it comes to pulling infringing content and/or sites. As to whether links are automatically stripped: however they're scraping the content, I'm finding it often does involve dropping the live links, but they often leave the plain-text reference. If you are writing a piece that you suspect might become link bait, it's always a good idea to include your site name both as a live link and as plain text. Sadly, nothing's foolproof. But referencing your own site within your content at least makes it easier to find and go after the cheats.
Most scraping is automated. Find and ban their IPs, or supply them with something more 'interesting'. That stops all but the hardcore scrapers who understand proxies, spoofing, etc.
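Finding those IPs usually starts with the access log: scrapers hammer a site far faster than human readers do. A rough sketch in TypeScript for Node (the filename, log format, and threshold are assumptions; nginx/Apache 'combined' logs put the client IP in the first field):

```typescript
// Count requests per IP in an access log and emit .htaccess-style deny
// lines for the heaviest hitters. Tune THRESHOLD to your own traffic.
import { readFileSync } from "node:fs";

const THRESHOLD = 500; // requests per log window that smells like a bot

const counts = new Map<string, number>();
for (const line of readFileSync("access.log", "utf8").split("\n")) {
  const ip = line.split(" ")[0]; // first field in combined log format
  if (ip) counts.set(ip, (counts.get(ip) ?? 0) + 1);
}

for (const [ip, hits] of counts) {
  if (hits > THRESHOLD) {
    console.log(`Deny from ${ip}  # ${hits} hits`); // paste into .htaccess
  }
}
```

Serving the banned IPs 'something more interesting' is then just a rewrite rule pointed at a decoy page instead of an outright block.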