Some websites automatically copy content from many other sites and pass it off as their own publication. How can this be prevented?
Thanks for the valuable reply, but I checked some sites better than mine and their content has not been copied. They must be doing something to prevent this.
Sorry, but EVERY web page can be copied. By definition, pages MUST be downloaded to a visitor's computer before they can be viewed, and once they are on the visitor's computer, YOU LOSE ALL CONTROL. The ONLY way to prevent copying is to NEVER allow the pages to be downloaded, and the ONLY way to do that is to NEVER make them available for viewing in the first place.
Following on from @mmerlinn's comments, you have a catch-22: making the site harder for bots makes it harder for users. If your site is never read, it can never be copied. How do some of the big sites prevent it? By having a team who scour the net for stolen content and threaten legal action. You could do that too and see whether the copier calls your bluff. You can disable right-click, and there is a CSS property that prevents people from highlighting your text, but any decent plagiarist will know how to get around that, so all you are really doing is penalising the legitimate visitor who wants to "open in new tab" or copy a phrase for Facebook etc.
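For illustration, the right-click and text-highlighting deterrents mentioned above boil down to something like the following sketch (class name and structure are made up for the example; as noted, anyone can bypass it via view-source, reader mode, or by turning off JavaScript):

```
<!-- Deterrent only: trivially bypassed and it penalises legitimate visitors. -->
<style>
  /* The CSS property that prevents users from highlighting text */
  .no-copy {
    user-select: none;         /* standard property */
    -webkit-user-select: none; /* Safari prefix */
  }
</style>

<div class="no-copy">
  Article text here...
</div>

<script>
  // Suppress the right-click context menu on the whole page
  document.addEventListener('contextmenu', function (e) {
    e.preventDefault();
  });
</script>
```

Note that neither technique stops a bot at all, since scrapers fetch the raw HTML and never run this code; it only inconveniences human readers.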
Most of the time, search engines can accurately pinpoint where content originated. If you are the original source, you do not have to worry that your site will rank lower than the site(s) that copied your content. The big players are not going to copy your content; small fish will, but since their sites are nowhere to be found in the search engines, you are not at any serious risk.
You could find a copyright-infringement legal letter template online and send it to the sites stealing your content. That will at least alert the scammers that you are on to them, and it may cause them to go elsewhere for their stolen content. It may have no effect whatsoever, but it cannot hurt.