Content, in SEO terms, means the text on your page: the information you have placed there. It does not include the HTML itself. Search engines do give you some credit if you use HTML tags properly; for example, put your targeted keyword in an h1 tag, and use the alt attribute for your images. "Copy" means that search engines have criteria saying content must not be the same as other already-indexed pages, otherwise it will be penalized. But I am still experimenting with this, because I have seen many sites that have the same content on different pages and still do well in the SERPs, and those sites have had that copied content for a long time. That is why I have concluded that if your site is new, the content must be fresh, but if your site is old and has many pages indexed, you may get away with some copied content. I would like to hear comments about this from the other experienced guys here. -jigs
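The on-page checks jigs describes (targeted keyword in the h1, alt text on images) can be sketched with Python's standard html.parser module. This is just an illustration; the function name and the sample page are mine, not any SEO tool's API:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects h1 text and counts <img> tags missing an alt attribute."""
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.h1_text = []
        self.imgs_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True
        elif tag == "img" and "alt" not in dict(attrs):
            self.imgs_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.h1_text.append(data)

def check_page(html, keyword):
    """Hypothetical helper: does the h1 contain the keyword, and do images have alt text?"""
    checker = OnPageChecker()
    checker.feed(html)
    h1 = " ".join(checker.h1_text).strip()
    return {
        "keyword_in_h1": keyword.lower() in h1.lower(),
        "images_missing_alt": checker.imgs_missing_alt,
    }

page = '<h1>Cheap Widgets Online</h1><img src="w.jpg"><img src="x.jpg" alt="widget">'
print(check_page(page, "cheap widgets"))
```

Running this on the sample page reports the keyword present in the h1 and one image without alt text.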
I bet that any sites with the same content will be hurt in the next update. Google is way behind in terms of link indexing and new sites. They must be having a hard time with the millions or billions of pages they copied from books, so it seems like a lot of sites with the same content are still around. Some search results even include sites that have been down or gone for over six months, and that should not happen.
It is what we call duplicate content: a different title but the same content, and you will be penalized for using it because it is spamming. You may read this article about duplicate content: http://www.webconfs.com/duplicate-content-filter-article-1.php
Read the question before you answer... his question had nothing to do with duplicate content. seojig got it right: content is the text that appears on a webpage after you strip all the HTML and formatting out.
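That definition of content, the text left after the HTML is stripped away, can be sketched with Python's standard html.parser; the helper name here is mine, not a standard API:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulates only the text nodes, skipping script/style contents."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.parts.append(data)

def strip_html(html):
    """Hypothetical helper: return the visible text with whitespace normalized."""
    extractor = TextExtractor()
    extractor.feed(html)
    return " ".join(" ".join(extractor.parts).split())

print(strip_html("<h1>Hello</h1><p>world <b>again</b></p><script>var x=1;</script>"))
# prints: Hello world again
```

Whatever survives this stripping is roughly what the thread means by "content".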
If you take stuff off Wikipedia and use it on your site, you might see that page get dumped from Google. That's what recently happened to me.
That means duplicate content, and all search engines hate it. You can be banned for copying content. Forget it and write your own article.
If it were related to the HTML, all websites using the same script (phpLD, Joomla, etc.) would be banned to hell. Though bots aren't very bright at detecting duplicate text either, unless you are too lazy to modify the text a bit.
Any SE would like to pull ONLY the best and most relevant websites in response to a query. So they will do anything with their search algorithms to get sites using duplicate content out of the SERPs. Some are good at this, and some are still fiddling with their algorithms to perfect it.
Yeah, it's just a matter of playing with the text a bit so that your modified content counts as unique content according to the SEs.