Hello all, I have a site where users can leave classified ads. Some people may come and place the exact same ad for, say, 5 days in a row, which creates 5 web pages with the same content. Until now I was afraid of getting penalized for duplicate content, so I excluded the individual classified ad pages from the index using noindex and nofollow, and only let Google index my category pages. But I feel this is a burden on my SERPs, so I am thinking of including all the individual ad pages in the index under the following conditions:

HTML title: title of the ad + date
Body: text of the ad + date
Tag cloud: changes all the time

This way, even if the same ad is posted for many days, the date will be unique, so there will be something unique in the HTML title and the body. My tag cloud also changes all the time, so that adds something unique too. Under these conditions, do you think Google will ban me for duplicate content or not? What do you think?
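To make the idea concrete, here is a minimal sketch of what I mean, assuming a Python-style setup (the function name render_ad_page and the layout are just placeholders, not my real code): the posting date gets appended to both the title and the body, versus the noindex/nofollow meta tag I currently put on ad pages.

```python
from datetime import date

# Meta tag I currently put on every individual ad page to keep it
# out of the index (category pages do not get this tag):
NOINDEX_TAG = '<meta name="robots" content="noindex, nofollow">'

def render_ad_page(ad_title: str, ad_text: str, posted: date, indexable: bool) -> str:
    """Build the HTML for one classified ad page.

    If indexable, the posting date is appended to both the <title>
    and the body, so the same ad posted on different days still has
    something unique on each page.
    """
    date_str = posted.isoformat()
    robots = "" if indexable else NOINDEX_TAG
    return f"""<html>
<head>
  <title>{ad_title} - {date_str}</title>
  {robots}
</head>
<body>
  <p>{ad_text}</p>
  <p>Posted on {date_str}</p>
  <!-- tag cloud rendered here; it changes all the time -->
</body>
</html>"""

# The same ad posted on two different days produces two pages whose
# titles and bodies differ only by the date:
print(render_ad_page("Bike for sale", "Good condition, red.", date(2009, 3, 1), indexable=True))
print(render_ad_page("Bike for sale", "Good condition, red.", date(2009, 3, 2), indexable=True))
```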
Well, I understand you, but it's not a bad thing to have users come back every day and post something.
What percentage of the page source code will be the same? In my experience the source code has to be >98% identical before it counts as a duplicate. Why don't you just generate one page that stays live for 5 days?
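To illustrate that idea, here is a rough sketch, assuming a Python backend (the in-memory dict and the name submit_ad are hypothetical stand-ins for a real database): key each ad by a hash of its text, and when the same ad is submitted again, extend the expiry of the existing page instead of creating a new URL.

```python
import hashlib
from datetime import date, timedelta

# In-memory stand-in for a database table of live ad pages,
# keyed by a hash of the ad text.
live_ads = {}

def submit_ad(title: str, text: str, today: date, days_live: int = 5) -> str:
    """Create an ad page, or extend the existing one if this exact
    ad text was already posted. Returns the page key: one URL per
    unique ad, no matter how many times it is re-posted."""
    key = hashlib.sha1(text.strip().lower().encode("utf-8")).hexdigest()
    expires = today + timedelta(days=days_live)
    if key in live_ads:
        # Same content posted again: keep the one page live longer
        # instead of creating a duplicate page.
        live_ads[key]["expires"] = expires
    else:
        live_ads[key] = {"title": title, "text": text, "expires": expires}
    return key

# Posting the identical ad on two days yields one page, not two:
k1 = submit_ad("Bike for sale", "Good condition, red.", date(2009, 3, 1))
k2 = submit_ad("Bike for sale", "Good condition, red.", date(2009, 3, 2))
assert k1 == k2
print(live_ads[k1]["expires"])  # 2009-03-07
```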
I think Kemus has a point. It's good to have people post something every day, but it would be better if they posted something different. That way you've set a rule for them, and they will like it that way. It also discourages spammers, since they know it's not easy to put in different content every day. If you follow Kemus's suggestion you will probably lose about 10% of your estimated traffic, but what matters is that at least 20-30% will be return visitors who come back to your site to post genuine content. Moreover, think about the end user: if they see the same content every day, they won't like your site and probably won't return, because they'll assume it's going to be the same thing there. Hope it helps.
Hey hey hey, wait a sec there. Is this true? For Google to punish you for duplicate content, more than 98% of the code has to be the same? Is that really accurate?
Google doesn't compare code for duplicate content against a specific threshold like 98% or any other value. It cares about the content itself, like text and images, and especially copyrighted content, so there is no specific percentage Google uses to judge duplicate content.
I think they will not be considered duplicate content, but Google won't send visitors to those pages.
Also, will it cause my other valuable pages to get lost in a sea of relatively less important pages? For example, when you have 100 valuable pages and nothing else, you have, well, 100 valuable pages. But when you add 1,000 not-so-valuable pages, would the original 100 valuable pages lose value among those 1,000 not-so-valuable pages? Is this true?