Hello to all. After banging my head against this situation, I turned to Digital Point, where I was told I'd get the best advice from the pros. Here's my problem: I have an auction site with members submitting listings that are very similar in the title, e.g. (Acura Integra 90-93 Erebuni Body Kits) vs. (Acura Integra 94-97 Erebuni Body Kits). We do have lots of duplicate listings, but we also have lots of similar ones. We've been penalized by Google for this and are now looking for a fix. Any help would be greatly appreciated. My site is www.autogearbid.com
Have you tried making the meta description unique for every item? You can use part of the item's description for this. If you search Google for site:www.autogearbid.com, many of the results have the same description, which may trip the duplicate content filter. I'm not entirely sure you necessarily have a problem, though: if you search for "ZENN Lowering Springs 95-99 Eclipse 2wd", autogearbid.com comes out on top of the results, even though it's marked as "Supplemental". Cryo.
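For example, something along these lines could build the description from the listing itself. This is only a rough sketch: the listing fields and the 155-character cutoff are assumptions on my part, since I don't know how your site stores its data.

```python
# Rough sketch: build a unique meta description for each listing page.
# "listing" is a hypothetical dict; use whatever fields your site actually stores.
import html

def meta_description(listing, max_len=155):
    # Start from the listing's own text so every page ends up different.
    text = listing["title"] + " - " + listing["description"]
    if len(text) > max_len:
        # Cut at a word boundary so the snippet doesn't end mid-word.
        text = text[:max_len].rsplit(" ", 1)[0] + "..."
    return '<meta name="description" content="' + html.escape(text, quote=True) + '">'

# Hypothetical listing, just to show the output.
print(meta_description({
    "title": "Acura Integra 90-93 Erebuni Body Kits",
    "description": "Whatever the member wrote in the listing body goes here...",
}))
```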
That's a little difficult. Good that you're getting user-submitted content, but bad that it's duplicate. My suggestion: I'd have various types of content rotate throughout the site on page load. When a crawler hits a page, a "news" section, other listings, or recent forum topics could be displayed on each individual listing, adding different types of content per page.
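Something like this could handle the rotation. It's just a sketch; the related_listings and recent_forum_topics helpers are placeholders for however you would actually pull that data from your database.

```python
# Sketch of rotating side content on each page load.
# Both helpers are placeholders for your own database queries.
import random

def related_listings(listing_id):
    # Placeholder: would pull a few other listings in the same category.
    return ["Acura Integra 94-97 Erebuni Body Kits", "ZENN Lowering Springs 95-99 Eclipse 2wd"]

def recent_forum_topics():
    # Placeholder: would pull the latest forum threads.
    return ["Body kit fitment questions", "Shipping to Canada?"]

def side_content(listing_id):
    # Pick a different block type at random so each crawl sees something new.
    heading, items = random.choice([
        ("Related listings", related_listings(listing_id)),
        ("Recent forum topics", recent_forum_topics()),
    ])
    return heading + ": " + ", ".join(items)

print(side_content(42))
```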
Is this problem hurting me? I just checked out eBay with site:www.ebay.com, then clicked on "repeat the search with the omitted results included", and noticed that nearly all of eBay's pages are also penalized.
Need some more input, guys. I got some advice on a solution that I'd love your input on: I was told to set my listing pages to allow index/follow but no archive. What do you think?
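If I understood the advice right, that means a robots meta tag like the one sketched below on each listing page. Just to illustrate: "index, follow" lets bots index the page and follow its links, and "noarchive" only asks them not to show a cached copy.

```python
# Sketch of the robots meta tag I was told about, one per listing page.
# "index, follow" lets bots index the page and follow its links;
# "noarchive" only asks them not to show a cached copy.
def robots_meta(index=True, follow=True, archive=False):
    directives = [
        "index" if index else "noindex",
        "follow" if follow else "nofollow",
    ]
    if not archive:
        directives.append("noarchive")
    return '<meta name="robots" content="' + ", ".join(directives) + '">'

print(robots_meta())  # <meta name="robots" content="index, follow, noarchive">
```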
What does it mean to ban them from archiving your content? I thought they always archive (cache) it if:
- they want to (have a reason to, e.g. your page has changed since the last cache update);
- you allowed them to visit your page (index/follow).

I'm not absolutely familiar with this, but generally it works like this:
- you allow a bot to visit your site (or rather, you don't ban it from visiting, which should amount to the same thing);
- the bot comes and, based on its own criteria, visits the pages;
- it takes a snippet of your page or all of it, and checks your links to other locations (to be followed later);
- if the new snippet is different from the old one, it can replace the old one when the cached results are updated.

The cached results are used to judge the relevancy of your content to the searched phrase, so why would you ban the bot from archiving/caching your content? It makes no sense to me, but I repeat, I'm not very deep into this; if you have arguments, I will change my mind.