Over the past few months the deindexing of directories has been noted across the web, and it seems to have become even more ferocious recently. Examples include:

http://www.google.co.uk/search?q=si...C,GGLC:1970-01,GGLC:en&start=80&sa=N&filter=0
http://www.google.co.uk/search?q=si...C,GGLC:1970-01,GGLC:en&start=40&sa=N&filter=0
http://www.google.co.uk/search?q=si...,GGLC:1970-01,GGLC:en&start=860&sa=N&filter=0

Three months ago I would have said these were among the best directory sites. Is anyone getting close to understanding what the **** G is up to?
I've been watching these two datacenters:

64.233.179.104 = Results 1 - 10 of about 67,900,000 for mobile homes
216.239.57.104 = Results 1 - 10 of about 16,200,000 for mobile homes
Is it all directories that they appear to be cutting down on, or just the paid ones? And is it sites that are listed in a lot of paid directories that have dropped in the SERPs of late?
It's happening to all types of sites that have similar template layouts, my sites included. I'm making changes to the templates so they are picked up by Google again. It seems to be something to do with the duplicate-content sensors. -Dan
What sort of changes, RE? I mean, how sensitive are we talking? The reason I ask is that I have two pages that use a template system, and both are being either dropped or removed from Google, even though they are only 45% similar.
Well, the supplemental pages for Site-Sift are OK. The supplemental results are all the dynamic URLs we used back in early October. It happens to many websites when they switch from dynamic to static. I hope you understand this better now.
Check this out: http://www.google.com/search?hl=en&lr=&ie=ISO-8859-1&c2coff=1&q=site:dmoz.org It happens to all directories and has for a long time. Actually it happens to tons of other sites as well.
OK, so if you feel 30% is fine, what exactly in that 30% are we talking about being duplicate? Text content, meta information, images, etc.?
Does anyone know of a similar situation with regular sites that have large templates that repeat from page to page (the same menus and links in the headers, footers, and sidebars)?
My pages that are under 30% duplicate do not have many common elements other than the footer and a small order form.
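For what it's worth, the "percent similar" figures quoted in threads like this usually come from third-party checkers, and nobody outside Google knows its actual duplicate-content test. Here is a minimal sketch of one common way such a percentage can be estimated: word-shingle overlap scored with Jaccard similarity. The page text, shingle size, and function names are my own illustration, not anything Google has published.

```python
def shingles(text, size=4):
    """Return the set of overlapping word n-grams (shingles) in text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(page_a, page_b, size=4):
    """Jaccard similarity of two pages' shingle sets, as a percentage."""
    a, b = shingles(page_a, size), shingles(page_b, size)
    if not a and not b:
        return 0.0
    return 100.0 * len(a & b) / len(a | b)

# Two hypothetical template-driven pages sharing the same header/footer text:
page1 = "Acme Directory home about contact Widgets are great for everyone copyright 2005 Acme"
page2 = "Acme Directory home about contact Gadgets suit power users best copyright 2005 Acme"
print(round(similarity(page1, page2), 1))  # → 11.1
```

Note how sensitive the score is to shingle size: shared boilerplate (menus, footers) only produces matching shingles where it runs unbroken, which is one reason different checkers report very different "percent duplicate" numbers for the same pair of pages.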
Ok, this isn't adding up. Google can't be penalizing sites just because we use the same design and layout across the pages of our own sites. That's silly. If I can't use headers, footers, stylesheets, etc., then how in the heck am I supposed to build a site? What's next? We can't use SSI files because they replicate the same navigation on every page? That's the whole point of an SSI file! Nah, that's not what's causing pages to be deindexed or thrown into supplemental hell. It's something else. I don't know what, but it has to be something else.