BD happened months ago. Apparently something happened on the 27th that didn't affect any of my sites, and I missed where it was talked about. It's not Google spin, it's me saying I'm lost - could someone please clue me in? If you're saying that what happened on the 27th wouldn't have happened if they hadn't done the whole BD switch, well then, fine, ok - what wouldn't have happened..? -Michael
I hate to naysay, but it seems Google's honeymoon with us is over. All the irritating little traits are starting to come out. In previous updates, some of our sites would invariably have issues crop up, but we'd regain our positions once we'd learnt to remove the bad things. There was always bitching, but the algo changes would always remove some of the spam and the bad issues that plagued us - to an extent, they'd give us some of what we'd asked for. Big Daddy came out of nowhere and forced itself upon us; it's hard to find people praising it, and I've never seen such a large amount of shuffling occur on the sites I observe. A lot of extremely useful resource sites that I'd visit have dropped out of the index, only to be replaced with copious amounts of spam. Big Daddy issues are inversely proportional to Google love.
Exactly what I'm seeing in some instances. I hate the idea of having to add "content" to a "product page" just to get it into the RI. It serves no purpose other than to clutter the page - the last thing a potential customer wants. But to Google, you gotta have it to be deemed "worthy". Dave
Not to appear the Google fan-boy or anything, but as the leader of the industry I think they are taking chances in order to maintain their position and to offer a better product. I don't agree with the poster who said Google doesn't care; I think they do, and it is in their best interest to fix these problems as soon as possible. There is a huge issue with spam right now and Google has to face it now, because it ain't gonna get any better down the road.
There were some reports of toolbar PR updates pre-July 4th weekend. I got an email from a friend telling me one of my sites had a huge jump in PR, but when I went to look - nada. Maybe I can get Google to pay for my impending rehab? Bartender...
One thing I noticed on one of my sites: Googlebot stopped by for the last time on 6/26 and has been missing in action ever since. All pages are still in Google (not supplemental), but I do find it strange that the last visit was the day before the 27th ..... (or maybe I am just paranoid after reading this entire thread) lol Christoph
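P.S. In case anyone wants to run the same check against their own logs, a quick Python hack along these lines will do it - this is only a rough sketch that assumes a standard Apache-style combined log, and the path below is just a placeholder you'd point at your real access log. It prints the timestamp of the last request whose line mentions Googlebot:

import re

LOG_PATH = "access.log"  # placeholder - point this at your server's actual access log

def last_googlebot_visit(log_path=LOG_PATH):
    last = None
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            if "Googlebot" in line:
                # in a combined log the timestamp sits inside the first [...]
                match = re.search(r"\[([^\]]+)\]", line)
                if match:
                    last = match.group(1)
    return last

print("Last Googlebot visit:", last_googlebot_visit())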
Just when I think I have things managed back to normal (Google's normal), I wake up to Google site: my website and I'm missing pages again, and the search engine results seem to be wonky.
1. Indexed pages ranged from 300,000 pages down to 90,000 - but since the June 27th mess it's been leaking indexed pages, now down to 55,000. Before that date, I was finally feeling things were back to normal.
2. Search results ranged from #1 - #3 until June 27th. Now I'm #3 - #5 for keywords where I *am* the #1 site (and have been for many years) and maintain that ranking in Yahoo and MSN... but Google, not any more.
3. No response that makes sense from Google. Vague pointing to the quality guidelines to make folks think they are doing something wrong and must figure it out. If I've changed nothing and I meet the standards, what in the world am I to believe? It's downright frustrating.
That pretty much sums up the type of problems we're seeing everywhere. It's also the same frustration. None of us know what google wants. We all thought we were abiding by google's guidelines. And we were, for YEARS. Now the guidelines have changed. Google will take credit for the positive changes, but quickly lay blame at the feet of the web developers (us) for everything else. And then the real frustration sets in when we try to resolve our deindexing/supp-hell issues. We WANT to do the right thing. But google won't tell anyone what that is. They won't even admit anything is wrong. Google's just another typical corporation. They're as evil as the rest.
Google got their arses spammed off last month to the tune of millions and millions of pages under the subdomain thing. I have a funny feeling that some of this is a result of that. Don't ask me to back it up as I honestly can not, I just have a funny gut about this. BD is rolled out, it never really finishes (although G swears it has), there is constant verifiable noise across the decent forums, and Google keep making excuses. The latest being a 'bad data push' - WTF that is is anyone's guess. To me, something that pushes bad data is a crock of shit, so maybe that is what they meant. I would say that BD was a monumental failure, as the index is shite from datacentre to datacentre. I am seeing MASSIVE differences in the numbers of pages returned for phrases I am working on. I am talking about differences here of 78 million sometimes, ranging to 330 million on others. It is pathetic, and nothing will convince me that there isn't a problem there. It appears, to all intents and purposes, sometimes like truncation of data.
OWG, you just reminded me of something. Back in 2004 I had 240k pages indexed on my poetry site. Thing is, there were actually only 100,004 pages in existence. I'm wondering if there isn't a problem more at the core of things that keeps leaking through the patches. Maybe some logical flaw they missed near the beginning, a memory leak overlooked somewhere, a missing semicolon... -Michael
How many of you affected by this change in the algo are using 301 redirects? I think it could be some kind of duplicate content penalty. My two affected sites use 301 redirects: one from domain.com to www.domain.com and the other from www.domain.com/theme to theme.domain.com. Trying several searches, I find the old URL as a Supplemental Result but still with good rank in the SERPs.
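For anyone who wants to sanity-check their own redirects, here is a rough little Python sketch (example.com below is only a placeholder, not a real site): it requests the homepage from the bare domain without following redirects and reports whether it answers with a single 301 pointing at the www host, so Google only ever sees one canonical URL.

import http.client

def check_canonical_redirect(bare_host="example.com", www_host="www.example.com"):
    # Ask the bare domain for "/" - http.client never follows redirects itself.
    conn = http.client.HTTPConnection(bare_host, timeout=10)
    conn.request("GET", "/")
    resp = conn.getresponse()
    status = resp.status
    location = resp.getheader("Location") or ""
    conn.close()
    # A clean setup answers with a 301 straight to the www host.
    ok = status == 301 and www_host in location
    print(f"{bare_host} -> {status} {location} ({'looks fine' if ok else 'worth a look'})")
    return ok

check_canonical_redirect()

The same idea works for the www.domain.com/theme to theme.domain.com move - point it at the old path and check that the Location header lands on the subdomain in one hop.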
Did that include the 'add poem' pages that were replicated constantly? Also, it might have been canonical. I think that Google is focused on spam fighting, and it is killing as many good pages as the spam it removes.
The first thing that needs to be known, when it comes to the "number" Google places on its results (whether through the site: operator or a query), is where Google pulls that number from. I have long suspected, and it was recently confirmed, that Google stores duplicate data for the same URL in both the RI and the SI. Quite possibly triplicate, if you take into account URL-only data with nothing but anchor text pointers that were yet to be crawled at some point in time. If Google is not able to reconcile the duplicate data it has stored internally, it's certainly quite possible that this is the cause of many of the problems being observed. Dave
You need to change the 'm' to a 'b'. Based on what Nintendo revealed in his thread, there were billions of sub-domain spam pages.
Oftentimes Google will say "1-10 of about XXXXXX". If the number is below 1,000 then you can skim through the pages until you get to the actual amount. If it is over 1,000 then you're buggered, as G doesn't show more than 1,000 results for a search. I agree, though, that it might well be multiple indexes.
There was no 'add poem' page, but there was a vote link, which might have contributed. And yes, a canonical issue was definitely going on, which could theoretically have doubled it. I don't actually remember when I fixed the 301, but I don't think it was all that late in the game. Ya know, looking back, I probably should have just left the pages at 100k. It's after I changed it to 1 million (per language), which changed the linking structure, that I went supp. -Michael
I refuse to use your tiny little "billion" word. In the UK we treat million and billion differently to you Murcans. But yeah, you're right, it was millions and millions and millions etc.