Mike, I think maybe I didn't explain myself too well. Google has masses of data available via cookies and the toolbar. Google sets a cookie when you go to their site. So let's run through it:

1. You hit the site and carry out a search.
2. You click position 1 (score for that search = 1).
3. You go to the site, it is crap, you hit the back button and visit site #2 (score now = 3, i.e. 1+2).
4. That site is crap, you hit the back button, but site #3 has a crap title so you click #4 (score now = 7).
5. This site is what you want, so you stay there.

In the above scenario the overall score for that one search was 7. As you hit the back button, the cookie picked this up; it 'could' also remember how long you spent on each site if it wanted to, as well as other stuff. In my original example I kept it as simple as possible. So hopefully now you can see that by tracking this way, Google can track the usefulness of sites and the quality of the SERPs. I hear what you say, that 'the top site is not a quality site, people just click on it'; hopefully this example shows how Google would know this. If the top site is crap, people hit the back button post haste, which in turn pushes up the score. Of course, Google can also tell which areas they are best/worst in from their aggregated search data.
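To make the arithmetic concrete, here's a rough sketch of what that per-search scoring could look like. To be clear, the `Click` shape and `searchScore` function are my own illustration, not anything Google has confirmed:

```typescript
// Hypothetical sketch of the back-button scoring idea above.
// Assumption: every SERP position the searcher clicks adds its rank to
// a running score, so someone who settles on result #1 scores 1, while
// someone who pogo-sticks through several results scores much higher.

interface Click {
  position: number;      // SERP rank that was clicked (1-based)
  dwellSeconds: number;  // time on the site before bouncing (or staying)
}

function searchScore(clicks: Click[]): number {
  // Sum the rank of every result the searcher tried.
  return clicks.reduce((score, c) => score + c.position, 0);
}

// The example from the post: clicked #1, bounced; clicked #2, bounced;
// skipped #3 (bad title); clicked #4 and stayed.
const session: Click[] = [
  { position: 1, dwellSeconds: 5 },
  { position: 2, dwellSeconds: 8 },
  { position: 4, dwellSeconds: 600 },
];

console.log(searchScore(session)); // 7 (a perfect search would score 1)
```

The higher the score, the deeper the searcher had to dig, which is exactly why a crap site at position 1 would push the score up rather than down.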
If they're trying to do all that for every Toolbar user every time s/he visits another web page, no wonder the damn index is broken. Google has become like Homer Simpson: Every time it learns something new, it forgets something it used to know.
Minstrel, I am not saying they are, I am just suggesting a use for the stuff they have available. I totally agree, though, that if they are analysing to that degree, no wonder they have an 'Eric Schmidt machine crisis'. Maybe we should all send some Krispy Kreme to Google.
IF Google is using all their data like described by Old Welsh Guy, then they have a great way of knowing/guessing the quality of sites. If people stay for a while on a site, it might be a quality site, even if it's just because people can't find what they are searching for.

They also have access to this data if you are using their Analytics tool, and with that tool they also have access to all the stats for your site: where people came from, what they were searching for, not only in Google but in all the other SEs!! Now they have some algo that determines the quality of your site and you'll be ranked on that. It doesn't have anything to do with IBLs or OBLs or spam sites or anything like they are trying to make us believe (remember that Matt Cutts comes from a secret service (?) institution known for disinformation!). They just need new ways to qualify our sites.

If I look at my stats, I can see that sites where people stay longer get a higher ranking in the SERPs, and sites where people find what they want on the first page (which they used to, back when Google wasn't just showing the index page in their results) are ranked lower!!

Maybe their deindexing is just a way to find out the quality score? They say "OK, let's just show the index page and then we'll see how long people stay on the site." Maybe this is Google's way of getting rid of spam sites? If people don't stay there, then the site isn't worth visiting = low quality??!!??

Then again, maybe IBLs and OBLs play a role in this, because they can see that if people didn't stay long, they might go to a link instead? Do these links get a quality score from that? They might be able to see from that site's stats somehow where people came from, how long they were on the site they came from, and how long they stay on the linked site instead????? Does paranoia come to mind???? Big Daddy watching every step you take???
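Just to illustrate the kind of 'stickiness' scoring this post is speculating about, here's a minimal sketch. Every name and number in it (the `Visit` shape, `qualityScore`, the 30-second cutoff) is a made-up assumption, not a known Google signal:

```typescript
// Purely speculative sketch of a dwell-time quality signal.
// Assumption: a visit "sticks" if the visitor stays past some dwell
// threshold and doesn't bounce back to the results page.

interface Visit {
  dwellSeconds: number;    // time spent on the landing page
  returnedToSerp: boolean; // did the visitor bounce back to the results?
}

// Crude quality score: the fraction of visits that stick.
function qualityScore(visits: Visit[], minDwellSeconds = 30): number {
  if (visits.length === 0) return 0;
  const sticky = visits.filter(
    (v) => v.dwellSeconds >= minDwellSeconds && !v.returnedToSerp
  );
  return sticky.length / visits.length;
}

const visits: Visit[] = [
  { dwellSeconds: 4, returnedToSerp: true },    // pogo-stick back to SERP
  { dwellSeconds: 180, returnedToSerp: false }, // found what they wanted
  { dwellSeconds: 45, returnedToSerp: false },
];

console.log(qualityScore(visits)); // ≈ 0.67: two of three visits stuck
```

Note the weakness the post itself points out: a long dwell time could just as easily mean visitors are lost and can't find what they're looking for.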
I actually added it to a site for this reason. It gets tons of Yahoo! and bookmark traffic but is largely ignored at G because it is under a year old. I figured maybe G would notice the high # of visits, high # of pv/visit, and user loyalty if it could see the stats with Analytics.
If you throw an infinite JavaScript loop on your front page, they will probably stay awhile and not go right back to Google.
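For the record, the joke version is about one line, and it would backfire spectacularly: a frozen tab gets force-closed, not browsed.

```typescript
// Joke illustration only. Don't do this: the page freezes, the visitor
// force-quits the tab, and any dwell-time signal you "gamed" is worthless.
while (true) {
  // spin forever; technically the visitor never clicks back to Google
}
```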
Too often I give up on a site loading because it is waiting for Analytics. It reminds me of pages loading up when I had dial-up. I do find it ironic that Eric Schmidt will say that they have a problem, but Matt Cutts will not. He reminds me of the Iraqi Information Minister Muhammed Saeed al-Sahaf.
But that's true - Eric Schmidt admits it, the Sitemap Team admits it, Cutts just says, "What problem? It's fixed!" and scampers out of town for a vacation hoping things will cool down while he's gone. Did you see his latest blog entry? "What I'm Doing On My Summer Vacation"? Who the hell cares, Matt? Get your sorry ass back into the office and do something useful!
hahahah Ministry of Information Matt Cutts

>> Where did Eric say this?

In an interview with the WSJ, where he said Google has a "machine crisis" from overload of data.
Here is my recent exchange with the Google Ministry of Misinformation. Translation:

We feel your pain. OK. Actually we don't.
We understand your concern. OK. Actually we don't, and we don't care anyway.
But we are fully aware of the situation and what is going on. OK. Actually, we have no idea what's going on.
However, please be assured that Google is NOT broken. Repeat, Google is NOT broken.
Everything is fine. Go back to whatever you were doing. Nothing to see here, people. Please move along.