Watch http://www.threadwatch.org/node/8359, I think Adam Lasnik will show up there!!!! I've basically thrashed Google. I'm surprised they didn't censor it at all!! Er, which domain are you looking at? http://www.seo-contests.com/cgi-local/google.cgi?search=site:vgchat.com I don't think so!!! google.com is still smoking pot!!!!!
Well, you definitely got Matt's attention. I think the site: command is messed up for fringe sites, a lot like the preferred-domain feature of their webmaster console. I've noticed Matt's domain is switching to mattcutts.com from www.mattcutts.com even though there are hardly any links without the www out there. I'm sure the site: command is accurate for his blog, as are the canonicalization issues. However, for my sites, doing site: with or without the www shows the same number of pages, which is 5x the actual count on the site, so you know it's not accurate.

Basically what I'm getting from all of the discussion lately is that since BigCrappy or BugDaddy was rolled out, fringe sites (not PR8 like Matt's blog) are not going to be stable in the index. Which is great for anybody wanting to find something in Google that's listed in a wiki, About.com, eBay or Amazon. But for the other 99.999% of the world's information out there, Google may or may not index that page the day you are looking for it.

The site: command is not accurate because it can't be. At any given moment most sites' actual page count is changing with the knobs and the tides: pages switching between supplemental and not supplemental, cache dates all over the place, pages showing up for phrases not shown in the recent cache. It's ever-flux. Once again, the answer is to just go out there and spontaneously get 100,000 links to your site from authority sites in your genre that are not paid for or exchanged. That's real easy for everybody to do, so quit your whining.
And the site discussed is really suffering from the "bad data push" syndrome: http://oy-oy.eu/google/pages/?url=http%3a%2f%2fvgchat.com Even though the site: command shows 16,000 pages, you'll only be able to view 392 or so of them.
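If you want to check that gap yourself, here's a rough sketch (Python, purely hypothetical: the datacenter IP, the "of about N" phrasing, and the result markup are guesses on my part, since Google publishes no API for any of this). It pages through the site: results with filter=0, counts how many you can actually reach, and compares that to the reported estimate:

import re
import urllib.request

DATACENTER = "66.249.93.19"   # any datacenter IP, or www.google.com
QUERY = "site:vgchat.com"

def fetch(start):
    # Plain HTTP fetch of one results page; filter=0 turns off the duplicate filter.
    url = (f"http://{DATACENTER}/search?q={QUERY}"
           f"&num=100&start={start}&filter=0")
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    return urllib.request.urlopen(req).read().decode("latin-1", "ignore")

first = fetch(0)
m = re.search(r"of about ([\d,]+)", first)        # assumed count phrasing
estimate = int(m.group(1).replace(",", "")) if m else None

viewable = 0
for start in range(0, 1000, 100):                 # Google stops at roughly 1,000
    page = first if start == 0 else fetch(start)
    hits = len(re.findall(r"class=g[ >\"]", page))  # assumed result marker
    if hits == 0:
        break
    viewable += hits

print(f"reported estimate: {estimate}, actually viewable: {viewable}")

When the two numbers are wildly different, as with vgchat.com, that's the "bad data push" gap that tool is showing.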
LOL!! If you ever want Matt to reply, just say Google is screwed up!!! If any of you guys have a site without a links section, and get the same kind of site:domain.com results on google.com vs the datacenters, make a reply there asking him why your site has the same bug.
But then again, is there a benefit in any search engine commenting on their faults to the webmaster community? As publicly traded money-making machines, any negative rumour affects the bottom line and their share price. And there is a benefit in having the webmaster or AdSense publisher community on the back foot. I would like to know why they even need to keep us happy - I would like for there to be such a need - but I tend to find any quasi-official comment is just smoke and mirrors.
Psst, this just in: if it wasn't for webmasters, the search engines wouldn't even exist, and Larry Page and Sergey Brin would be working at McDonald's.
He's been reacting a lot lately, taking bait, pushing back, etc. I expect a vacation soon; maybe 6 weeks off wasn't enough. All these multiple tweaks, data pushes, refreshes, whatevers have caused collateral damage that has hurt more than innocent site owners in the pocketbook, it's affecting Matt's normally cool-headed approach. Perhaps there is finally a conscience behind the mantra of "do no evil" while getting filthy rich and ruining others' livelihoods. I don't envy the position he is in; every word he writes/says must be weighed against exposing the secret sauce. However, he does appear to genuinely want to help webmasters make the web/search better. He could write a single post on his blog that would help every e-commerce site, church fundraiser, family blog, or hobby enthusiast site get back into Google's index and maybe even push out a few billion pages of eBay and Amazon ads (err, I mean pages), but until webmasters are not rewarded by AdSense for clicks and paid by conversion, the system will be gamed.

Notice I didn't say spam. Other than the subdomain rewrite problem they can't seem to address, they have cleaned up spam. The blatant redirect-you-to-something-totally-different-than-you-searched-for stuff is mostly gone. But they've expanded the definition to now include sites/pages that are of questionable quality and content. Those sites aren't spam, they're just bad, which of course is a matter of opinion, which cannot be quantified with an algorithm.

I have several sites with AdSense on them. You know which ones do the best? The bad ones. Obviously, if you give the reader everything they were looking for, they are not going to click your little ads. Give them enough to get them there but leave them wanting more, and you'll get a higher CTR. Since they have now moved the bar and are defining spam as pages that aren't high quality in THEIR eyes, and really don't have a way to judge such pages, they've had to fall back and rely more on PageRank, or links to said site and page. What this has done is kill many fringe sites that will always be on the fringe; not every genre or niche is going to spontaneously generate 1000s of links. Heck, there may not be 1000s of sites out there about the subject. Meanwhile, authority sites now have free rein to get junk high in the SERPs. If I see one more About.com page I'm going to vomit.

When those guys were working in their garage/dorm room thinking up this concept, web pages were primordial and there wasn't an effective search engine. People "SURFED" the web more; they followed links from one site to the other, and every single site had a links page. Those links were not always on topic, they were just good sites the author had "found" surfing. Back then the links model worked; today it doesn't, as people don't find sites the same way. They find sites by search, which is based on the links, which are now being evaluated differently. It's a feedback loop; we have too many dependent variables in this equation.

I've been an engineer since 1991 and have managed hundreds of projects, none of which are as complex as the Google search engine. Sometimes as engineers, when problems creep up, we institute a fix, then that fix causes another problem, which you fix, and so on. Soon your project is just a bunch of fixes pasted together. At that point it's sometimes best to start over, rethink the premise of the original solution, and go a different direction. Can Google abandon links (not traded, sold, or unnatural, mind you) as a vote for a page, or eliminate pay-per-click?
NO, it's the cornerstone their business is built on. But someone else could, and will, and Matt Cutts would be the first person I hope they hire.
I guess the guys at Threadwatch are getting as gutless as WebmasterWorld; the thread got too hot and the almighty Google was getting ripped too much, so time to shut it down. Or perhaps Matt asked them to take it down, since people had already responded and he couldn't undo their comments. First he ripped Nintendo for writing about a private email to Adam, then he basically blamed Nintendo for having a links page on his site, and pointed out that Yahoo and Ask only had one page of the site. Then he went on to say that the site: command is just an estimate, like he's been saying all along (as Matt and as GG), but to watch the datacenters that begin with 72, as those results are going to be new at the end of the summer with some new infrastructure. Not much new information other than a bit of an acknowledgment that site: isn't accurate. So if they can't make site:, link:, or cache: accurate, what does work with Google lately? As I said over at the Google Groups:
Thank you for the summary, Johnweb. Not to mention the SERPs being screwy (June 27th, July 27th mess-ups, etc). And why doesn't Google understand that it doesn't matter if Nintendo or anyone else has a links page, or any other funky outgoing links? If his page has the information a searcher is looking for, Google should fork it over, not hide it.
He did mention http://72.14.207.104/ specifically... There has been progress; many of the d/c's now return the home page as the first result. And the info on that one is definitely spreading across d/c's today.
Yep, I like the 72.14.207.104 results!!! That's almost what I would like to see on google.com. I think it's dancing. Now that site shows 'about 37,100' on most of them, and results go to 1,000. http://66.249.93.19/search?q=site:vgchat.com&hl=en&lr=&start=990&sa=N&filter=0
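For anyone else watching the dance, here's a similarly hypothetical sketch (same caveats as before: the IP list and the "of about N" regex are my assumptions, not anything Google documents) that just asks a handful of d/c's for their reported count on the same site: query:

import re
import urllib.request

DATACENTERS = ["72.14.207.104", "66.249.93.19", "www.google.com"]
QUERY = "site:vgchat.com"

def reported_count(host):
    # Fetch the first results page from one host and pull out the "about N" estimate.
    url = f"http://{host}/search?q={QUERY}&hl=en&filter=0"
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("latin-1", "ignore")
    m = re.search(r"of about ([\d,]+)", html)     # assumed count phrasing
    return int(m.group(1).replace(",", "")) if m else None

for host in DATACENTERS:
    print(host, reported_count(host))

Run it a couple of times a day and you can see which d/c's have flipped to the new numbers and which are still serving the old ones.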
Damn... the site: results seem to have reverted on many d/c's overnight. They were looking pretty good; I don't understand. Let's hope they have a fix and this is just a data refresh on the way to a more accurate result, and not some junior engineer making changes detrimental to the site: command.