Dude, I don't know what else to say. G has low trust in inlinks and outlinks because of the points mentioned. You're taking things out of context.
As a spokesperson for Google, if my objective was to stop site owners from using anything but natural links, my first strategy would be to create the rumor that using such link exchanges is the reason for the random site delistings. Hmmmm....
Fair enough, but I don't understand how Google can differentiate a whole bunch of crap links that your competitor points at your site from any other links. They are all inbound links, which the webmaster has no control over. Not that I have to worry...
Do you think every Joe Blow can throw a ton of links at their competitors? Not every webmaster has the kind of resources to afford that. In any case, if a competitor did throw a ton of crappy links at one of your sites, the way I see it, those links would just get devalued in some way. If you want to quote Matt Cutts, hasn't he said something along the lines that a competitor can't do anything to harm your site?
I believe the quote was "there is very little a competitor can do" to harm your site. Leaving open the possibility... yeah, it is obvious, isn't it? Looking at the Sitemaps stats, G tracks exactly what % of phrases are used to link in. If certain outside phrases start dominating that list and crowding out the natural phrases used to link in, I bet that would have some effect (bowling). Hmmm.....
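To make the "bowling" idea above concrete, here is a rough sketch, purely hypothetical and not Google's actual code, of how an inbound anchor-text distribution could be checked for a phrase that suddenly starts dominating the natural link profile. All anchor phrases and the threshold are made up for illustration.

```python
# Hypothetical sketch of an anchor-text distribution check: compare each
# phrase's share of inbound links against a historical baseline and flag
# phrases whose share jumps suddenly (the "bowling" scenario above).
from collections import Counter

def phrase_shares(anchors):
    """Return each anchor phrase's share of all inbound links."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {phrase: n / total for phrase, n in counts.items()}

def suspicious_phrases(baseline_anchors, current_anchors, jump_threshold=0.25):
    """Flag phrases whose share grew by more than jump_threshold."""
    before = phrase_shares(baseline_anchors)
    after = phrase_shares(current_anchors)
    return {
        phrase: (before.get(phrase, 0.0), share)
        for phrase, share in after.items()
        if share - before.get(phrase, 0.0) > jump_threshold
    }

# Example: a competitor floods the profile with one exact-match phrase.
baseline = ["acme widgets", "acme", "www.acme.com", "widgets"] * 25
current = baseline + ["cheap blue widgets"] * 60
print(suspicious_phrases(baseline, current))
# -> {'cheap blue widgets': (0.0, 0.375)}
```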
I lost interest in this thread - check out the latest about Shawn -> http://forums.digitalpoint.com/showthread.php?t=85661
We should take his word over Eric Schmidt's? Oh, yeah. The disclaimer... Until Google can crawl all sites and not get the content and domains mixed up, that entire post makes no logical sense. People posted in the MEGO comments about the problem, and people are posting the same comments on this one. (Fisting lessons with your new house, anyone?)

Google simply cannot communicate properly with every server on the web, and according to Matt Cutts, it is not Google's fault, it is our own (or our hosts', admins', whatever), and dropped pages are our fault. So how do we know Google isn't dumping sites because it mixed up the content of a site with links it penalizes and one it doesn't?

Related links:
http://forums.digitalpoint.com/showthread.php?t=83585
http://forums.digitalpoint.com/showthread.php?t=85423

On a positive note, I can confirm that the hyphenated domain problem is fixed, or is being fixed. I went from ~700 pages to ~12,000 pages today on a new site that was getting crawled without much showing in the index for a site: search.
I think the inlinks Matt was referring to were those coming from non-themed sites. I take this as Google's way of fighting link purchasing. Another take on this could be that if you were using, say, "real estate" as anchor text on a quilt design forum pointing to your real estate site, Google might find those links questionable and devalue your site's trust. Links are now about more than just anchor text. IMHO Google is trying very hard to implement TrustRank heavily in this algo, and I think that is another reason for the new algo. Google knows that there are a lot of webmasters out there right now who control huge link farms. Real estate is really bad right now, with a lot of SEO companies and webmasters controlling large numbers of farm links.
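For illustration only, here is a tiny hypothetical sketch, definitely not Google's actual algorithm, of the themed-link idea above: weight an inbound link by how much the linking page's topic overlaps with the target site's topic, so a real estate anchor on a quilting forum carries almost no weight. All terms and numbers are invented.

```python
# Hypothetical illustration of devaluing off-theme links: scale a link's
# weight by the topical overlap between the linking page and the target site.

def jaccard(terms_a, terms_b):
    """Simple topical overlap between two bags of keywords."""
    a, b = set(terms_a), set(terms_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

def link_weight(linking_page_terms, target_site_terms, base_weight=1.0, floor=0.1):
    """Scale a link's base weight by topical overlap, with a small floor."""
    overlap = jaccard(linking_page_terms, target_site_terms)
    return base_weight * max(overlap, floor)

quilt_forum = ["quilt", "sewing", "fabric", "pattern", "stitch"]
realtor_blog = ["real", "estate", "mortgage", "listing", "home"]
target_site = ["real", "estate", "home", "listing", "agent"]

print(link_weight(quilt_forum, target_site))   # off-theme link, heavily devalued (0.1)
print(link_weight(realtor_blog, target_site))  # on-theme link, near full weight (~0.67)
```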
I still believe something is broken over there. Matt's comments are just to buy time in the hope that they can fix it. If what he is saying were true, it would have come out two months ago, not now when everyone is screaming.
I got one hyphenated domain that suddenly went to having over 300,000 URLs indexed for site:domain.com, worth 60,465 in the co-op, though my other hyphenated domains haven't been fixed yet.
What's wrong with my site about 'making cashews out of avocado skins' linking to a site about 'using wheat germ to explore for oil', since we use the extra cashew shells to make explosives for seismic exploration? How could G ever compile a database of what counts as a valid link relationship? Methinks two grains of salt are in order.
I care less and less all the time about what Google is doing. Personally, I think they have a mess over there that they haven't been able to fix since Jagger, and the information they let leak out does nothing but scare you into doing things the way they "wish" you would do them.

Has anybody else noticed an increased amount of traffic from the other search engines? I rank well at Google, and fairly well at the other engines too. My rankings have been fairly consistent, yet it seems each month Google is bringing me less and less. I figure that will be even more significant once IE7 starts rolling out with new computers and becomes a regular update for everyone currently using IE 6.0.

I think people worry about Google way too much.
I don't agree completely with what Matt said. I have a website that doesn't have even a single outbound link and has only quality (and closely related) backlinks, but even then the site keeps dropping pages every now and then.
I agree with Roman. This isn't an explanation for what's happening at all. From: http://www.mattcutts.com/blog/indexing-timeline

The problem is that his explanation does NOT adequately account for most of the sites affected, from what I can see. Also, in that blog post, Cutts claims the problem was fixed after receiving those reports. Well, they might have fixed whatever was affecting those particular sites, but that still leaves a lot that isn't fixed at all and isn't even accounted for by his explanation.

I think maybe what we're talking about is something else. Okay, I can buy that it's not Big Daddy, if Cutts wants me to believe that. But the problems we're seeing started with Big Daddy or around that time, and they are most certainly NOT fixed yet, no matter what Cutts says. Something is definitely broken in Google. If it's not Big Daddy, what is it? I am really not appeased at all by Cutts telling me the problem is fixed when it obviously isn't.

As for the reciprocal links issue, Cutts is not saying reciprocal linking is bad, but that excessive reciprocal linking, buying and selling links, and linking to bad sites is a problem. I think, reading between the lines, that he's talking about non-relevant linking, as someone else here suggested (LVH?).
You should know better than anyone, Minnie, that quite often the problem is what is not being said. So when did the crawl caching proxy servers go online? Conspicuously absent from the timeline, with no mention whatsoever.