What really bothers me is the link cloaking. If you're going to put a link on your site, then put it there; don't pull some redirect-it-and-hide-it-from-Google stunt. Those links make the page useful to the reader and should be taken into account when the search engines determine page content. A number of you have said it: the search results returned by Google are lacking. I often perform a search and have to visit 20 different sites to find the relevant content I was looking for. There is much more evolution to come in search engine technology, and it will be interesting to see who delivers it. The next generation will have a much easier time finding the information they want.
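For readers wondering what the redirect-and-hide pattern being complained about typically looks like, here is a minimal sketch. The /go path, the partners table, and the use of Flask are all assumptions made purely for illustration, not anything taken from the posts above:

```python
# Rough sketch (not from the post) of the kind of redirect "cloaking" being
# complained about: outbound links point at an internal /go handler that is
# blocked in robots.txt, so crawlers never credit the real destination.
# The /go path, the 'partners' table, and Flask itself are all assumptions.
from flask import Flask, redirect, abort

app = Flask(__name__)

# Hypothetical mapping of link IDs to the real destinations.
partners = {
    "widgets": "http://example.com/blue-widgets",
    "gadgets": "http://example.org/gadgets",
}

@app.route("/robots.txt")
def robots():
    # Crawlers are told to skip /go/, so the outbound links never get counted.
    return "User-agent: *\nDisallow: /go/\n", 200, {"Content-Type": "text/plain"}

@app.route("/go/<link_id>")
def go(link_id):
    target = partners.get(link_id)
    if target is None:
        abort(404)
    return redirect(target, code=302)

# Pages on the site then link to /go/widgets instead of linking out directly.
```

The reader still gets taken where they expect, but the search engine never sees a plain link to the destination, which is exactly the complaint.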
I did a search, and nobody has mentioned google-watch.org on this forum! This is THE site for "the case against Google," and even if you don't agree with everything they say, there's a lot of fascinating stuff: a Google/Yahoo comparison tool, a Yahoo SiteMatch analysis tool, a report on the Google cookie, IPO bashing, and much more. Google-watch has certainly helped me fashion a worldview. Go check it out.
Boo! Everyone knows about Google-watch and Google Watch Watch and even some other "so-called anti-Google sites." The SE wars have been on for quite a while, and the industry is changing fast, so hold on tight.
Interesting posts, tphyahoo. IMO Google is not motivated against this practice or that, but has only one simple and clear objective: to provide relevant results. As long as a page produces relevant results, why should Google care why or how it does so? If a page uses cloaking, for example, and comes up #1 in a Google search for blue widgets, and is in fact a very good resource for blue widgets, that's what the searching public wants and what Google wants too. We sometimes get so wrapped up in the details of the business of getting good rankings that we may forget that the searching public neither knows nor cares why a page is ranking, only whether it can reasonably be considered one of the best pages for their purpose. Search engines depend on that customer satisfaction for survival.
Mel, I respectfully disagree. Google can't provide good search results if the content it thinks is in its index isn't the content the public sees. They're at the mercy of the cloakers. Cloakers sell clicks to the highest bidder: clicks from Google's left-side "organics." They just redirect to whoever is willing to pay the most. Kludgy example, but it gets the point across: a cloaker gets a page ranking high for Coca-Cola and sells the clicks to Pepsi. The cloaker decides what the user sees, not Google. So my interpretation of the fact that Google isn't going after cloaking in a more hardcore way is simply that they don't care enough. Yet. Maybe because they're getting their ducks in a row for the IPO, maybe there are more important fires. Similar to how they don't care about link spam. Yet. Over and out.
I meant to ask anthony, if he was so smart, what those other sites covering this search engine war stuff were. But then I checked his profile and noticed his home page would seem to be the answer. http://www.searchwars.squarespace.com/ Good link collection, anthonycea. I noticed one to an article by Seth Finkelstein of Fairness and Accuracy in Reporting, who was the journalism mentor of a school friend of mine back in the day. Nice when old names pop up like that. http://www.searchwars.squarespace.com/display/ShowPage?moduleId=23698 http://www.sethf.com/anticensorware/general/google-censorship.php t.
I guess I'd better check your profile out now. Man, are you trying to get me in trouble or ruin my reputation around here?
You totally missed the point, tphyahoo, which is the requirement that the page be relevant. I.e., if it ranks for Coca-Cola it must be about Coca-Cola, NOT Pepsi.
Well, if Google doesn't care whether the website is relevant, then maybe you should talk to the guy who owns a web design and SEO company in Las Vegas. This person had five of the top ten websites listed in Google's top 10 for real estate, and all of those websites had very good content, but he was cloaking, and guess what... Google banned him and those websites.
Please read it again, Atlanta. I specifically said that Google is primarily interested in relevancy, not the other way round. If a particular company has five different websites listed on the first page, then those are not relevant results, and that may be the cause of the ban. How do you know he was banned for cloaking? I doubt anyone but the website owner or Google would really know that.
And whenever Google does that, it apparently punishes a lot of ordinary websites as well, ones that have made no great effort at SEO.
Hi Mel, maybe I did miss the point, but I don't think so; anyway, let me try again. My point is that by using cloaking, link spam, and other "Google no-nos" that will supposedly get you kicked out of the index if they catch you, you CAN get a page ranked high for Coca-Cola even if the words "coca cola" appear nowhere on the page and the whole thing is a Pepsi advertisement. You just need a lot of links that say "coca cola" pointing to it.

I mean this theoretically, of course; in the real world you probably couldn't do this to Coke, because they would sic thugs and lawyers on you. But I have seen a page selling *new* laptop computers, with the words "used laptops" nowhere on the page, ranking in Google's top 3 for "used laptops" (in German). They had a lot of hidden links with "used laptops" as the anchor text coming in from hundreds of template pages on a semi-legitimate news site with decent PR. Sure, Google will eventually catch this, we think, we hope, but it can take quite a while. So that is what I meant, and why I disagree with your statement that if it ranks high for Coke it must be about Coke.

Enough on link spam. With IP cloaking, the HTML Google sees is not the HTML the rest of the world sees, because the webserver has been programmed to serve up something special for clients identifying themselves as "googlebot," whereas clients identifying as Netscape or IE6 or whatever get something else. So Google might see something relevant for Coke, while everyone else gets something about Pepsi.

Do you still think I missed the point? Thomas.

PS Coca-Cola has one of the strongest brands in the world and they will hire lawyers, and who knows, maybe thugs as well, to motivate you not to mess with their brand: http://www.guerrillanews.com/cocakarma/ Coke/Pepsi was just an extreme example to get the point across.
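To make the user-agent cloaking Thomas describes concrete, here is a bare-bones sketch. The page contents are invented, and real cloakers typically also match crawler IP ranges rather than trusting the User-Agent header alone; this is only an illustration of the mechanism:

```python
# Minimal sketch of user-agent cloaking: the server checks who is asking and
# hands Googlebot one page and everyone else another. Pages and port are
# made-up examples, not anything from the thread.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE_FOR_GOOGLEBOT = b"<html><body>Authoritative Coca-Cola history page...</body></html>"
PAGE_FOR_HUMANS = b"<html><body>Buy Pepsi here! <a href='/order'>Order now</a></body></html>"

class CloakingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "").lower()
        # Crude check: crawlers identify themselves in the User-Agent header.
        body = PAGE_FOR_GOOGLEBOT if "googlebot" in ua else PAGE_FOR_HUMANS
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CloakingHandler).serve_forever()
```

The index ends up describing a page that no human visitor ever sees, which is the disconnect Thomas is pointing at.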
In discussions of the relevance of Google (and other SE) SERPs, there seems to be an unwarranted silent assumption: that if the pages placing high in the SERPs are all fairly relevant, the search process is successful. As General Wossname supposedly said, "Nuts!" (I doubt that was the exact word, though "nuts" is a common synonym for it.)

There are two equally important parts to a successful search process: pages that rank high must all, or almost all (no one expects perfection), be pretty relevant; and pages that do not rank high must all (same caveat) be pretty much not relevant. By and large, Google fails the second half, usually miserably.

If you find yourself with some hours on your hands one rainy afternoon, spend those hours doing Google searches on several diverse topics, each of which you know well, at least well enough to accurately recognize germane pages. Go through the searches to a depth of at least 300, preferably 500, page by page. Yes, you will find that the listings on the first two or three pages are fairly relevant, with more or less decreasing relevance as you go on. But you will also almost surely find that out around 150 or 200 you start to pick up a fair scattering of pages that are at least as relevant (often arguably more so), and in particular as informative on the topic, as those on the first two or three pages. That nontrivial scattering will often persist through 300 or 400, sometimes deeper.

What's happening here? Pages and sites set up by SEO-cognizant sources (sometimes corporate, sometimes individual) will be sent off at birth with a generous dose of PR via crosslinks from other sites controlled by the same owner, and their operators will buy more and know how to wrangle yet more in exchanges. The sites with decent PR are usually also the sites with decent SERPs, and this much is indisputable: other sites will link readily to high-SERP, high-PR sites, giving them yet higher SERPs and PR, making them all the more attractive, and so on up the spiral.

Now, at the other extreme, we might have an assistant professor at a small college who happens to have made Topic X his life's work as a sideline from his academic efforts. He puts up a modest site on Topic X that contains more real info than the top 10 combined, but he isn't SEO-savvy, isn't funded, and quite possibly isn't even deeply motivated to get any sort of SERPs. His family and friends will find the site, but few will link to it, and almost none of those will be big sites themselves. His #387 placement will only drop as time goes by, owing to other sites entering the Topic-X arena and climbing right over him.

Yes, that's one made-up anecdote; but go out and put in the hard time on the web and you'll find that it is not a terribly artificial one. A great many deeply expert individuals put up rinky-dink little sites (rinky-dink as to form and presentation, not content) and get totally lost in the SERPs despite their high value. A real search engine, one that worked wholly off on-page content, would find such sites and place them as high as their content actually justifies. Till we can say that few or no such sites are being missed, and falling deep into the SERPs for no good content-related reason, those SERPs are so much flatus.

And please, don't say that content-based SERPs are impossible to do reliably, or too subject to spamming. Whatever are we always talking about except the present gross spamming by black hats and the subtle, clever, ah, optimizing by white hats?
I have said before and will say many times again that a SE company that cannot reliably filter out spam in on-page content is just a bunch of kids playing at being a SE.
Any kind of SERP determination will fail if not done well. The early SEs were, well, *early*. They came, and went, before much knowledge was really available. I repeat that, at the current state of the art, a SE company that cannot reliably filter out on-page spam is a farce. I have mentioned before that successful filtering, whether of terrorist movements into a nation or of spammed pages into SERPs, is not a complex one-shot thing: it is a matter of successively applying several independent filtering systems that need only, on an individual basis, be reasonably successful. A modest number of such reasonable (not super-effective) filters, so long as their methodologies are truly independent, will stop all but a microscopic percentage of intruders. But, as with the internal-combustion engine, everyone wants to do what the other guy is doing. Except that with search engines, as opposed to automotive engines, results will weigh much more with the using public than "what's under the hood" type discussions.
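To put rough numbers on the "several independent filters" point, here is a back-of-envelope sketch; the catch rates are invented, and the whole argument leans on the filters really being statistically independent, which is the hard part in practice:

```python
# Back-of-envelope for stacking several individually "reasonable" filters.
# Catch rates below are made up for illustration.
catch_rates = [0.90, 0.85, 0.80, 0.75]  # each filter alone is only decent

slip_through = 1.0
for rate in catch_rates:
    slip_through *= (1.0 - rate)  # spam survives only if it beats every filter

print(f"Fraction of spam pages surviving all filters: {slip_through:.4%}")
# -> Fraction of spam pages surviving all filters: 0.0750%
```

Four filters that individually miss 10 to 25 percent of spam would, under that independence assumption, jointly let through less than a tenth of a percent.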
Returning to the original subject in this thread: If this is true... How come they allow companies selling links to advertise such services in their AdWords program? Just make a search for phrases like "buying links" or "selling links" or "buy text links" and you'll see that they are happily taking money from companies that clearly "abuse" their quality guidelines.
This post is mainly in response to what owlcroft wrote. Owlcroft, what you say is very interesting to me, and it's on a theme I ponder a lot. I disagree with your main claim, but I wish you could bring me over to your side; it would make the world a better place, or anyway it would make it easier for people to find the information they are searching for and reduce the everyday stress of digging the good stuff out of reams of SE spam.

Owlcroft, your claim as I understand it could be paraphrased as follows, maybe with a little embellishment, but following your thrust: "On-page spam techniques could be filtered by the SEs if they would just do their jobs. Reliance on link weight is just a copout. Google claims favoring links rather than on-page content is a good idea because, though there is link spam, it is easier to fight link spam than on-page spam... but this is bullshit. If SEs would just hire better programmers and apply better filters, they could shift to SERPs weighted more heavily by on-page content. They're just lazy. When a search engine comes out that gets the on-page ranking thing right, it'll have a chance to put Google out of business."

Okay, owlcroft, Digital Pointers... http://www.quoteserver.ca/cgi-bin/dbase

That is an example of an automatically generated gobbledygook page, which I believe could pretty easily fool search engines into thinking it is natural language. In this case the purpose isn't on-page optimization of targeted words (it's to trap spam spiders, in fact; not evil at all), but you can easily see how this could be adapted to spam search engines. A couple of months ago, if memory serves, Google had indexed twelve thousand or so of the gobbledygook pages. They have finally been removed from the index now, but consider:

-- These pages weren't even TRYING to use SEO to get into the Google index.
-- The whole thing is like a hundred lines of Perl. It's easy. How hard would it be to vary the template, table structure, et cetera, to give the thing even more of a "realistic, not template" feel to search spiders? If that was a hundred lines of Perl, think what you could accomplish with a thousand lines (there's a rough sketch of this kind of generator after this post).

I haven't dug too much on this, but I suspect there are plenty of more sophisticated, SEO-targeted, bullshit autogenerated spam pages lurking out there; the only reason there aren't more is that Google is favoring links over on-page content, so the spammers shifted their focus to blog-spam sorts of stuff. But if Google took owlcroft's advice and tried to get the on-page thing right, you would see an explosion of such pages, and it would overwhelm Google's attempts to fight them, even with the "multiple filter" type approach that owlcroft advocates.

The key economic problem is that it is MUCH easier to write a script that generates bullshit HTML than to build a filter, or set of filters, that somehow fights the on-page spam. And there are thousands of SEO spammers out there figuring out how to do it, and they are well funded. Google is well funded too, but they have a budget.

I think you could compare this with the "Star Wars" US nuclear missile defense system. It was a cool-sounding idea, and it made the weapons contractors gazillions of dollars for various proof-of-concept projects alone, but it never got off the ground. Why? It was far cheaper for the Russians to mix dummy missiles in with the real nuclear missiles than for the Americans to upgrade the missile defense system to deal with the dummies.
Say, 5 million additional dollars to add the dummies versus 500 million dollars to upgrade the computer systems and add more space-based orbiting lasers to shoot the dummies down. It just couldn't work. The only way to reliably rank based on on-page content is human beings, and that is DMOZ, and why we had better hope that human-maintained directories continue into the future despite all the hassles, favoritism, and general unfairness associated with these kinds of programs. Computer programs are just too easy to defeat. Well, this is what I think about sometimes when I have nothing better to think about... which is a lot. Thomas.
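Here is the rough sketch promised above of the kind of gobbledygook generator being described, written in Python rather than the hundred lines of Perl Thomas mentions. The seed text and page template are made up, and the real quoteserver script may work quite differently; the point is only how little code it takes to produce pages that look word-shaped to a crawler:

```python
# Toy gobbledygook-page generator: chain words from some seed text so the
# output vaguely resembles natural language, then wrap it in boilerplate HTML.
# Seed text, title, and markup are invented for illustration.
import random
from collections import defaultdict

SEED_TEXT = """search engines rank pages by counting links and matching
keywords so a page stuffed with plausible sentences about used laptops
can look like real content to a crawler that never reads for meaning"""

def build_chain(text):
    # Map each word to the words that can follow it (first-order Markov chain).
    chain = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def babble(chain, length=60):
    # Walk the chain, restarting at a random word whenever we hit a dead end.
    word = random.choice(list(chain))
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        word = random.choice(followers) if followers else random.choice(list(chain))
        out.append(word)
    return " ".join(out)

def make_page(chain, title="Used Laptops"):
    return f"<html><head><title>{title}</title></head><body><p>{babble(chain)}</p></body></html>"

if __name__ == "__main__":
    chain = build_chain(SEED_TEXT)
    print(make_page(chain))
```

Feed it a bigger seed corpus and vary the templates, and a crawler that never reads for meaning has little to go on, which is the asymmetry the post is arguing about.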