The people at Google have great benefits, make great money, and feel they are making a difference on the internet; in my opinion that accounts for a lot when it comes to employee loyalty and happiness. You would also have to be a fool to give any one employee all the keys to the building, so to speak. In a multi-billion-dollar corporation like that, data security and network security are paramount, so you cannot transfer the data anywhere using your work computer: they will see exactly who sent what to where, and your peace of mind goes in the crapper. I cannot say for sure, but it is surely not possible to have the entire Google algo, or even a large part of it, stored in any one location. If you read their original papers and such, they favor that "distributed" approach, using hundreds and hundreds of machines for a single purpose. That could be something as simple as one "server farm" doing nothing but calculating PageRank. So in short, I think the person who said no one person has all the keys is probably correct. It is just not logistically possible for one person to acquire enough of the algo to do anyone any good.
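For what it's worth, the core PageRank calculation is public (it's in the original Brin and Page paper), so here's a minimal single-machine sketch of the power-iteration idea. The link graph, damping factor, and iteration count are made up for illustration; Google's distributed server-farm version is obviously far more elaborate than this toy.

```python
# Minimal PageRank power iteration (the idea from the original Brin/Page paper).
# The link graph below is invented for illustration; Google runs the same
# computation distributed across many machines, not like this toy version.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly over every page.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(toy_web))  # "c" ends up highest: it has the most inbound links
```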
most of these theories are either 1) shots in the dark or 2) the result of what google has publicly stated. shots in the dark, even when supported by anecdotal evidence, only go so far. it's not a guarantee, not by a long shot. what google has stated publicly should also not be taken as gospel: the google webmaster guidelines, for example, state that acquiring inbound links may be a legitimate way to increase your presence. they also claim there's virtually no way you can harm a competitor. yet we've also been told that acquiring links in an unnatural manner can trigger filters and delay, if not completely exclude, a site's chance of gaining trust.
Correct - Google has very little magic that has not already been exposed or that could not be replicated with 100,000 Linux boxes.
3) Educated guesswork from seeing how the Google algorithm has evolved over a number of years, studying sites that are ranking and predicting logical changes in the algorithm before they happen.
I think you guys give Google way too much credit - are their results REALLY any better than Yahoo's or Live's?
are their results better? I certainly don't think so. is it more difficult to rank well in google than in yahoo and msn? definitely.
That depends. If you have an old, trusted site with lots of existing IBLs and you start to optimise for a new keyword, Google gives much quicker results than Yahoo in my experience.
Google's algorithm has been known for a while now. One of their employees leaked it @ Google Technology
I am still not sure which algorithm you are talking about:
- How they store all the data?
- How they gather the data?
- How they retrieve the data?
- How they score a page?
- How they present the query results in a certain order?
Everyone keeps talking as if there is "one" algorithm, like Coke's secret formula.
I wish Google would spend a little more time using real people, so as to apply an "uncertainty principle" or some chaos to whatever algorithms they use to calculate site rankings.

There is a whole SEO industry built up around getting high rankings in Google, and much of its advice works. Therefore, much of the algorithm is known. What is not known is the counter-measures Google employs to demote sites that use SEO tricks to push unworthy sites higher in the rankings. I suspect they are a combination of algorithms, complaints from users, and observations by staff. Consequently, knowing those algorithms is not helpful over the long run, because of the uncertainty introduced by humans.

As some of you know, Google grew out of a university research paper by its two founders. In years past you could read it, and it contained some insight into their thinking. Unfortunately, it was a little thin on programmatic details.
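Just to make the "chaos" idea concrete, here's a purely hypothetical sketch of blending an algorithmic score with a human rating and a little random jitter. The function names, weights, and jitter size are all invented; this says nothing about what Google actually does.

```python
import random

# Purely hypothetical illustration of the "uncertainty principle" idea:
# blend a deterministic algorithmic score with an occasional human rating
# and a pinch of random jitter, so the exact ranking formula can't be
# reverse-engineered. None of this reflects what Google actually does.

def noisy_rank(algo_score, human_rating=None, jitter=0.02):
    score = algo_score
    if human_rating is not None:
        score = 0.8 * score + 0.2 * human_rating  # arbitrary human-signal weight
    return score * random.uniform(1.0 - jitter, 1.0 + jitter)

pages = {"site-a": 0.91, "site-b": 0.90, "site-c": 0.72}
ranked = sorted(pages, key=lambda p: noisy_rank(pages[p]), reverse=True)
print(ranked)  # near-ties like site-a/site-b may swap order from run to run
```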
I agree. I was referring to a new domain. Phynder: I'm referring to how they score/rank pages in the SERPs. clancey: if you want to get really technical, nothing's random, not even humans. I understand what you mean, though, but I don't agree that it's a good idea: it's subject to a lot of abuse, manipulation, etc. Though I guess one could argue this element already exists because of DMOZ.
If I had a leaked algo I wouldn't share it with anyone - sorry guys, but I'd keep that to myself and enjoy the fruits of great SERPs!
lol. Your site leaks the exact algorithm and no longer gets found on Google, so no one knows about it; the site removes it almost immediately; they track the IP address, contact the ISP, and either sue or fire whoever did it. There are ways to see past proxies as well.
Because even the big ol' Matt Cutts himself says he doesn't understand it. It is like Pepsi Cola: none of them knows how it all works, but each of them understands how parts of it work. At least that is the way I understand it.
First, I expect that only a few people have access to each and every code object, which means only a handful could disclose the whole formula. Second, I expect there are very strict security protocols in place. Mostly, though, Google engineers have good reason to be proud of their work and their company.
Of course the biggest reason that the algo has not been made public is because Google does not have a dog named Duke!
I don't believe that even if I had the complete list of factors and their exact influence on rank I could get rich, because there are some factors I cannot overcome anyway, and those have far more effect than all the others together: age, the number of backlinks, and, more importantly, the speed at which those links were added.
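To picture that point, here's a toy scoring sketch where the hard-to-influence factors carry most of the weight. Every factor name and weight is invented for illustration; the takeaway is just that knowing such a formula wouldn't let a new site outrank an old one.

```python
# Invented weights, for illustration only. If the hard-to-fake factors
# (age, backlink count, link velocity) dominate, knowing the formula
# doesn't help: only on-page SEO is fully under your control.

WEIGHTS = {
    "domain_age": 0.35,
    "backlink_count": 0.30,
    "natural_link_velocity": 0.25,
    "on_page_seo": 0.10,
}

def score(factors):
    return sum(WEIGHTS[name] * value for name, value in factors.items())

new_site = {"domain_age": 0.05, "backlink_count": 0.1,
            "natural_link_velocity": 0.2, "on_page_seo": 1.0}
old_site = {"domain_age": 0.9, "backlink_count": 0.8,
            "natural_link_velocity": 0.7, "on_page_seo": 0.3}

print(score(new_site), score(old_site))  # ~0.20 vs ~0.76: perfect on-page SEO still loses
```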