Really? Strange - no one in my family, among my friends, or among my colleagues knows what PageRank is at all. Btw, by default (as pre-installed on many computers from Dell, HP and Acer) the PR display is disabled in the Google Toolbar, so I think 99% of users have never seen that little green bar.
People (maybe non-techie ones) who search with Google and then buy products online do use the Google Toolbar, I think.
Thanks for your authoritative post, glad to know what I say isn't true. Welcome to the forum. Would Google look at completely different parameters when evaluating pages and sites for SERPs, PR, quality score, ad pricing, etc.? No. Is mindshare or brand worth something? The answer is yes. Remember, Google is one hand clapping - figure out who the other hand is.
No, Google would look at different parameters if PR became invisible. But PR is currently the only parameter webmasters know for sure when looking at a single web page. We have nothing left if we cannot see the PR anymore. Regards
There are other parameters, or results of those parameters, like minimum bids in AdWords, SERPs, image search, etc.
Actually that makes some sense...
- Yahoo held warrants to purchase 3.7 million shares of Google under an agreement struck in 2000.
- Google issued 1.2 million shares to Yahoo in June 2003 pursuant to a conversion provision in the deal.
- In 2004 Google agreed to issue 2.7 million shares of common stock to Yahoo! in exchange for a perpetual license for the Overture pay-per-click and bidding systems patent.
It was murder going through this big ass thread only to find 2 people who commented on the actual question. PR sucks, but brainstorming about how to build a new (and better) system is fascinating. Although it would have to be attached to a larger site (like Digg or DP [hint, hint]) for it to gain any momentum at all.

The above projection is the most realistic I have seen in this thread, but I think still a bit short. Bandwidth won't be your problem, I don't think. The biggest issue will be your database and how to effectively sort through the millions of records on demand. Google claims to run its "robot" crawl servers on old, cheap CPUs. I think the easiest way would be to have a network of 20-30 separate servers that crawl and index links only. From there they can just report their findings to 2-3 2nd-tier servers, which report daily to the main one (rough sketch below).

I'll donate some time and a server (shared only, sorry) or 2 to the cause, but I'm honestly not that interested in just re-inventing the wheel. I think there might be a better way of doing PR instead of just the straight # of links, which can be easily corrupted. This is the exact problem Google is having with it. As far as PR is concerned... I say dump the public version and keep it internal. After a year or 2 it will actually be worth something for a change.
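Roughly what I mean by the crawl tier, as a minimal Python sketch (stdlib only; the aggregator endpoint and the tab-separated wire format are made up for illustration, not a real API):

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute link targets from <a href=...> tags."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl_page(url):
    """Fetch one page and return the outgoing links found on it."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    extractor = LinkExtractor(url)
    extractor.feed(html)
    return extractor.links

def report_edges(aggregator_url, edges):
    """Push a batch of (source, target) link pairs to a 2nd-tier server.
    The endpoint URL and wire format here are placeholders."""
    body = "\n".join(f"{src}\t{dst}" for src, dst in edges).encode("utf-8")
    req = urllib.request.Request(aggregator_url, data=body, method="POST")
    urllib.request.urlopen(req, timeout=10)

# one crawl cycle on a worker node
edges = [("http://example.com/", link) for link in crawl_page("http://example.com/")]
report_edges("http://tier2.example.net/report", edges)  # hypothetical aggregator
```

The point is that the cheap worker boxes only fetch and extract; all the heavy sorting and merging happens on the 2nd-tier and main servers.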
Thanks - this was the first reply to my thread which contains useful suggestions! I will set up a homepage and a forum for the project so it can be made more popular and the discussion can be more concrete.

The aim of the project is:
Step 1: crawl the net in order to calculate the value of a single web page (it should be a value which is relevant to the Google SERPs) - see the sketch below.
Step 2: when step 1 works well, it may grow into a real search engine.

I think the actual database is not the problem - I was project manager for projects with databases bigger than 500 GB and implemented a full text search in them. For step 1 it can be handled easily. It's more difficult for step 2 because of the speed (due to many aspects like index size, location of the server, etc.).

Just for "marketing purposes" I would call it something like "Open Search Engine" - this is more interesting than "Project for calculation of a web page value". I am thinking of indexing by "shared resources" - much like the SETI@home project... What do you think?
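For reference, the page value in step 1 would presumably start from something like the classic PageRank power iteration from the original Brin/Page paper. A minimal in-memory Python sketch (a real implementation would work off the crawled link database, not a dict):

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict mapping each page
    to the list of pages it links to."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in graph.items():
            known = [t for t in targets if t in new_rank]  # skip uncrawled pages
            if not known:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(known)
                for t in known:
                    new_rank[t] += share
        rank = new_rank
    return rank

# toy link graph: a -> b, a -> c, b -> c, c -> a
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))
```

The algorithm itself is simple; the hard part is exactly what was said above - doing this over millions of records instead of a toy dict.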
Now this sounds actually interesting. If you really do come up with a way to do it, I'm sure to install a client and keep it running for some time. Too bad "Open PageRank" would get you sued.
Well, not really. 100 GB of bandwidth a month is not expensive - for step 1 the server does not have to be very fast at all. And if "shared resource crawling" can be implemented, the costs are quite low. Though it's some work to implement it... It's the same with all "Open" (or open source) projects - it's like Linux and MySQL - they put a lot of work into it and finally they got a return... I think this would be something like the first open web project?
Actually it is updated live daily for Google employees. Katulago, you did not receive a serious reply due to your choice of title. It is nobody's fault but your own. If you did not want all the BS responses, a better title would have been "I want to compete with well known search engines" or "Looking for others to build a new page citation search engine". But when you chose to use hysteria in your title to draw attention, then you should be cognizant of the fact that you would receive hysterical replies... Just my 1.5 cents, adjusted for inflation.
Yes, you are absolutely right. I may start a new thread with a better title. Though I do not want to compete with well known search engines - actually the intention is just to calculate something like a PR replacement. (Building a SE is step 2, which is not in focus right now.)
Thank you, and good luck with that - however your DocRank (free to take and use) will be competing with PageRank, lol. Peace!
I'm in - PM sent in the near future, and you have your 1st project developer. I would suggest a very thorough game plan be in place before the 1st line of code gets written.

This ranking system would have to be better than what is currently in play with Google. Actually, that shouldn't be too hard, as PR is basically just a weighted sum of link value across the net. Bad communities are blocked, but that might not be a PR thing as much as a search engine thing.

Going SourceForge would align us with 100's of top notch developers. Keeping it open source would allow our ranking system to be used by any search engine that wanted it. I think nobody uses PR in their algo currently (except Google)... at least not the internal one which is constantly updated. I'm sure that smaller engines like Gigablast & Mamma, as well as the multitude of meta search engines like Jux2 & QueryThe.Net (shameless plug), would be happy to implement this type of ranking system in their results.

On a side note, we could also include the # of links as separate data that could be extracted from this project, since we will already have this in our db (see the snippet below). Can't wait to help out getting this off the ground.
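On the # of links point: raw in-link counts fall out of the same link graph used for the ranking pass, with no extra crawl needed. Assuming the same dict-of-lists graph as in the earlier sketch:

```python
from collections import Counter

# in-link counts derived from the same dict-of-lists link graph
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
indegree = Counter(dst for targets in graph.values() for dst in targets)
print(indegree.most_common(10))  # top 10 most linked-to pages
```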
I would do that only in case Google stops exporting the PR - so it does not compete... "DocRank" is ok, but I thought of something more "powerful"... what about "PagePower"? Anybody have a cool name?
That's like saying that Linux has to compete with Windows. Search is a big ass pie and there are enough pieces to go around. Seeing that PR is hoarded by G and not shared, suppose this open version is picked up and used by Yahoo! - I guess you won't be loling then. Remember, there is about 35% of search engine market share that is not associated with PR in any way, shape or form.
Great to get more quality feedback. Thanks! It's not the time to write any line of code yet - you are right. Nobody can use the PageRank algorithm itself because it is protected by patents, so smaller search engines might be happy to get a good algorithm that works. I think this would be ok - we definitely need some supporters (developers, webmasters and probably sponsors) to get this going. I sent some details by PM. The first step is to find a good name for it... any ideas?