Hi all, if you've heard of Adam Lasnick, you'd better read these threads: http://www.webmasterworld.com/google/3416514.htm http://forums.digitalpoint.com/showthread.php?t=431107
This same argument has been rehashed dozens of times. Google will probably keep the toolbar PageRank because it brings them lots of publicity. I think it would be great if it updated monthly to give a more realistic representation, and I think Google will go that way rather than get rid of it. Also, centime, why would you start an identical thread over here? It will probably get merged and get you a 'red'.
Why does Google show PR on the toolbar? Simple: each time you visit a page, your browser sends Google a request, so you are giving Google more traffic/stat data than Alexa will ever get. The toolbar PR is a simple trick to find out which sites are popular.
mikey1090, they could still get the exact same info from you via the installed toolbar without showing PageRank.
From Adam's post, it seems he wants suggestions about removing the PR bar from the Google toolbar only (not removing the PageRank algorithm).
You lost me with this post. Anyway, even if they remove PR entirely or just the visible PR, we can always write an alternative tool (with a toolbar too) that uses the same algorithm and parameters to calculate (or at least estimate) PageRank. It's not a big deal. Or maybe, if PR is gone, Page Strength will be a better alternative: PS uses more quality factors than PR to determine a page's importance, like traffic, domain age, .edu and .gov links, and page visibility.
G should try to solve the issue with TBPR rather than removing it... that's not a solution to the problem.
Actually, there are no accurate tools, because of the cost of developing and running such a powerful tool on a server. An accurate tool would have to browse all the backlinks from Yahoo Site Explorer or Google Webmaster Tools, then calculate the amount of link juice passed from each page based on its PR and outbound links (eliminating pages from the same domain, nofollow links, and hundreds of other parameters and factors). That part isn't the problem; we can write a tool like this and consider all the parameters and factors we are aware of to make it very good and accurate. The problem is that some websites have thousands (or millions) of links. Imagine the amount of resources needed to run just 100 PR estimation queries at the same time... I think it's very possible to write a very good PR estimation tool (75 to 90% accurate), but it's a huge investment, and so far no one has been ready for it. (I am seriously thinking about writing one, and I am actually very close to finding a solution for the server resources problem.) Just stay tuned.
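For anyone curious what the "calculate the link juice passed from each page" step actually looks like, here is a minimal sketch of the iterative PageRank calculation such a tool would run once the backlink graph is crawled. The link graph and page names are made up for illustration; the 0.85 damping factor is the value from the original PageRank paper, and real tools would add all the extra factors mentioned above (nofollow, same-domain filtering, etc.):

```python
# Minimal PageRank iteration over a small, hypothetical link graph.
# `links` maps each page to the pages it links OUT to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

damping = 0.85                 # damping factor from the original PageRank paper
pages = list(links)
pr = {p: 1.0 / len(pages) for p in pages}   # start with uniform rank

for _ in range(100):           # iterate until the values settle
    new_pr = {}
    for page in pages:
        # Sum the rank passed in by every page that links here,
        # split evenly across that page's outbound links.
        incoming = sum(
            pr[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_pr[page] = (1 - damping) / len(pages) + damping * incoming
    pr = new_pr

print({p: round(v, 3) for p, v in pr.items()})
# → {'a': 0.388, 'b': 0.215, 'c': 0.397}
```

The expensive part in practice isn't this arithmetic, it's building `links` by fetching and parsing millions of pages, which is exactly the server-cost problem described above.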
I am not sure Google sees any problem... they get thousands of forum and blog posts every few months in anticipation of the toolbar update. There is no accurate tool because no one knows the algorithm. I guess 75-80% accuracy is possible, but that is not very good and probably on par with what is out there now (I would say iwebtool's prediction proves accurate 65-70% of the time).
I use the Google toolbar mainly because of the PageRank. If they remove it, I will have weak reasons to keep it installed, and I'm probably not the only one who would no longer need it.
There is no accurate tool at all right now; iwebtool's is crap. An accurate tool would use the Yahoo Site Explorer links and browse the backlinks one by one, then consider all the factors we are aware of to calculate and estimate the PR (80-90% accuracy is possible). An accurate tool would act exactly like a spider (Googlebot). It's the cost of running a tool like this that is the problem. Imagine a tool browsing millions of pages to count the number of outbound links per page: you would need huge resources and extremely fast servers just to do one PR estimation in a reasonable time, never mind 1000 webmasters using the tool at the same time. I explained it all in my first post.
What does a visitor need? A good quality website. A quality website can be identified only by directories, not by search engines. If a site has listings in more than 200-300 directories (both free and paid), then it would definitely be of higher quality than a site listed in 100-150 directories. Neither backlinks nor PR nor Alexa rank tells you the quality of a website; only the directories indicate it.
No offense, but this is the most inaccurate quality factor and theory I have ever heard (more directory listings = more quality). The number of directory listings can't determine quality, because numbers can be bought. Backlinks are the most accurate quality factor (when you have good quality content, you get more backlinks naturally); backlinks from .edu and .gov websites are also a good sign of quality, and TRAFFIC is a very good sign of quality too (Alexa is not 100% accurate, but it gives you an idea). Your theory needs some serious re-thinking, freewebspace!
Not an inaccurate factor. It is a more accurate factor than backlinks. I am talking about free and paid quality directories. Also, I don't want to reveal anything about what I am doing.
My intention was twofold: to focus this particular thread on the directory operators' point of view, and to inform people who probably only lurk in this forum. This discussion is not the same as the previous webmaster-only discussions; things happen when they take this particular format, because there is a Google rep participating. It might not happen right now, but for a large company, Google is extremely agile on its feet.