Well, since it appears that the big G is now bouncing API results all over the place, just like the rest of the operators, maybe some kind of moving average would be better for calculating base weight than a single API call? Rather than everyone logging in and pumping the revalidate button a few times per account, I figured something else would be a better alternative. Maybe G will just mellow out in a week or two.
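Something along these lines is what I'm picturing - a minimal sketch only, assuming the system can keep a handful of recent API readings per site; the 7-day window and the sample numbers are made up for illustration, not how Shawn's system actually works:

```python
from collections import deque

class SmoothedBaseWeight:
    """Keep a rolling window of recent API readings and use the
    average instead of a single (possibly noisy) result."""

    def __init__(self, window_size=7):
        # deque with maxlen drops the oldest reading automatically
        self.readings = deque(maxlen=window_size)

    def add_reading(self, indexed_pages):
        """Record one day's indexed-page count from the API."""
        self.readings.append(indexed_pages)

    def base_weight(self):
        """Moving average over whatever readings we have so far."""
        if not self.readings:
            return 0
        return sum(self.readings) / len(self.readings)


# Hypothetical example: six stable days, then one wild API result
weights = SmoothedBaseWeight(window_size=7)
for count in [6000, 6100, 5900, 6000, 6050, 5950, 67500]:
    weights.add_reading(count)

print(round(weights.base_weight()))  # ~14786 instead of jumping straight to 67500
```

An exponential moving average would do much the same job without storing the full window, if keeping a week of readings per site is a concern.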
(Let me try a newbie question...) Do they (Y+M) have site: commands? And can they be accessed through an API? If even one answer is 'no' then it wouldn't work.
An API for Y! has been talked about for ages... Hopefully MSN will be a little quicker in delivering one after the SE has been released.
Otherwise the system has to scrape the results for all the thousands of sites in the coop network. That would make Shawn look bad...
Yes, I see your point - that's a lot of searches, particularly as the network grows. Thinking about it, there must also currently be a lot of searches on G, given that the system validates against indexed pages only. However, I find that the coop network works very well in Y and MSN beta, and Y is much more difficult to get yourself spidered in, so it would possibly be a fair way of establishing weight if this issue could be overcome. I guess there is also the issue of whether a site is banned by G - although I guess this works in reverse as well.
Albeit seemingly inconsistent, G at the moment is the most transparent too. We can all use the commands that Shawn's system uses to verify that Shawn/others aren't cheating, and we're all bound by the exact same rules we all know. That's the beauty of G and the API. When the parameters are taken from a combination of SEs, we'll get even more questions about how the coop works, but then we won't be able to answer them anymore.
Yeah, I am not advocating using Y! and MSN. I think that some kind of moving average may stabilize the network as well. It can't be good for a site to have 6K BW one day and 67.5K the next. Using Y! and MSN is clearly not feasible.
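Purely illustrative numbers: with a simple 7-day average, six readings around 6K and one rogue reading of 67.5K work out to roughly (6 × 6,000 + 67,500) / 7 ≈ 14,800, rather than the weight leaping straight to 67.5K, and the spike washes out of the window a week later.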