This idea has been done by some large search engines, but not quite like this. A search engine that webmasters can download, which crawls and ranks pages and stores the data on their own site. Then the script that produces the results pulls from a list of databases on multiple servers where webmasters have installed the script and submitted their database. So if people A through Z put the search engine script on their sites, each one builds a database. Then they use a script that looks at everyone's database, so the large amount of data is spread over many databases. Some search engines leech off the big search engines, but not like this, where anyone can join. I guess hacking might be a problem, but it could be closed source or something. This would make it easier for one search box to cover a lot of terms. People could even restrict their databases so no two nodes are sharing the same data... What would the pros and cons of this be? Would it work? Would people who could host a database actually do it?
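Roughly what I'm picturing for the "results script" part, just as a sketch: the node URLs and the /search?q=... JSON endpoint are made up for illustration, since each node would really expose whatever the downloadable engine ships with.

```python
# Sketch: query every registered node's database and merge the ranked hits.
# The node URLs and the /search JSON API below are hypothetical.
import json
from urllib.request import urlopen
from urllib.parse import quote

NODES = [
    "http://site-a.example/search",
    "http://site-b.example/search",
    "http://site-c.example/search",
]

def query_node(base_url, term):
    """Ask one webmaster's database for results; skip nodes that are down."""
    try:
        with urlopen(f"{base_url}?q={quote(term)}", timeout=5) as resp:
            return json.load(resp)  # expected: [{"url": ..., "score": ...}, ...]
    except Exception:
        return []

def federated_search(term):
    """Merge hits from every node, keeping the best score per URL."""
    best = {}
    for node in NODES:
        for hit in query_node(node, term):
            url, score = hit["url"], hit["score"]
            if score > best.get(url, 0.0):
                best[url] = score
    return sorted(best.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    for url, score in federated_search("distributed search")[:10]:
        print(f"{score:.2f}  {url}")
```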
Distributed crawl and indexing isn't a new concept; I've seen reports of people working on such things. In addition to the complexity of such a task, you run into the problem of bias. I'd be happy to join your network and inject my poisoned database that has my sites ranking at the top.