I'm assuming you own this search engine, since the addresses are so close to one another in Illinois? Just curious.
As long as search engines like Scour use APIs to pull their results from Ask, Google, and Yahoo, those results will always be the same old dull, boring, and irrelevant ones. In essence, you are still searching Ask, Yahoo, and Google; the only new thing here is the wrapper the results are presented in. If you really want to start something new, complete with fresh and clean search results, then write your own parsing agents, establish your own rules for indexing and ranking, and give yourself plenty of server space to handle your own crawls of the web. Reinventing the wheel through APIs only serves to reproduce the same unfair listings across the web.
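To make that concrete, here is a rough, purely illustrative sketch in Python (the seed URL, page limit, and tokenizing rule are all made up for the example) of what "writing your own parsing agent" means: fetch pages yourself, apply your own parsing and indexing rules, and build your own inverted index, rather than re-wrapping another engine's API results.

    # Minimal, hypothetical crawler + inverted index, standard library only.
    # Illustrates "own parsing agent + own index" instead of wrapping another engine's API.
    import re
    import urllib.request
    from collections import defaultdict, deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class PageParser(HTMLParser):
        """Pull out link targets and visible text (your own parsing rules live here)."""
        def __init__(self):
            super().__init__()
            self.links, self.text = [], []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

        def handle_data(self, data):
            self.text.append(data)

    def crawl(seed_urls, max_pages=25):
        """Breadth-first crawl that builds an inverted index {term: set of urls}."""
        index = defaultdict(set)
        frontier = deque(seed_urls)
        seen = set(seed_urls)
        fetched = 0
        while frontier and fetched < max_pages:
            url = frontier.popleft()
            try:
                html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except Exception:
                continue  # unreachable or non-HTML page; skip it
            fetched += 1
            parser = PageParser()
            parser.feed(html)
            # Indexing rule (yours to define): lowercase word tokens from the page text.
            for term in re.findall(r"[a-z0-9]+", " ".join(parser.text).lower()):
                index[term].add(url)
            for link in parser.links:
                absolute = urljoin(url, link)
                if absolute.startswith("http") and absolute not in seen:
                    seen.add(absolute)
                    frontier.append(absolute)
        return index

    if __name__ == "__main__":
        idx = crawl(["https://example.com/"])
        print(sorted(idx.get("example", set())))

Everything above the crawl loop (what counts as a link, what counts as a term, how pages are ranked later) is exactly the part you give up when you just consume someone else's API.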
Large companies rarely revisit older, more proven approaches. The trick to getting better results, in many cases, is actual review, by human eyes, of the content being presented. Large companies like Google won't be willing to spend the money on human intelligence to assist with their index listings. Engines like Google have become too dependent on automation, and quality control cannot be done by a machine alone; parts and pieces have to be looked at and inspected. Google has relied so heavily on Googlebot's filters that those very filters may be the eventual undoing of Google itself. Googlebot gets maintenance, but only with regard to the bot itself; rarely, if ever, are human eyes involved in reviewing the sites that Googlebot brings in.

A certain level of quality could be achieved if these search engines would quit indexing anything and everything they come across in their inane race to see who has the most pages indexed. If they parsed and indexed only the pages that mattered, and kept them separate from all the internet trash out there, you might see a great improvement in the SERPs overall.
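As a rough, purely illustrative sketch in Python (the signals and thresholds are invented for the example, not anything Google actually uses), "index only the pages that matter" amounts to a quality gate in front of the indexer: score each page with cheap automated signals, index the clear winners, drop the obvious junk, and queue the borderline cases for human eyes.

    # Hypothetical quality gate in front of an indexer: automated signals first,
    # borderline pages queued for human review, obvious junk dropped outright.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Page:
        url: str
        text: str
        inbound_links: int = 0

    @dataclass
    class Verdict:
        index: List[Page] = field(default_factory=list)
        human_review: List[Page] = field(default_factory=list)
        rejected: List[Page] = field(default_factory=list)

    def quality_score(page: Page) -> float:
        """Toy score: longer, better-linked pages score higher (signals are made up)."""
        length_signal = min(len(page.text) / 2000.0, 1.0)   # enough real content?
        link_signal = min(page.inbound_links / 20.0, 1.0)   # does anyone cite it?
        return 0.7 * length_signal + 0.3 * link_signal

    def triage(pages: List[Page], accept=0.6, reject=0.2) -> Verdict:
        """Auto-index the clear winners, drop the clear junk, send the middle to humans."""
        verdict = Verdict()
        for page in pages:
            score = quality_score(page)
            if score >= accept:
                verdict.index.append(page)
            elif score < reject:
                verdict.rejected.append(page)
            else:
                verdict.human_review.append(page)
        return verdict

    if __name__ == "__main__":
        sample = [Page("http://example.com/article", "word " * 3000, inbound_links=15),
                  Page("http://example.com/spam", "buy now", inbound_links=0)]
        result = triage(sample)
        print(len(result.index), "indexed,", len(result.human_review), "for review,",
              len(result.rejected), "rejected")

The human_review queue is the point of the exercise: the machine only decides the easy cases, and people look at the rest before anything questionable reaches the index.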