I have a few sites that aren't in Google's index yet, but I leave them in the keyword tracker. Would it take a little load off the server, and save a few Google requests per day, to check whether a page shows up in a "site:SiteName.com" result when it is first entered? Just a suggestion; I'm not sure the payoff is worth the change.
Probably not worth it, to be honest. The extra site: query would likely add more total requests than it saves.
I was thinking each site entry would be flagged once it has passed a "site:" lookup, so the check is only run for sites that have never been found in the index. If you track 40 URLs and one of them isn't in Google's index, you'll issue 20 requests for that one URL (paging through the top 200 at 10 results per page) versus 1 for the site: check. If the SERPs for the other 39 URLs are in the top 200 and haven't moved, then about a third of the requests could be avoided. If more of the sites you're looking up aren't in the SERPs, the problem is compounded. I'm not trying to advocate a change, just having a dorky debate about it. I'm curious how many sites are looked up that aren't indexed by Google.
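To make the arithmetic concrete, here's a rough Python sketch of what I mean. All the names are made up, and I'm assuming a stable URL costs one request to confirm and a missing URL burns the full 20-page walk; I have no idea what the tracker actually does internally.

MAX_PAGES = 20  # top 200 results, 10 per page

def requests_without_flag(entries):
    # Current behavior (as I understand it): every entry pays its
    # full SERP walk, and a URL that isn't indexed burns all
    # MAX_PAGES requests finding nothing.
    return sum(MAX_PAGES if e["rank"] is None else e["pages_needed"]
               for e in entries)

def requests_with_flag(entries):
    # Proposed behavior: one site: lookup for entries never found
    # in the index; once found, the flag sticks and the site: check
    # is never issued again.
    total = 0
    for e in entries:
        if not e["indexed_flag"]:
            total += 1                 # single site: request
            if e["rank"] is None:
                continue               # not indexed; skip the walk
            e["indexed_flag"] = True   # found once, never re-check
        total += e["pages_needed"]     # normal SERP walk
    return total

# 39 stable URLs (one request each, since their page is known) plus
# one URL Google hasn't indexed yet:
entries = [{"rank": 1, "pages_needed": 1, "indexed_flag": True}
           for _ in range(39)]
entries.append({"rank": None, "pages_needed": 0, "indexed_flag": False})

print(requests_without_flag(entries))  # 59
print(requests_with_flag(entries))     # 40, i.e. roughly a third saved

That's where the "about a third" comes from: 20 requests collapse to 1, which is 19 saved out of 59.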