I've been using the very helpful web crawl error pages to eliminate most of our site's 404s using robots.txt and redirects. The only thing that bothers me is how infrequently these stats update. They seem to update once every few days (a week?), and they always lag a few days behind even after an update; e.g., the last stats update was on Nov 15, and it only brought the errors up to Nov 11. I don't see any pattern in the update frequency yet, though I've only been watching them closely for two weeks. If you track these stats closely too, what update period do you see?
I also don't see any set time frame. I could be waiting a week or even four weeks for an update that isn't really current, because most of the errors are backdated.