Freshbot went out years ago. Back then pages were more static, and if you changed your site the chances were you changed your homepage too, so Google could check the date on the index file and use that to decide whether it needed to crawl. With the rise of database-driven sites, though, the index file can be ancient even on a site that's updated every hour or more often. So, with that in mind, Google dropped the separate freshbot and deepbot concepts and just unleashed a stack of Googlebots. They use XML sitemaps to make it easier for us to notify them of new content and for them to see that it's there, but XML sitemaps aren't widely adopted (not really). Most of all, Googlebot looks at how often it finds changes: if it's often, it'll schedule revisits more frequently; if not, it might make the visits less frequent.
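
For anyone who hasn't looked at a sitemap, here's a rough sketch of what one boils down to. This is just an illustration I've put together, not anything from Google: the URLs and dates are placeholders, and the only "real" bits are the standard sitemaps.org elements (urlset, url, loc, lastmod). A script like this (Python here, purely as an example) can regenerate the file whenever content changes, so the lastmod dates actually reflect how fresh each page is:

```python
# Hypothetical sketch of a sitemap generator - URLs and dates are placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: list of (url, last_modified) tuples, dates in W3C format (YYYY-MM-DD)."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod  # lets the crawler see when the page last changed
    return ET.ElementTree(urlset)

if __name__ == "__main__":
    sitemap = build_sitemap([
        ("http://www.example.com/", "2008-06-01"),
        ("http://www.example.com/blog/latest-post", "2008-06-01"),
    ])
    sitemap.write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Whether Googlebot trusts those dates is another matter - as above, it mostly seems to go on how often it actually finds changes when it visits.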