Is it possible to tell a bot to only spider a directory once a month? I've seen meta tags that say revisit in 30 days, but bots come daily anyway. So can you tell them when to come and when not to come?
The problem: Google and other bots get stuck in my calendar, eating up 2+ gigs of bandwidth a few times a month. I only get 5 free gigs, which is plenty without this issue. On the other hand, I do appear to have lots more pages indexed when they go through my calendar. So the hard way is to remove my robots.txt file until it gets spidered, then put it back up until the next month.
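For reference, the robots.txt I'd put back up to keep bots out of the calendar looks roughly like this (assuming the calendar lives under /calendar/ — substitute the actual path). The Crawl-delay line is honored by some bots (e.g. Bing and Yandex) but ignored by Google, so it's not a complete fix on its own:

```
# Block all compliant bots from the calendar directory
User-agent: *
Disallow: /calendar/

# Ask bots that support it to wait between requests (seconds);
# Google ignores this directive
Crawl-delay: 60
```

As far as I know there's no robots.txt directive that means "come back in 30 days" — Disallow is all-or-nothing, which is why swapping the file in and out is the only scheduling lever it gives you.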