What percentage of your forum's traffic/bandwidth is consumed by search engine spiders? They seem to be banging on mine all the time. Not just Googlebot and Slurp but a whole range of others too. I'd even go as far as saying that close to 30-40% of my traffic is SE spiders. Is it worth looking into ways to cut this down? I notice some of the bots revisit months-old threads, which seems like a bit of a waste when you have thousands of them.
If they want to be there that much, let them, I say. Are you hurting from bandwidth charges? Is your site loading slowly because of bots? If not, then let them crawl. Google Webmaster Tools lets you slow down the crawl rate if you are registered. You can also submit sitemaps, which may help direct the bots to the pages you want indexed. And consider using nofollows and/or a robots.txt to keep them away from pages that are useless to them. Bots will follow every link on the page, and normally that's a waste. An example is the "reply" button, which just takes them to a register or no-permission page. It's easier to restrict them from crawling those URLs or to add a nofollow to those types of links (see the robots.txt sketch below). There's lots you can do IF you feel they are negatively impacting your site or results.
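For instance, here's a minimal robots.txt sketch for blocking those dead-end pages. The paths below are hypothetical examples based on a typical vBulletin-style forum; check your own forum software for the actual URLs of its reply/register/login pages before copying anything:

# Minimal robots.txt sketch -- hypothetical vBulletin-style paths,
# adjust to match your own forum software's URLs.
User-agent: *
Disallow: /newreply.php   # the "reply" button target (no-permission page for bots)
Disallow: /register.php   # registration form
Disallow: /login.php      # login form
Disallow: /search.php     # per-session search results
# Crawl-delay is honored by some spiders (e.g. Slurp) but NOT by
# Googlebot -- for Google, set the crawl rate in Webmaster Tools.
Crawl-delay: 10

Note the difference between the two approaches: a Disallow rule actually stops the spider from fetching the page, which is what saves you bandwidth, while rel="nofollow" on a link only tells the engine not to follow it or pass value through it, and bots may still crawl the URL if they find it elsewhere.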
Let them be! I also get crawled a lot by them, but I also get a lot of Google traffic, so it pays off. Never try to limit a search engine on your site, as it could affect your rankings.