Off the top of my head: 1) Prevent hotlinking (you can block this in .htaccess files) 2) enable compression for your scripting language (PHP, etc.) 3) use smaller images, or try to limit image use in your layout, etc.
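For point 1, here's a minimal hotlink-blocking sketch for .htaccess, assuming Apache with mod_rewrite enabled (replace example.com with your own domain):

```apache
RewriteEngine On
# Allow requests with an empty Referer (direct visits, some proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests coming from our own domain (example.com is a placeholder)
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Refuse image requests from anywhere else with a 403
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```

Empty referers are allowed on purpose, otherwise you'd block visitors whose browsers or privacy tools strip the Referer header.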
Host images on other sites and then link them into your site, or if you host videos, try uploading them to YouTube or Megavideo etc. and embedding them.
Sometimes good robots are also a problem. One of my dynamic sites was getting hit really hard by Google, so I used Google Webmaster Tools to slow down the crawl rate.
OK, for Google you can do it from Google Webmaster Tools. But for other bots, you can set the crawl rate in your robots.txt like: User-agent: * Crawl-delay: 60. It'll slow down the speed at which the bots can index your site; this forces them to pause a minute between every fetch.
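Laid out properly, that robots.txt (placed at your site root) looks like:

```text
# /robots.txt — applies to all crawlers
User-agent: *
# Ask bots to wait 60 seconds between fetches
Crawl-delay: 60
```

One caveat: Crawl-delay is a non-standard directive. Bing and Yahoo honor it, but Googlebot ignores it, which is why Google's rate has to be set through Webmaster Tools instead.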
You can gzip data on the fly, cache data with Expires headers, move CSS into its own files to cut down on HTML output, etc.
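If you're on Apache, both of those can be done in .htaccess. A rough sketch, assuming mod_deflate and mod_expires are available (the content types and cache lifetimes are just examples to tune for your site):

```apache
# Compress text-based responses on the fly
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Tell browsers and proxies how long they may cache static files
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
</IfModule>
```

Compressing already-compressed formats like JPEG or PNG isn't worth it, which is why only text types go through DEFLATE here.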
Group all your images together and ban all bots from crawling the entire group of folders. Do the same with any files you don't want indexed by search engines.
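In robots.txt that's just a few Disallow lines; the folder names below are placeholders for wherever you keep your grouped files:

```text
User-agent: *
# Keep crawlers out of static asset folders (example paths)
Disallow: /images/
Disallow: /css/
# And out of anything you don't want indexed
Disallow: /private/
```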
The biggest problem with the robots.txt file is that a robot can ignore it if it wants to... So if that's happening, it's best to ban the user agent in .htaccess or the site's .conf file.
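A minimal .htaccess sketch for that, again assuming mod_rewrite ("BadBot" is a placeholder for the offending User-Agent string from your access logs):

```apache
RewriteEngine On
# Match the misbehaving bot's User-Agent, case-insensitively
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
# Refuse every request it makes with a 403 Forbidden
RewriteRule .* - [F]
```

Unlike robots.txt, this is enforced by the server, so the bot can't opt out, though a truly hostile bot can still fake its User-Agent, at which point you'd block by IP instead.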