I'm optimizing a site, and in the process I'm moving CSS and image files off to an HTTP server purposed for serving small static files. It has occurred to me that it would be best to simply deny bots access to all files on my static server - that way the server is only burdened with serving content to real visitors. But it also occurred to me that there may be a penalty for denying access to those files, since blocked CSS could potentially be used to hide links. Does anybody have any thoughts on denying bot access to CSS / JavaScript / image files? I can see potentially allowing access to image files so that they would show up in image searches, but that's about it.
I would use robots.txt and hope the bots follow your directions. Keep in mind it doesn't actually block access, so if a crawler chooses to ignore your rules it can still fetch the files; a bot that finds the block suspicious may check them out anyway.
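For the setup you describe (block everything, but leave images crawlable for image search), a minimal robots.txt on the static host might look like this. Note that `Allow` and per-bot sections are extensions to the original robots.txt convention, though the major search engines support them, and `Googlebot-Image` is Google's image crawler specifically:

```
# Served from the static host's document root as /robots.txt

# Let Google's image crawler fetch images so they appear in image search
User-agent: Googlebot-Image
Allow: /images/
Disallow: /

# Everyone else: stay out entirely
User-agent: *
Disallow: /
```

The `/images/` path here is just an example; adjust it to wherever your image files actually live. More specific user-agent groups take precedence over the `*` group, which is why the image crawler gets its own section.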