That is an awful lot of bandwidth for Slurp! Anyhow, a Crawl-delay works fine in your robots.txt; I have used it for months on a few sites.
A client of mine had the same problem: he went over his allotted bandwidth, with about 30% of it taken by Slurp spiders. I did some searching and edited the robots.txt, and it took a couple of days, but it's working.
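For anyone finding this later, the rule goes in the robots.txt at your site root. A minimal example targeting Yahoo's crawler looks something like this (the 10-second delay is just a sample value; tune it to how much crawling your server can handle):

User-agent: Slurp
Crawl-delay: 10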