My problem started when I decided to use the "Global Translator" plugin for WordPress, which adds Google translation. Everything was great until my hosting account got suspended for overloading the CPU: Google was crawling the site, with all its content translated into 13 languages, far too often. Moving to a dedicated server didn't help either, because Google was still hitting the server so hard that it caused timeouts half the time (on my other sites). I added a database cache plugin to my blog and set the translator to cache every 8 hours (the default is 3), but sometimes it says "translation engine is unavailable" (while it does work on other blogs). It seems to work for a few days after installation, then the translation becomes unavailable. Could it be that Google is blocking me? My question is: how can I prevent Google from crawling my site so often? I have the "revisit" meta tag, but it's obviously disregarded when it comes to crawling the translated pages, since they don't have it. Any ideas?
The "revisit" <meta> tag is not supported by any search engine. You can request that Googlebot reduce its crawl rate through the Webmaster Tools console. See http://www.google.com/webmasters/ if you haven't signed up for this service yet.
rainborick gave you the correct information. In Google Webmaster Tools you can set the crawl rate to "slow." You may also want to get the XML sitemap plugin for WordPress. I'm not a blogger, so I'm not sure whether the auto-generated sitemap lists the last modification date. If it does, that may help direct Googlebot to your new pages instead of continually re-indexing the old ones.
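For reference, a sitemap entry that carries a last-modification date looks roughly like this (the URL and date below are made-up placeholders, not taken from the plugin's actual output):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/some-post/</loc>
    <!-- lastmod tells crawlers when the page last changed -->
    <lastmod>2009-01-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

If the plugin emits `<lastmod>` entries like this, Googlebot can tell which pages actually changed rather than re-crawling everything.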
Put this at the top of your robots.txt:

User-agent: *
Crawl-delay: 5

This will slow the crawl rate for all bots. I'm sure, though, that it can also be applied specifically to Googlebot.
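A sketch of a robots.txt that targets only Googlebot while leaving other crawlers unrestricted (note, though, that Googlebot is documented to ignore the Crawl-delay directive, so the Webmaster Tools crawl-rate setting mentioned earlier is the reliable route for Google specifically):

```text
# Delay requests from Googlebot only
User-agent: Googlebot
Crawl-delay: 5

# All other crawlers: no restrictions
User-agent: *
Disallow:
```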