Hello, I really need your help. I am running out of bandwidth. Every day, all day long, about 90 Google robots scan my site. The site runs on the Mambo CMS and I have created this robots.txt:

User-agent: *
Disallow: /administrator/
Disallow: /downloads/
Disallow: /bridges/
Disallow: /cache/
Disallow: /components/
Disallow: /editor/
Disallow: /help/
Disallow: /images/
Disallow: /includes/
Disallow: /language/
Disallow: /mambots/
Disallow: /media/
Disallow: /modules/
Disallow: /templates/
Disallow: /installation/

Is there anything I can do to optimize my robots.txt file in order to save some bandwidth? Google uses approximately 6.5 GB of bandwidth every month and my monthly limit is 15 GB. I have searched a lot in this forum and other forums but found no solution at all. I appreciate any help. Thank you.
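Before changing robots.txt, it may help to measure which crawlers are actually eating the bandwidth. Below is a minimal sketch that sums bytes sent per user agent from an Apache-style combined access log; the log path is a hypothetical example, so adjust it to wherever your host stores the logs.

```python
import re
from collections import defaultdict

# Hypothetical path -- point this at your real access log.
LOG_PATH = "/var/log/apache2/access.log"

# Combined log format: ... "METHOD /path HTTP/x.x" status bytes "referer" "user-agent"
LINE_RE = re.compile(r'"\S+ \S+ \S+" \d{3} (\d+|-) "[^"]*" "([^"]*)"')

bytes_by_agent = defaultdict(int)

with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        sent, agent = match.groups()
        if sent != "-":
            bytes_by_agent[agent] += int(sent)

# Print the heaviest user agents first, in megabytes.
for agent, total in sorted(bytes_by_agent.items(), key=lambda x: -x[1])[:15]:
    print(f"{total / 1024 / 1024:8.1f} MB  {agent}")
```

If the heaviest agent really claims to be Googlebot, the next step is to verify the IPs, as discussed in the reply further down.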
If I set Google's crawl rate to slow, can I change it back to normal on the 1st of next month, or do I have to wait out the 90-day period? Please explain. Thanks
I would make absolutely, 100% sure these are REAL Google IPs and not fake ones. Google generally doesn't crawl that fast; the fastest I have seen it crawl is about 5-10 pages per second. If your site cannot handle that, look to upgrade or tweak it. Unless you have carefully resolved the IPs, it is quite likely a spammer, a content scraper, or just some rogue bot pretending to be Google. Disallowing things is bad, because you want to get indexed, and bad bots just ignore robots.txt anyway. But there are many, many ways to stop bad bots. Hope that helps.
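To check whether an IP really belongs to Googlebot, you can do a reverse DNS lookup and then confirm that the forward lookup points back at the same address; genuine Googlebot hosts resolve to googlebot.com or google.com. A minimal sketch of that forward-confirmed reverse DNS check (the IP at the bottom is just a placeholder, take one from your own access log):

```python
import socket

def is_real_googlebot(ip):
    """Forward-confirmed reverse DNS check for a suspected Googlebot IP."""
    try:
        # Reverse lookup: IP -> host name.
        host, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False
    try:
        # Forward lookup: host name -> IP, must match the original address.
        return socket.gethostbyname(host) == ip
    except socket.gaierror:
        return False

# Example usage: replace with an IP taken from your log.
print(is_real_googlebot("66.249.66.1"))
```

If the check fails for most of those 90 "Google" bots, you are dealing with impostors and can safely block them at the server level without hurting your indexing.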