As of now, Googlebot has already consumed 3.98 GB of my bandwidth for the month. Is this normal? It's been almost 5 days and Googlebot hasn't left my site. Should I be happy since Googlebot is so active on the site?
Two of my sites exceeded their bandwidth limit this month because of this ... it's way above any other month. I don't know whether it will turn out good or bad, as both sites were taken offline, and I'm not about to pay the host more just for a one-month glitch.
You should be happy; don't make any changes to drive Googlebot away. Try adding as much new content as you can and you'll get everything indexed ASAP. Googlebot just used 1 GB of bandwidth on our domain, even though everything is text content. Regards,
Wow! Bad strategy if you want to appear in the listings. If your site goes down and stays down, it will get zapped from the SEs!
Yeah, I know it's a risk, but on principle I refuse to pay to upgrade my bandwidth when the upgrade might get exceeded too; none of my sites has ever used anywhere near that amount of bandwidth in three months, let alone one. Only four days to go.
I once saw GoogleBot in a website's active users list, though that might have just been a guy named GoogleBot. I did know that bot went around compiling sweet data, though.
You can make it relax; maybe these lines in your robots.txt will help you without hurting your rankings:

User-agent: googlebot
Crawl-delay: 180
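For completeness, here's a fuller robots.txt sketch along those lines. One caveat: Crawl-delay is only a request, and Googlebot has historically been reported to ignore it (Yahoo's Slurp and msnbot do respect it), so treat this as a best-effort hint rather than a guarantee:

# robots.txt - placed at the site root; a minimal sketch
# Crawl-delay asks a compliant bot to wait this many seconds between fetches.
# Googlebot is reported to ignore this directive; Slurp and msnbot honor it.
User-agent: googlebot
Crawl-delay: 180

User-agent: Slurp
Crawl-delay: 180

User-agent: msnbot
Crawl-delay: 180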
You can ask it to visit once a week instead of whenever it likes. Just add:

<META NAME="ROBOTS" CONTENT="index, follow">
<META NAME="REVISIT-AFTER" CONTENT="1 Week">

If it is still using a large amount like that, you will need to upgrade; most hosts will charge you $0.50 per extra 1 GB of bandwidth.
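In case it helps, here's a minimal sketch of where those tags sit in a page. A caveat: REVISIT-AFTER is only a hint, and the major engines are widely reported to ignore it, so don't count on it to actually slow Googlebot down:

<HTML>
<HEAD>
<TITLE>Example page</TITLE>
<!-- standard robots directive: allow indexing and following links -->
<META NAME="ROBOTS" CONTENT="index, follow">
<!-- suggested recrawl interval; major engines largely ignore this hint -->
<META NAME="REVISIT-AFTER" CONTENT="1 Week">
</HEAD>
<BODY>
...
</BODY>
</HTML>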
Oh well, good luck. Thing is, with Yahoo exaggerating the size of their index (19 billion pages ... pffft), Googlebot is going to be on a mission to suck up as much data as possible over the next couple of months. If you get delisted because your site is down, it can take months and months and a flurry of begging-type emails to the SEs before reinclusion. Then you have to wait while your site goes through the whole sandbox thing again. If you have a site delisted, it can take 18-24 months to get back to where you were before it happened. Adding the crawl delay to your robots.txt is perhaps your best option. That, and sucking it up for the additional bandwidth.
Any reason for this? I've already added the revisit-after tag on every page. Thanks for all your input.
My directory's bandwidth also got sucked up by Google ... I exceeded my bandwidth limit 3 days ago ... I paid for another 1 GB of bandwidth, and 3 days later ... my bandwidth has been drained by Google again ...
These days, bandwidth shouldn't be an issue. If you're having difficulty coping with the robots, the only thing I can say is to look for a good hosting service that offers decent monthly bandwidth. Trying to use meta instructions or an .htaccess crawl delay can only harm you over the long haul. It's a good thing if the bots crawl you; the more, the better! If you want my personal recommendations on dirt-cheap yet powerful shared servers, feel free to PM me and I'll help you.
I have to go with the above about getting more bandwidth from a different host. Funnily enough, I'm with a host I use for trying out different things ... WordPress themes, Mambo, and the like (I don't want to break anything on my main sites). It's the *only* thing I use it for, and the site(s) aren't linked to except by a ping when I play about with WordPress. I've noticed the bandwidth getting sucked from the site by bots. Thank goodness I don't have traffic!