That's good advice, same as the earlier advice about the robots.txt. Also, this image is 87KB and this image is 20KB. I'm pretty sure you could reduce the size of the images quite a bit without sacrificing too much detail. The second image could be reduced a lot because it has no detail, only flat colors. Since you haven't even bothered to create a robots.txt file after that advice was given some time ago, I think maybe you don't really care... anyway, good luck.

Cheers
James
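P.S. For what it's worth, here's one quick way to shrink those images. This is just a rough sketch using the Pillow library, and the file names are placeholders for yours:

```python
from PIL import Image  # Pillow: pip install Pillow

# Photo-style image: re-save as JPEG at a moderate quality setting.
photo = Image.open("photo.jpg")  # placeholder file name
photo.save("photo_small.jpg", "JPEG", quality=80, optimize=True)

# Flat-color image: quantize to a small palette; PNGs with only a
# few colors compress very well.
flat = Image.open("banner.png")  # placeholder file name
flat.quantize(colors=64).save("banner_small.png", "PNG", optimize=True)
```

JPEG at quality 80 usually looks fine, and a 64-color palette should be plenty for an image that's nothing but flat colors.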
Thanks a lot!!!! I really don't know what to do (whether to let Google do whatever it wants or not). For now I have the bandwidth to spare, so I won't create a robots.txt... I've looked at the crawl stats (pages crawled per day) and the average number of pages crawled per day is OK, and it's from Google.
Yeah, I wouldn't worry about it too much. I've had a similar problem for the last 5 months or so. I have a search site with about 120 indexable pages on it; the results pages have noindex tags on them but are not restricted by robots.txt. My average kilobytes per day used to be around 1,000. My average pages crawled per day jumped from around 100 to this...

[crawl stats chart]

It hasn't let up yet. I considered blocking the results pages via robots.txt to see if it helped, but have decided against it for now. As long as the site isn't slowing down because of it, I guess I can live with it. Current crawl stats (keep in mind that this site only has 120 pages of indexed HTML and a few small images... the entire site, including results pages, weighs under 15MB):

[crawl stats chart]

So don't feel so bad.

Cheers
James
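P.S. For reference, the setup I described looks something like this; the /results/ path below is a made-up example, not my actual URL:

```html
<!-- On each search-results page: the noindex tag tells engines not
     to index the page, but they still fetch it to see the tag -->
<meta name="robots" content="noindex">
```

The robots.txt alternative I considered would stop the fetches entirely:

```
# robots.txt: block crawling of the results pages (hypothetical path)
User-agent: *
Disallow: /results/
```

The catch is that a page blocked by robots.txt is never crawled at all, so any noindex tag on it would never be seen.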
You might want to check your raw log files instead of relying on the numbers from your stats package. Lately there have been a few scrapers doing the rounds identifying themselves as Googlebot.
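One way to tell them apart: genuine Googlebot requests come from IPs whose reverse DNS ends in googlebot.com or google.com, and whose forward lookup resolves back to the same IP. A rough Python sketch of that check (the sample IP is just for illustration):

```python
import socket

def is_real_googlebot(ip):
    """Return True if `ip` passes the reverse-then-forward DNS check
    for a genuine Googlebot address."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS lookup
    except socket.herror:
        return False  # no reverse DNS record at all
    if not host.endswith((".googlebot.com", ".google.com")):
        return False  # claims to be Googlebot, hostname says otherwise
    try:
        # Forward-confirm: the hostname must resolve back to the same IP
        return socket.gethostbyname(host) == ip
    except socket.gaierror:
        return False

print(is_real_googlebot("66.249.66.1"))  # sample IP pulled from a log
```

Anything that fails the check is a scraper you can safely block.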
That's incredibly unfair to you; that's a lot of bandwidth being wasted. I wish I could help, but I know some of the nice people here on this forum will.