I'm curious if anyone knows of any spiders that support HTTP 1.1? I know Googlebot doesn't, and neither does Yahoo's Slurp bot; both are HTTP 1.0 only. It seems like search engine spiders would want to be HTTP 1.1 compliant, since it could cut their bandwidth costs tremendously. One of the nice things about HTTP 1.1 is that it supports gzip compression for web pages. Text files (HTML files) typically compress down about 5 to 1, so it would save quite a bit of bandwidth. Of course, the real reason I want it is because *I* want to save on bandwidth when the spiders visit.
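For anyone curious, here's a minimal sketch in Python of what the client side looks like, just to illustrate the idea: the spider advertises gzip in its Accept-Encoding header and decompresses the body only if the server actually sent it compressed. The URL is a placeholder, not a real crawl target.

    import gzip
    import urllib.request

    # Advertise gzip support the way a gzip-aware spider would.
    url = "http://example.com/"  # placeholder URL for illustration
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})

    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        # Only decompress if the server honored the request.
        if resp.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)

    print(f"Received {len(body)} bytes of HTML")

- Shawn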
Something I overlooked this morning when posting about the new Googlebot is that the new test Googlebot described here is now running HTTP 1.1, so it looks like it's more than just a JavaScript-reading upgrade. - Shawn
Shawn, you can also check this website out... It will give you tons of information on the different bots out there, as well as track them when they crawl your website. http://bots.pcpropertymanager.com/index.php Hope it helps. - Wayne
Do you think they aren't adding support for it because of the CPU overhead on their end to decompress the gzip'd pages?
Doubtful... I imagine they will support it at some point. The tiny CPU overhead would be well worth it to save roughly 80% of their bandwidth.
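If you want to sanity-check that figure yourself, just gzip a typical HTML page and compare sizes. A quick Python sketch (example.com is a placeholder page):

    import gzip
    import urllib.request

    # Fetch a page uncompressed, gzip it locally, and compare sizes
    # to estimate the bandwidth savings. Placeholder URL for illustration.
    with urllib.request.urlopen("http://example.com/") as resp:
        html = resp.read()

    compressed = gzip.compress(html)
    savings = 100 * (1 - len(compressed) / len(html))
    print(f"{len(html)} bytes -> {len(compressed)} bytes ({savings:.0f}% saved)")

Typical HTML lands around 5 to 1, which is where the ~80% number comes from.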