OK, most of my sites are getting visited by Googlebot daily, but does anybody know why Googlebot sometimes crawls 300,000 bytes and in other cases just 10,000 bytes? PS: Is it useful to have a robots.txt file? I haven't used one until now and I'm still getting spidered. Regards and thanks in advance. Gilles
How much gets spidered on any given visit is fairly unpredictable. There's no need to worry about it as long as you get spidered regularly and all your pages are covered. A robots.txt is mostly useful for telling crawlers not to index certain pages or URLs.
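For example, a minimal robots.txt sketch, placed at the root of your domain (the /private/ and /temp/ directory names are just placeholders for whatever you want kept out of the index):

User-agent: *
Disallow: /private/
Disallow: /temp/

Leaving the Disallow lines empty (or having no robots.txt at all, as you do now) simply means everything may be crawled.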
Are there any good free statistics tools that show which individual pages are being spidered? My current statistics only give me that information for the site as a whole. Thanks in advance. Gilles
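If you have access to your raw server access log, you can pull that information out yourself. Here's a minimal sketch, assuming a standard Apache combined-format log file named access.log (adjust the filename and the "Googlebot" user-agent match to your own setup):

# Count Googlebot hits per URL from a combined-format access log.
from collections import Counter

hits = Counter()
with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1].split()   # e.g. GET /page.html HTTP/1.1
        if len(request) >= 2:
            hits[request[1]] += 1

for url, count in hits.most_common(20):
    print(count, url)

That prints the twenty pages Googlebot requested most often, which is usually enough to see whether all of your important pages are being picked up.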