Googlebot has visited my site many times, but the number of files retrieved is zero. The same goes for Yahoo Slurp: they keep coming, but they get no files. I am not blocking any IP addresses, and there is no .htaccess file in the directory. This is a fairly new site (about one month old), but I am redirecting visitors there from an older domain. Why don't the spiders retrieve any files? Am I doing something wrong?
You should register the site in Google Webmaster Tools. You will then be able to see whether there is a problem with your robots.txt that prevents Googlebot from spidering your website.
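If you want a quick local check before digging through Webmaster Tools, you can paste your robots.txt into Python's standard-library parser and ask whether Googlebot would be allowed to fetch a given URL. The robots.txt content and `example.com` URL below are placeholders; substitute your own.

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt -- replace with the actual contents of yours.
# This example deliberately blocks everything, which is a common
# accidental cause of "spider visits but retrieves zero files".
robots_txt = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Would Googlebot be allowed to fetch the homepage under these rules?
allowed = rp.can_fetch("Googlebot", "http://example.com/")
print(allowed)  # False -- the Disallow: / rule blocks all crawlers
```

If this prints `False` for pages you expect to be indexed, the robots.txt is the culprit.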
I would also advise checking Google Webmaster Tools to see what the problem is. Did you buy your domain from someone else?
Yeah, this happens a lot when the files either return a 404 or the connection times out on the servers the spiders are accessing your site from, so no bandwidth is consumed. Try submitting a sitemap and monitoring your site from Webmaster Central.
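One way to test the 404 theory yourself is to send the same kind of HEAD request a crawler would and look at the status code. This is a sketch, not how Googlebot actually works: `check_status` is a hypothetical helper, and the throwaway local server below just stands in for a misbehaving host that 404s everything.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def check_status(host, port, path="/"):
    """Send a crawler-style HEAD request and return the HTTP status code."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request("HEAD", path, headers={"User-Agent": "Googlebot"})
    status = conn.getresponse().status
    conn.close()
    return status

# Throwaway local server that answers 404 to everything, simulating
# the broken-host scenario described above.
class NotFoundHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(404)
        self.end_headers()
    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), NotFoundHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status = check_status("127.0.0.1", server.server_port)
print(status)  # 404 -- a spider seeing this retrieves zero files
server.shutdown()
```

Run the same kind of request against your real host and port 80: anything other than 200 for your key pages explains the empty crawl stats.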
Thanks for your advice, everyone. After some investigating, it seems the host is the problem. I have contacted them with a support request.