One of the domains on my shared hosting account is inaccessible to Googlebot, but my other domains on the same shared IP are still being crawled without any problems.

In Google Webmaster Tools for the problem domain, Fetch as Googlebot reports "Missing robots.txt", and every crawl attempt returns "robots.txt unreachable". The server logs show no record of Googlebot ever reaching the site.

I moved the domain to a dedicated IP this morning, and the problem went away immediately. What was causing Googlebot to be blocked on that ONE particular domain, but not the others on the same IP?
If the other websites on the same shared IP could still be crawled, the block wasn't at the host or IP level; the problem was with that particular website. It may simply have been timing out when Googlebot requested robots.txt.
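One way to narrow this down is to request robots.txt yourself with a Googlebot-style User-Agent and a short timeout and see whether the server answers at all. Here is a minimal sketch in Python, with example.com as a placeholder for the affected domain; note that spoofing the User-Agent string won't reproduce a block that targets Google's actual crawler IPs.

```python
# Minimal sketch: check whether robots.txt responds before a short timeout.
# "example.com" is a placeholder -- substitute the affected domain.
import socket
import urllib.error
import urllib.request

URL = "https://example.com/robots.txt"
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)


def check_robots(url: str, timeout: float = 10.0) -> None:
    request = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            body = response.read()
            print(f"{url}: HTTP {response.status}, {len(body)} bytes")
    except urllib.error.HTTPError as err:
        # A 4xx/5xx still proves the server answered the request.
        print(f"{url}: HTTP {err.code}")
    except (urllib.error.URLError, socket.timeout) as err:
        # No response at all -- consistent with "robots.txt unreachable".
        print(f"{url}: unreachable ({err})")


if __name__ == "__main__":
    check_robots(URL)
```

Any HTTP status, even a 404, means the server responded (Google treats a missing robots.txt as "crawl everything"), so an "unreachable" error points to a connection-level failure such as a timeout, a firewall rule, or a misconfigured virtual host rather than a missing file.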