My site was offline for most of today (growls at webhost) and I've randomly checked the cache of some of my pages and they seem OK. How long does it take for Google to update the cache on the pages, so I know if any damage has been done? Sarah
My sites typically get crawled every other day. If crawled today, Dec 4th, I may see the fresh date late on Dec 5th, but usually it is Dec 6th before I see it. Others could be on a different schedule than mine. Shannon
Usually when it's a real webhost problem, Google gives it the benefit of the doubt for a day and just acts like it didn't see any changes on that next update. So when it is a real webhost problem you get an extra day?
Well, let's hope Google picked up on the error messages. So far it looks like it:
* didn't visit; or
* didn't cache
Now to see when it tries again... thanks for the replies Sarah
I haven't had ANY downtime with my current host since switching about a year ago or so (last November?). With my previous host, I did have such problems -- it was not unusual to have the site down for a few hours once or twice a month. I never saw it affect my rankings, though, and Google kept coming back. I think if you were down several days in a row you might have more reason to worry. I also don't know what headers would be returned to Googlebot with a site down rather than non-existent -- a server error? If so, I would assume Google can tell the difference between a site that has moved, a "no such site here" message, and a "cannot find server" scenario...
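Those three scenarios map roughly onto different HTTP-level outcomes. A minimal sketch, assuming my own labels and encoding (`crawler_view` is a hypothetical helper, and nothing here is documented Google behavior -- just how the status codes are generally read):

```python
# Rough sketch: map what a probe of a site returns to the scenario a
# crawler might infer. `None` stands for no response at all (DNS failure
# or connection refused). Labels are my own, not Google's terminology.

def crawler_view(status):
    """Classify a probe result: an HTTP status int, or None if unreachable."""
    if status is None:
        return "cannot find server"   # host itself is not answering
    if status >= 500:
        return "server error"         # host up, site broken; worth retrying
    if status in (301, 308):
        return "site moved"           # permanent redirect to a new location
    if status in (404, 410):
        return "no such page here"
    return "ok"

print(crawler_view(None))  # cannot find server
print(crawler_view(503))   # server error
print(crawler_view(200))   # ok
```

The point being: a host that is completely down produces no HTTP response at all, which looks quite different to a bot than a 404 or a 5xx from a live server.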
Yes, it would take a few days, maybe even a week or more for it to really hurt you. Less than a full day - you should be fine. Added: I've used tons of different webhosts, many really really bad ones, so I've had lots of downtime with various sites. The most frustrating downtime (in terms of SEO) was during a PR update, and while everyone was getting all excited about seeing their PR go up, I couldn't even check to see what mine was.
It will be fine if the server was unreachable. If, however, the server was reachable, it comes down to duration. The only time you will see problems is if G went after a deep page and your host was using a redirect to dump people at a common error page (which, unless the admin has thought about it, stupidly hands Gbot a 200 OK). Even after the fact, if you know which pages they were, deep link to them from the homepage until Gbot recrawls (if it's not too many - otherwise link to the primary levels in your sitemap). Cheers, JL
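JL's trap is the classic "soft error": the error page arrives with 200 OK, so the bot treats it as real content and may cache it in place of the page it asked for. A minimal sketch of a check you could run against your own error page (the function name and marker list are mine, purely illustrative):

```python
# Detect the "soft error" case JL describes: a page whose body is an
# error message but whose status code says everything is fine.
# ERROR_MARKERS is an illustrative heuristic, not a real API.

ERROR_MARKERS = ("404", "not found", "error", "page unavailable")

def is_soft_error(status_code, body):
    """Return True when an error-looking page is served with 200 OK."""
    if status_code != 200:
        return False  # a real 4xx/5xx is what crawlers expect to see
    text = body.lower()
    return any(marker in text for marker in ERROR_MARKERS)

# A proper 404 response is fine; a 200 with error text is the trap.
print(is_soft_error(404, "Not Found"))              # False
print(is_soft_error(200, "Oops! Page not found."))  # True
```

If your host's catch-all error page trips a check like this, ask the admin to return a 404 (or a 503 during maintenance) instead of redirecting to a 200 page.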