I see that Googlebot crawls my website starting at 2 AM every night. Does Google allocate a set amount of crawl time to each website? I would think that faster websites would get more pages crawled than slower ones. What determines the frequency of visits and the number of pages crawled?
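One way to check this for your own site is to tally Googlebot hits per hour from your server access logs. Here's a minimal Python sketch, assuming a combined-format Apache/Nginx log at a placeholder path access.log (the user-agent match alone is spoofable, so verify suspicious IPs with a reverse DNS lookup to *.googlebot.com before trusting the numbers):

```python
import re
from collections import Counter
from datetime import datetime

# Tally Googlebot requests per hour from a combined-format access log.
# "access.log" is a placeholder path; point it at your own server log.
STAMP = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2})')  # e.g. [10/Oct/2006:02

hits = Counter()
with open("access.log") as f:
    for line in f:
        # User-agent check only; confirm with reverse DNS for real audits.
        if "Googlebot" not in line:
            continue
        m = STAMP.search(line)
        if m:
            hour = datetime.strptime(m.group(1), "%d/%b/%Y:%H")
            hits[hour] += 1

for hour, count in sorted(hits.items()):
    print(f"{hour.strftime('%d %b %Y %H:00')}  {count:5d} requests")
```

If the counts really do cluster around 2 AM, that confirms the timing, and comparing totals across days gives you a rough picture of your crawl budget.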
The frequency of the bots visiting a page is based purely on PageRank and content freshness (how often the page is updated).
KC TAN, how did you reach this conclusion? Can you provide a link where I can find similar articles?
Although this has not been officially confirmed by Google, most people agree that the frequency of the bot's visits is based on: 1) PageRank, and 2) content freshness. You can do a search on DP to find many similar threads. In addition, SEO expert Dave Taylor also mentions this point in his book, Growing Your Business with Google.
I would imagine that the rate of accumulation of backlinks also plays into crawl frequency. Of course, some might argue that this also bestows PR, but I'm of the mind that, regardless of PR, if your site picks up a whole bunch of natural links, Gbot is going to come knocking on your door. Content freshness is an ex-post criterion (Google can't tell until after it has visited your site); backlinks are ex-ante (Google knows about them before it has crawled).
I have about 30 backlinks, and it's a forum with fresh content every minute. I have ~8,980 pages in my sitemap and get about 1,400 pages crawled a day. There's no set time it starts and stops; it's on the site pretty much all the time. (Well, not quite: a day has 86,400 seconds, so at roughly a fetch per second it's only busy for about 1/60 of the day.) If you go to the shawnhogansfanclub blog, he has a screenshot of Sitemaps stats showing Google averaging around 50,000 pages a day and peaking at 78,000 or so. So it depends on freshness and size, and definitely on PR. PR, because I have a site with 200 pages, 2 years old, PR5, no updates since forever, and it still gets 12,000 pages crawled a month. Pierce
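For what it's worth, Pierce's numbers are easy to sanity-check. A quick back-of-the-envelope calculation (the one-second-per-fetch figure is an assumption, not anything Google publishes):

```python
# Rough crawl arithmetic from the figures above.
SITEMAP_PAGES = 8980       # pages listed in the sitemap
PAGES_PER_DAY = 1400       # pages Googlebot fetches per day
SECONDS_PER_DAY = 86400
SECONDS_PER_FETCH = 1      # assumed average fetch time; not a published value

coverage_days = SITEMAP_PAGES / PAGES_PER_DAY
busy_fraction = PAGES_PER_DAY * SECONDS_PER_FETCH / SECONDS_PER_DAY

print(f"Full sitemap covered roughly every {coverage_days:.1f} days")
print(f"Googlebot busy for about 1/{1 / busy_fraction:.0f} of the day")
```

By the same math, the PR5 site's 12,000 pages a month works out to about 400 fetches a day, so each of its 200 pages gets re-fetched roughly twice a day despite never being updated.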