Time between Crawl and Cache?

Discussion in 'Google' started by axemedia, Feb 15, 2007.

  1. #1
    Does anyone have a general idea of how long it takes Google to add a recent crawl into the cache?

    Just noticed I got a deep crawl yesterday on one of my sites and waiting to see what's going to be new in the cache.

    "how many pages of my new directory already got indexed?", "did those problematic pages get lifted out of supplemental, yet again?"

    These are the types of questions I pose to the bot Gods (google species).
     
    axemedia, Feb 15, 2007 IP
  2. Anita

    Anita Peon

    #2
    I'd guess this would depend on your PR. I've seen pages from my blog crawled within 5 minutes, and added to the cache within 30 minutes, of posting (I autosend updated sitemaps after each post).
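    That kind of auto-ping can be sketched in a few lines. This is a sketch, not Anita's actual setup: the sitemap URL is a placeholder, and it assumes Google's sitemap ping endpoint (`google.com/ping?sitemap=...`).

```python
# Sketch of auto-pinging Google after each post, assuming the
# google.com/ping sitemap endpoint. The sitemap URL is a placeholder.
from urllib.parse import quote
from urllib.request import urlopen

PING_ENDPOINT = "https://www.google.com/ping?sitemap="

def build_ping_url(sitemap_url):
    """Build the ping URL, percent-encoding the sitemap location."""
    return PING_ENDPOINT + quote(sitemap_url, safe="")

def ping_google(sitemap_url):
    """Send the ping; a 200 response means Google accepted it."""
    with urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status

# After publishing a post, regenerate sitemap.xml, then e.g.:
# ping_google("https://example.com/sitemap.xml")
```

    Hooking `ping_google` into the blog's post-publish step is what makes the "autosend after each post" part work.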

    Anita :)
     
    Anita, Feb 16, 2007 IP
  3. KingofKings

    KingofKings Banned

    #3
    Yeah, with a Google sitemap the crawl happens faster.
     
    KingofKings, Feb 16, 2007 IP
  4. mpls-web-design

    mpls-web-design Well-Known Member

    #4
    Google sends out its spider to find fresh content. So the more fresh content your site has, the more frequently your site will be visited by spiders. Blogs that are updated constantly are crawled very often. Forum sites like Digital Point are crawled even more frequently.

    The higher the PR, the more crawling too. Spiders follow links, and high PR = more backlinks.
     
    mpls-web-design, Feb 21, 2007 IP
  5. stocktrader

    stocktrader Member

    #5
    How do you know if you got a deep crawl, other than seeing Googlebot 200 times in a row in your logs?
     
    stocktrader, Feb 22, 2007 IP
  6. axemedia

    axemedia Guest

    #6
    My server stats in cPanel showed me that Googlebot visited every page on the site, and listed the URLs it visited.
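    For anyone checking this from raw logs instead of cPanel, here's a rough sketch that counts the distinct URLs Googlebot requested. It assumes an Apache combined-format access log; the regex and the simple user-agent substring match are assumptions, not a universal log parser.

```python
import re

# Rough sketch: collect the distinct paths Googlebot requested from an
# Apache combined-format access log. Log format is an assumption.
LOG_RE = re.compile(
    r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "([^"]*)"'
)

def googlebot_urls(log_lines):
    """Return the set of distinct paths requested by Googlebot."""
    urls = set()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group(2):  # group 2 = user-agent
            urls.add(m.group(1))             # group 1 = request path
    return urls
```

    If `len(googlebot_urls(open("access_log")))` is close to your total page count, that's the "visited every page" pattern of a deep crawl.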

    Funny thing is, though, I added an XML sitemap and submitted it to G. Since then I've noticed that pages of my site have slowly been dropping out of the index, a few fewer every few days.

    I have no idea why.

    It started within a few days of submitting the sitemap.

    I was having an issue with my host and was getting a lot of server timeouts for a couple of weeks, off and on. The site was not fully down over that time, just intermittent disruptions, a bit more than the usual you get on a shared hosting plan.

    I notice in Google Webmaster Tools that G did try to crawl a few pages and hit them at a time when the server was not responding, so G shows an error message. Not sure if this is why it's dropping pages, or what.

    I don't know what else to do other than wait for G to sort it out.
     
    axemedia, Feb 23, 2007 IP
  7. my44

    my44 Peon

    #7
    Crawls happen almost every day, at least that's what the stats in Google Sitemaps say. But to me, the cache is a different story altogether. One day you can see your website 100% cached across various DCs with firm PageRank; the next day you see a "Cached=No" status with zero PageRank.
     
    my44, Feb 24, 2007 IP
  8. zoom

    zoom Peon

    #8
    I think your pages should be cached within 48 hours...
     
    zoom, Feb 24, 2007 IP
  9. QuEST

    QuEST Peon

    #9
    I have a similar question. I noticed Googlebot has been crawling my site for 2 days now; the stats show it crawled about 1,000 pages, but that is not reflected in the index. My site is huge, with more than 30,000 pages. Is that why the pages are not yet indexed? Will it wait to finish the site or something? Same goes for MSNbot.

    Since Google found my site, I did not bother to submit the sitemap even though it is ready. Will submitting it make a difference?
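    If you do submit one: a 30,000-page site still fits in a single sitemap file under the sitemaps.org cap of 50,000 URLs per file. A minimal generator sketch (the URLs are placeholders, and the per-file cap comes from the sitemaps.org protocol):

```python
# Minimal sketch of generating a sitemap.xml. URLs are placeholders;
# the 50,000-URL-per-file cap is from the sitemaps.org protocol.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap XML for up to 50,000 URLs."""
    if len(urls) > 50000:
        raise ValueError("split into multiple sitemaps plus a sitemap index")
    entries = "".join(
        f"  <url><loc>{escape(u)}</loc></url>\n" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}</urlset>\n"
    )

# open("sitemap.xml", "w").write(build_sitemap(all_page_urls))
```

    Note the `escape()` call: URLs with `&` in query strings must be entity-encoded or the sitemap is invalid XML.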
     
    QuEST, Feb 25, 2007 IP