Question about Googlebot and getting spidered

Discussion in 'Search Engine Optimization' started by blade69, Jan 12, 2006.

  1. #1
    Ok,

    Most of my sites are getting visited by Googlebot daily, but does anybody know why Googlebot sometimes crawls 300,000 bytes and in other cases just 10,000 bytes?

    PS: Is it useful to use a robots.txt file? I haven't used one till now and I'm still getting spidered.

    Regards and thx in advance.

    Gilles
     
    blade69, Jan 12, 2006 IP
  2. subseo

    #2
    Spidering volume is just a random thing; it varies from visit to visit. No need to care much about it, as long as you get spidered regularly and all the pages get crawled.

    Robots.txt is mostly useful for keeping spiders out of specific pages/URLs; you don't need one just to get spidered.
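    For example, a minimal robots.txt placed at the site root would look like this (the disallowed paths here are just hypothetical examples):

        # Keep all spiders out of a couple of private directories
        User-agent: *
        Disallow: /admin/
        Disallow: /cgi-bin/

    Googlebot requests /robots.txt before crawling; if the file doesn't exist it simply crawls everything, which is why you're getting spidered fine without one.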
     
    subseo, Jan 12, 2006 IP
  3. blade69

    #3
    Are there some good free statistics tools that show which pages are being spidered? My statistics only give me this information for the site as a whole.

    Thx in advance

    Gilles
     
    blade69, Jan 12, 2006 IP
  4. jackburton2006

    #4
    A decent hosting company should give you this for free. If yours doesn't, move.
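    If you at least get access to the raw server logs, you can also pull per-page Googlebot activity out yourself. Here's a rough sketch in Python that counts hits and bytes per URL, assuming a standard "combined" format Apache log (the log path and the 20-result cutoff are placeholders, and matching on the user-agent string is naive since anyone can fake it):

        import re
        from collections import Counter

        LOG_PATH = "access.log"  # adjust to wherever your host keeps the log

        # Request path, response size, and user-agent fields of a combined-format line
        LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+)[^"]*" \d{3} (\S+) "[^"]*" "([^"]*)"')

        hits = Counter()
        total_bytes = 0
        with open(LOG_PATH) as log:
            for line in log:
                m = LINE_RE.search(line)
                if not m:
                    continue
                url, size, agent = m.groups()
                if "Googlebot" in agent:   # naive check; the UA string can be spoofed
                    hits[url] += 1
                    if size.isdigit():     # size field is "-" when no body was sent
                        total_bytes += int(size)

        print("Bytes served to Googlebot:", total_bytes)
        for url, count in hits.most_common(20):
            print(count, url)

    That also answers the byte-count question from the first post: the total is just the sum of the response sizes of whatever pages Googlebot happened to fetch that day.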
     
    jackburton2006, Jan 12, 2006 IP