I really need your help

Discussion in 'robots.txt' started by othonas, Jul 19, 2007.

  1. #1
    Hello, I really need your help. I am running out of bandwidth. Every day, all day long, about 90 Google robots scan my site. The site runs on the Mambo CMS, and I have created this robots.txt:
    User-agent: *
    Disallow: /administrator/
    Disallow: /downloads/
    Disallow: /bridges/
    Disallow: /cache/
    Disallow: /components/
    Disallow: /editor/
    Disallow: /help/
    Disallow: /images/
    Disallow: /includes/
    Disallow: /language/
    Disallow: /mambots/
    Disallow: /media/
    Disallow: /modules/
    Disallow: /templates/
    Disallow: /installation/

    Is there anything I can do to optimize my robots.txt file in order to save some bandwidth?
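    (A minimal sketch of one option, added to the existing User-agent: * group above: a Crawl-delay directive tells well-behaved crawlers how many seconds to wait between requests. Googlebot ignores Crawl-delay and takes its rate from the Google Webmaster Tools setting instead, so this only saves bandwidth from other bots that honor it; the 10-second value is just an example.)

    Crawl-delay: 10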
    Google uses approximately 6.5 GB of bandwidth every month, and my monthly limit is 15 GB.
    I have searched a lot in this forum and other forums but found no solution at all.
    I appreciate any help.
    Thank you
     
    othonas, Jul 19, 2007 IP
  2. andriusk

    andriusk Well-Known Member

    Messages:
    367
    Likes Received:
    7
    Best Answers:
    0
    Trophy Points:
    140
    #2
    Set the Googlebot crawl rate to Slower in the Google Webmaster Tools panel; that should cut the bandwidth it uses. Keep in mind the setting only stays in effect for 90 days.
    andriusk, Jul 20, 2007 IP
  3. othonas

    othonas Peon

    Messages:
    75
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Thanks. I am not so familiar with that kind of stuff.
    I'll try it.
    Thanks again
     
    othonas, Jul 20, 2007 IP
  4. othonas

    othonas Peon

    Messages:
    75
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #4
    If I set the Google crawl rate to slow, can I change it back to normal on the 1st of next month, or do I have to wait out the 90-day period?
    Please explain.
    Thanks
     
    othonas, Jul 20, 2007 IP
  5. nddb

    nddb Peon

    Messages:
    803
    Likes Received:
    30
    Best Answers:
    0
    Trophy Points:
    0
    #5
    I would make absolutely, 100% sure these are REAL Google IPs and not fake ones. Google generally doesn't crawl that fast; the fastest I have seen it crawl is about 5-10 pages per second. If your site cannot handle that, look to upgrade or tweak.

    I think it is likely a spammer, a content scraper, or some rogue bot pretending to be Google, unless you have carefully resolved the IPs. Disallowing everything is counterproductive, since you want to get indexed. Bad bots just ignore robots.txt anyway, but there are many ways to stop them.
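    (A minimal sketch of such a check, assuming Python is available on the server: a forward-confirmed reverse DNS lookup. A genuine Googlebot IP reverse-resolves to a googlebot.com or google.com hostname, and that hostname resolves back to the same IP. The sample IP at the bottom is just a hypothetical value pulled from an access log.)

    import socket

    def is_real_googlebot(ip):
        # Reverse lookup: IP -> hostname
        try:
            host = socket.gethostbyaddr(ip)[0]
        except socket.herror:
            return False
        # Genuine Googlebot hostnames end in googlebot.com or google.com
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP
        try:
            return ip in socket.gethostbyname_ex(host)[2]
        except socket.gaierror:
            return False

    # Example with a hypothetical IP from an access log:
    print(is_real_googlebot("66.249.66.1"))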

    Hope that helps.
     
    nddb, Jul 21, 2007 IP
  6. corrine88

    corrine88 Peon

    Messages:
    54
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #6
    Thanks. I'll try it.
     
    corrine88, Jul 21, 2007 IP
  8. trichnosis

    trichnosis Prominent Member

    Messages:
    13,785
    Likes Received:
    333
    Best Answers:
    0
    Trophy Points:
    300
    #8
    Adjusting the Google crawl speed in the Google Webmaster Tools panel is a better option.
     
    trichnosis, Aug 6, 2007 IP