save bandwidth

Discussion in 'Site & Server Administration' started by gree124, May 18, 2008.

  1. #1
    How do I save my site's bandwidth?
     
    gree124, May 18, 2008 IP
  2. relixx

    relixx Active Member

    Messages:
    946
    Likes Received:
    54
    Best Answers:
    0
    Trophy Points:
    70
    #2
    Off the top of my head:

    1) Prevent hotlinking (you can block this in .htaccess files; see the sketch below)
    2) Enable compression for your scripting language (PHP, etc.)
    3) Use smaller images, or try to limit image use in the layout, etc.
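
    For point 1, a rough .htaccess sketch (this assumes Apache with mod_rewrite enabled; example.com is a placeholder for your own domain):

    RewriteEngine On
    # Only block requests that actually send a referer...
    RewriteCond %{HTTP_REFERER} !^$
    # ...and whose referer isn't your own domain
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    # Everything else asking for an image gets a 403 Forbidden
    RewriteRule \.(gif|jpe?g|png)$ - [F,NC]

    For point 2, if PHP runs as an Apache module you can usually switch compression on in the same file:

    php_flag zlib.output_compression On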
     
    relixx, May 19, 2008 IP
  3. kash2k8

    kash2k8 Peon

    Messages:
    374
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Host your images on other sites and then add them to your site, or if you host videos, try uploading them to YouTube or MegaVideo etc. and embedding them.
     
    kash2k8, May 19, 2008 IP
  4. manish.chauhan

    manish.chauhan Well-Known Member

    Messages:
    1,682
    Likes Received:
    35
    Best Answers:
    0
    Trophy Points:
    110
    #4
    4) Block unnecessary or spammy robots with robots.txt.

    You can find a list of spammy robots here...:)
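
    A quick robots.txt sketch (BadBot is just a placeholder; substitute the user-agent names from whatever spam-bot list you use):

    # Turn a known spammy bot away from the whole site
    User-agent: BadBot
    Disallow: /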
     
    manish.chauhan, May 19, 2008 IP
  5. relixx

    relixx Active Member

    Messages:
    946
    Likes Received:
    54
    Best Answers:
    0
    Trophy Points:
    70
    #5
    manish.chauhan, that's a good one too :) Thanks for reminding me :)
     
    relixx, May 19, 2008 IP
  6. Dreamerr

    Dreamerr Peon

    Messages:
    914
    Likes Received:
    20
    Best Answers:
    0
    Trophy Points:
    0
    #6
    Sometimes good robots are also a problem. One of my dynamic sites was getting hit really hard by Google, so I used Google Webmaster Tools to slow down the crawl rate.
     
    Dreamerr, May 20, 2008 IP
  7. manish.chauhan

    manish.chauhan Well-Known Member

    Messages:
    1,682
    Likes Received:
    35
    Best Answers:
    0
    Trophy Points:
    110
    #7
    OK, for Google you can do it from Google Webmaster Tools.

    But for other bots, you can set a crawl rate in your robots.txt like:

    User-agent: *
    Crawl-delay: 60

    It'll slow down the speed at which the bots can index your site; this forces the bots to pause a minute between every fetch...:)
     
    manish.chauhan, May 21, 2008 IP
    relixx likes this.
  8. cmanns

    cmanns Peon

    Messages:
    62
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #8
    You can gzip data on the fly, cache data with expires headers, and put CSS in its own files to cut down on HTML output, etc.
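
    A sketch of what that can look like in .htaccess, assuming Apache with mod_deflate and mod_expires enabled:

    # Compress text output on the fly
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
    # Let browsers cache static files instead of re-fetching them
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"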
     
    cmanns, May 21, 2008 IP
  9. soufulow

    soufulow Well-Known Member

    Messages:
    128
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    118
    #9
    Group all your images together and ban all bots from crawling the entire group of folders. Do the same with any files you don't want indexed by search engines.
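
    A minimal robots.txt along those lines (assuming the images sit under /images/ and /uploads/; adjust the paths to your own folder layout):

    User-agent: *
    Disallow: /images/
    Disallow: /uploads/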
     
    soufulow, May 25, 2008 IP
  10. relixx

    relixx Active Member

    Messages:
    946
    Likes Received:
    54
    Best Answers:
    0
    Trophy Points:
    70
    #10
    The biggest problem with the robots.txt file is that a robot can ignore it if it wants to... So if it's doing that, it'd be best to ban the user agent in .htaccess or the site's .conf file.
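
    A sketch of that ban in .htaccess (assumes Apache; BadBot is a placeholder for the offending user agent):

    # Flag requests whose User-Agent contains "BadBot", then refuse them
    SetEnvIfNoCase User-Agent "BadBot" bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot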
     
    relixx, May 26, 2008 IP