Server getting overloaded..

Discussion in 'Site & Server Administration' started by Proximity, May 23, 2008.

  1. #1
    Hey

    I own a file-hosting site and have had large files (300-700 MB) uploaded to the server. The server runs fine, but sometimes it gets really slow and the site cannot be opened. I guess it's because of the number of people downloading?

    I heard that download accelerators can really harm the server, as they open a lot of new connections to the server.

    Well, I am looking for some method to offer large files and still keep the site online at all times.

    I own my own dedicated server, with an average CPU and 1 GB of RAM.

    Any suggestions?

    Thanks.
     
    Proximity, May 23, 2008 IP
  2. rootbinbash

    rootbinbash Peon

    Messages:
    2,198
    Likes Received:
    88
    Best Answers:
    0
    Trophy Points:
    0
    #2
    You should think about a server cluster.
     
    rootbinbash, May 23, 2008 IP
  3. xous

    xous Active Member

    Messages:
    173
    Likes Received:
    11
    Best Answers:
    0
    Trophy Points:
    60
    #3
    Hi,

    The first thing you should do, before going out and buying anything, is to take some proper metrics and determine where the problem actually is.

    I would suggest you install something like MRTG (http://www.mrtg.com/) to monitor server load, disk usage, etc.

    I also suggest that you install some software to do some statistical analysis on your web logs (you do keep logs, right?) to determine what people are hitting the most.

    Does your file hosting script use MySQL? If so, you should check that as well.
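    As a rough sketch of the log-analysis idea (assuming a standard Apache combined access log; the path below is just a guess, adjust it to your setup), something like this will show you which files people are hitting the most:

    ```shell
    # Hypothetical log location -- change to match your server.
    LOG=/var/log/apache2/access.log

    # Field 7 of the combined log format is the requested URL.
    # Count requests per URL and show the top 10.
    awk '{print $7}' "$LOG" | sort | uniq -c | sort -rn | head -10
    ```

    If the top entries are all huge files, that tells you the slowdowns are download traffic rather than, say, a misbehaving script.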
     
    xous, May 23, 2008 IP
  4. hyjlmc

    hyjlmc Peon

    Messages:
    363
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #4
    You can find a script that limits download accelerators. I'm sorry, but I've forgotten the exact name of the script.
     
    hyjlmc, May 27, 2008 IP
  5. mellow-h

    mellow-h Peon

    Messages:
    750
    Likes Received:
    14
    Best Answers:
    0
    Trophy Points:
    0
    #5
    Are you using the Apache web server? Try the following Apache settings for prefork:

    StartServers 20
    MinSpareServers 20
    MaxSpareServers 150
    MaxClients 256
    MaxRequestsPerChild 500

    And make sure you set your KeepAliveTimeout to 1, or simply set KeepAlive to Off. Since you are serving big files, it would be better to set it to Off.
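    As a sketch of what that change looks like in the Apache configuration (the file location varies by distribution, e.g. httpd.conf or apache2.conf):

    ```apache
    # Close each connection after a single request -- sensible when
    # most requests are long-running large-file downloads.
    KeepAlive Off

    # Alternatively, keep it on but drop idle connections almost immediately:
    # KeepAlive On
    # KeepAliveTimeout 1
    ```

    The trade-off: KeepAlive Off frees worker slots quickly between requests, at the cost of a new TCP connection per request, which matters little when each request is a multi-minute download anyway.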

    This configuration will help Apache push large files through more smoothly, though it will consume more resources. I also recommend increasing the memory if you are handling over 300 requests per second. Try counting your connections with the following commands:

    netstat -plan|grep :80|wc -l
    Code (markup):
    netstat -plan|grep :80|grep TIME_WAIT|wc -l
    Code (markup):
    Hope you will do the best in your business ;)
     
    mellow-h, May 30, 2008 IP
  6. jeromespitfire

    jeromespitfire Member

    Messages:
    69
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    41
    #6
    Increase MaxClients to something like 700.
     
    jeromespitfire, May 31, 2008 IP
  7. Orien

    Orien Active Member

    Messages:
    593
    Likes Received:
    12
    Best Answers:
    0
    Trophy Points:
    60
    #7
    That would just cripple the server even further. Don't give advice if you don't know what you're doing.
     
    Orien, May 31, 2008 IP