Command for finding directory using most space on server

Discussion in 'Site & Server Administration' started by dfsweb, May 1, 2007.

  1. #1
    Hi there,
    I have a virtual dedicated server with Godaddy. I don't have Plesk control as I just use Simple control panel. I do have root access though and was wondering if there is a simple command that I can run to view a summary of disk space usage for all my directories?

    I am currently using around 80% of allocated server space and want to find out which websites are using the most space ... And just wondering if there is a simple command that I can run to find this out instead of individually running ls commands for all directories ... Also, I don't know whether the majority of space is used by my files or databases or by mail queues etc.

    So in short, I would like to log into my server (switch to root access) and run a command on the root directory to give me a summary of disk usage for all my folders. Is there such a command?
    dfsweb
     
    dfsweb, May 1, 2007 IP
  2. Mr_2

    Mr_2 Peon

    #2
    it should be

    du -sh
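    As a throwaway sketch (all directory names below are invented), `du -sh` prints one summarized, human-readable total for the current directory, while pointing it at each subdirectory gives the per-folder breakdown the original poster is after:

    ```shell
    # Demo of du's summary flags in a scratch directory; paths are made up.
    tmp=$(mktemp -d)
    mkdir -p "$tmp/site1" "$tmp/site2"
    dd if=/dev/zero of="$tmp/site1/big.log"   bs=1024 count=64 2>/dev/null
    dd if=/dev/zero of="$tmp/site2/small.log" bs=1024 count=4  2>/dev/null
    cd "$tmp"
    du -sh        # -s: one summarized total for ".", -h: human-readable
    du -sh -- */  # one line per immediate subdirectory instead
    cd / && rm -rf "$tmp"
    ```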
     
    Mr_2, May 1, 2007 IP
  3. dfsweb

    dfsweb Active Member

    #3
    Hi there,
    Thanks for the reply. I tried this command but it only returns one value ... in my case 9.1GB. I would like output like this:
    Folder 1 - 20MB
    Folder 2 - 30MB
    Folder 3 - 8.9GB

    So that I know which one is the main folder ... then I can drill down to that folder and run the same command again and so on until I find the culprit directory that's using up all my disk space.

    Regards,
    dfsweb
     
    dfsweb, May 2, 2007 IP
  4. rootbinbash

    rootbinbash Peon

    #4
    -S

    for more
    # man du
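    For what it's worth, GNU du's `-S` (`--separate-dirs`) reports each directory's own files without counting its subdirectories, which makes the directory that actually owns the big files stand out. A small sketch (paths invented):

    ```shell
    # GNU du -S (--separate-dirs): a directory's size excludes its children,
    # so a directory that merely *contains* a bloated subdirectory stays small.
    tmp=$(mktemp -d)
    mkdir -p "$tmp/parent/child"
    dd if=/dev/zero of="$tmp/parent/child/huge.log" bs=1024 count=128 2>/dev/null
    du -k  "$tmp"   # "parent" looks large because it contains "child"
    du -kS "$tmp"   # "parent" itself is tiny; "child" carries the 128K
    rm -rf "$tmp"
    ```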
     
    rootbinbash, May 2, 2007 IP
  5. matrafox

    matrafox Active Member

    #5
    try in the current directory to run
    du -h --max-depth=1
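    This one produces exactly the per-folder listing asked for above: one line per immediate subdirectory plus a grand total. A sketch with invented directory names, including a kilobyte variant that sorts the biggest folder to the top:

    ```shell
    # --max-depth=1: report only the first level of subdirectories plus ".".
    tmp=$(mktemp -d)
    mkdir -p "$tmp/www" "$tmp/mail" "$tmp/db"
    dd if=/dev/zero of="$tmp/www/site.log" bs=1024 count=96 2>/dev/null
    cd "$tmp"
    du -h --max-depth=1            # one human-readable line per subdirectory
    du -k --max-depth=1 | sort -rn # kilobyte counts, biggest first
    cd / && rm -rf "$tmp"
    ```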
     
    matrafox, May 3, 2007 IP
  6. LittlBUGer

    LittlBUGer Peon

    #6
    I usually like to use this:

    du -cks * | sort -rn | head -11
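    Broken down piece by piece (the demo directory names are made up): `-s` summarizes each argument, `-k` reports kilobytes so the numeric sort works, and `-c` appends a grand total; `sort -rn` puts the biggest first, and `head -11` keeps the total line plus the ten largest entries:

    ```shell
    # Top-ten pipeline:
    #   du -cks *  -> per-entry kilobyte totals (-s, -k) plus a "total" line (-c)
    #   sort -rn   -> numeric sort, largest first
    #   head -11   -> the total plus the ten biggest entries
    tmp=$(mktemp -d)
    mkdir -p "$tmp/logs" "$tmp/html"
    dd if=/dev/zero of="$tmp/logs/error_log" bs=1024 count=200 2>/dev/null
    cd "$tmp"
    du -cks * | sort -rn | head -11
    cd / && rm -rf "$tmp"
    ```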

    :)
     
    LittlBUGer, May 3, 2007 IP
  7. dfsweb

    dfsweb Active Member

    #7
    I used the du -Sh command and eventually found the culprit ... It is an error log file for one of my websites which is a whopping 7.3GB!!!
     
    dfsweb, May 3, 2007 IP
  8. eddy2099

    eddy2099 Peon

    #8
    Do read at least the last 50 lines of the error_log file and fix the issues in the scripts, or else the file will just grow back and the problem will persist.
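    A sketch of that cleanup (the log path below is an assumption; substitute whatever file du actually reported):

    ```shell
    # /var/www/vhosts/example.com/logs/error_log is an assumed path --
    # point LOG at the file du identified.
    LOG=/var/www/vhosts/example.com/logs/error_log
    tail -n 50 "$LOG"   # read the recent errors before clearing the file
    : > "$LOG"          # truncate to zero bytes without deleting the file,
                        # so the web server's open file handle keeps working
    ```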
     
    eddy2099, May 3, 2007 IP