How to archive 135GB / 9,000 files and delete each as it's added to tar

Discussion in 'Site & Server Administration' started by shacow, May 21, 2008.

  1. #1
    Hi DP members, I'm trying to make an archive of some of my files to move them off my server. There's about 140GB in the directory, with over 9,000 files inside. I'd like to know if there's any way to add the files to an archive and, once each one is added, delete the source file.

    I can't simply add these to an archive, as I only have a 250GB hard drive and I've already used about 180GB.

    So I need some way of adding the files to a tar (or another archive format, whichever has the best compression) and then deleting each source file before moving on to the next one.

    Would this be possible with some kind of bash script?

    All my files are named 1-5000.txt, and then I have an img directory with the same layout of files, 1-5000.jpg.


    Thank you all for your help.
    Shaun
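    A rough sketch of the add-then-delete loop described above, assuming GNU tar and the 1-5000.txt / img/1-5000.jpg layout (the archive name is just a placeholder). Each source file is only deleted after tar reports it was appended successfully:

    ```shell
    #!/bin/sh
    # Append each file to the archive one at a time, then delete
    # the source only if tar succeeded, so disk space is freed
    # gradually instead of needing room for a full second copy.
    ARCHIVE=backup.tar
    for f in *.txt img/*.jpg; do
        [ -e "$f" ] || continue          # skip unmatched globs
        if tar -rf "$ARCHIVE" "$f"; then
            rm -f "$f"                   # free space before the next file
        else
            echo "failed to add $f, keeping it" >&2
        fi
    done
    ```

    Note this trades safety for space: once a source file is removed, the tar archive is the only copy, so it's worth spot-checking the archive with `tar -tf backup.tar` before deleting anything important.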
     
    shacow, May 21, 2008 IP
  2. royo

    royo Peon

    Messages:
    173
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #2
    It would be possible with a bash script; just get someone who knows how to write one for you.
     
    royo, May 21, 2008 IP
  3. shacow

    shacow Active Member

    Messages:
    339
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    60
    #3
    I just hope someone can help me with this, because I haven't got long to get the files off my server :(

    I haven't a clue how to write bash scripts :(
     
    shacow, May 21, 2008 IP
  4. olddocks

    olddocks Notable Member

    Messages:
    3,275
    Likes Received:
    165
    Best Answers:
    0
    Trophy Points:
    215
    #4
    Use rsync or scp to copy to a remote server. You can also pack those files into one tar.gz.
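    For reference, a typical rsync-over-SSH invocation for that kind of copy (the hostname and paths here are placeholders, not from the thread). The `--remove-source-files` flag deletes each source file once it has transferred cleanly, which addresses the disk-space problem without building an archive first:

    ```shell
    # Mirror the directory to another machine over SSH, keeping
    # partial transfers for resume and deleting each source file
    # only after it has been copied successfully.
    rsync -av --partial --progress --remove-source-files \
        /path/to/files/ user@remote.example.com:/path/to/backup/
    ```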
     
    olddocks, May 21, 2008 IP
  5. shacow

    shacow Active Member

    Messages:
    339
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    60
    #5
    I can't make them into one file without deleting the sources as they get added, or I'll run out of disk space.

    And I want to put them into one archive before I move them, as I'll have to use a download manager on a 140GB file :p I won't be transferring them from server to server, as I need to do some editing to the contents before they go live again.

    thanks
     
    shacow, May 21, 2008 IP
  6. jayshah

    jayshah Peon

    Messages:
    1,126
    Likes Received:
    68
    Best Answers:
    1
    Trophy Points:
    0
    #6
    Hey ;)

    Something like this:

    tar -cf archive.tar folder --remove-files

    archive.tar is the tar file you're producing; folder is the folder you're archiving. The --remove-files option (GNU tar) deletes each file after it has been added to the archive.

    Remember, use with caution, but Rep would be appreciated ;)

    Jay
     
    jayshah, May 21, 2008 IP
  7. Whippet75

    Whippet75 Well-Known Member

    Messages:
    1,599
    Likes Received:
    23
    Best Answers:
    0
    Trophy Points:
    155
    #7
    That is a lot of data... I sure hope you aren't downloading the archived file over the net... ideally you want a gigabit LAN for a transfer like that.
     
    Whippet75, May 21, 2008 IP
  8. shacow

    shacow Active Member

    Messages:
    339
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    60
    #8
    I will be timing the download and restricting its speed during peak times.
     
    shacow, May 21, 2008 IP