Shell Command to remove older files

Discussion in 'Site & Server Administration' started by xs-admin, Jul 8, 2012.

  1. #1
    Hello Guys,

    I am making backups of a client's website to a remote FTP location. I have a script (usable without root access on cPanel) which creates backups on a given cron schedule and transfers them to the remote FTP location. Now the real problem starts: since we can't have unlimited gigabytes of disk space on any server, we have to limit the backups. I was looking for a shell command (which can be added to a cron job directly, or by creating a bash script and calling that script from cron). I want to keep one week's worth of daily backups and delete any backup in that directory which is older than one week. I found the following command, which looks promising:

    find /path/to/files -mtime +30 -exec rm  {}\;
    Code (markup):
    But when I ran this command (for testing I replaced 'rm' with 'ls -l'), I got the following error:

    find: missing argument to `-exec'
    Code (markup):
    Can anybody help resolve this little issue?

    I am running CentOS + cPanel



    Thank You
     
    xs-admin, Jul 8, 2012 IP
  2. #2
    try using:

    find /path/to/files -mtime +30 -exec ls -l '{}' \;
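    If that lists the right files, the same pattern works for the actual cleanup. A rough sketch matching the one-week retention from the first post (note -mtime +7 rather than the +30 in the command you found; /path/to/files stands in for the real backup directory):

    # delete backups older than 7 days (run only after verifying with the ls -l version above)
    find /path/to/files -mtime +7 -exec rm -f '{}' \;
    
    # example crontab entry to run the cleanup daily at 03:00
    0 3 * * * find /path/to/files -mtime +7 -exec rm -f '{}' \;
    Code (markup):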
     
    gregBS, Jul 8, 2012 IP
  3. #3
    Make sure that there is a space between the curly brackets "{}" and "\;".
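    In other words, the original command fails because find's -exec needs the escaped semicolon to be its own argument, so the terminator is never seen when it is glued to "{}". For example:

    find /path/to/files -mtime +30 -exec rm '{}' \;
    Code (markup):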
     
    Server Management, Jul 8, 2012 IP
  4. #4
    You can use the -delete option for 'find' instead of -exec. Just make sure everything is working like it should by using -print first.
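    A minimal sketch of that approach, assuming the same /path/to/files backup directory and the one-week retention from the first post:

    # dry run: print what would be removed
    find /path/to/files -mtime +7 -print
    
    # same selection, but actually remove the matches (-delete is supported by GNU find on CentOS)
    find /path/to/files -mtime +7 -delete
    Code (markup):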
     
    aqualabs, Jul 8, 2012 IP