
perl deleting

Options
  • 03-06-2006 12:55pm
    #1
    Closed Accounts Posts: 839 ✭✭✭


    Can someone help me out or point me in the right direction.

    I want to delete the 14 oldest files from a folder containing 100 files, so cron will be set up to call this script once a week.

    All the files are named day-month-year-a.zip, and there are two files from each day, e.g. 01-01-01-a.zip and 01-01-01-b.zip, so basically it's removing the files from the 7 oldest days!


Comments

  • Registered Users Posts: 6,508 ✭✭✭daymobrew


    In perl look at the 'stat' function.
    Maybe create a hash where the filenames are the keys (as they will be unique) and the modification date is the value (retrieved from the 'stat' call). Use a custom 'sort' routine to sort by hash values and delete ('unlink' function) the appropriate 14 files.
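
    The hash-and-sort approach above could be sketched like this. The directory path and the `$n_oldest` count are placeholders, not anything from the thread:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    my $dir      = '/path/to/archive';   # hypothetical path -- adjust
    my $n_oldest = 14;                   # how many oldest files to remove

    opendir(my $dh, $dir) or die "Cannot open $dir: $!";
    my @zips = grep { /\.zip$/ } readdir($dh);
    closedir($dh);

    # filename => modification time; mtime is element 9 of stat()'s list.
    my %mtime = map { $_ => (stat("$dir/$_"))[9] } @zips;

    # Sort filenames oldest-first by mtime and take the first 14.
    my @oldest = (sort { $mtime{$a} <=> $mtime{$b} } keys %mtime)[0 .. $n_oldest - 1];

    unlink map { "$dir/$_" } @oldest;
    ```

    Sorting by mtime rather than parsing the day-month-year filenames avoids any ambiguity over two-digit years, at the cost of trusting the filesystem timestamps.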

    It might be easier with a shell script.
    ls -ltr = lists files in long format, in reverse date order, with oldest first.
    head -14 = returns the first 14 lines.
    awk '{print $9}' = returns the 9th field on each line (the filename in 'ls -l' output).
    Combine these three inside a 'for' loop and 'rm -f' them.
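
    A minimal sketch of that pipeline as a function. Note one caveat with the exact recipe above: `ls -l` prepends a `total N` line, so `head -14` would only catch 13 files; using plain `ls -tr` (no `-l`) sidesteps both that and the need for awk. This assumes no spaces in the filenames, which holds for the dd-mm-yy-x.zip naming scheme:

    ```shell
    # delete_oldest DIR N -- remove the N oldest files in DIR.
    # Relies on 'ls -tr' (sort by mtime, oldest first); the subshell
    # keeps the 'cd' from affecting the caller.
    delete_oldest() {
        dir=$1; n=$2
        ( cd "$dir" || exit 1
          for f in $(ls -tr | head -"$n"); do
              rm -f -- "$f"
          done )
    }
    ```

    Called from cron as, say, `delete_oldest /path/to/archive 14` (path hypothetical).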


  • Closed Accounts Posts: 2,046 ✭✭✭democrates


    Deleting makes me edgy. I presume there are backups in case the program crashes, or the server clock is altered or some such madness?

    If wrong deletions would be troublesome, you might consider renaming the files first and then deleting them the following week or whenever. That way, if the wrong ones are renamed, you have a chance to recover them when the users come a-shrieking.
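
    That two-stage idea might look like the sketch below: each weekly run first purges whatever was staged last week, then renames this week's candidates instead of deleting them. The function name and `.trash` suffix are illustrative, not from the thread, and filenames are assumed to contain no spaces:

    ```shell
    # sweep DIR N -- delete last week's staged files, then stage
    # the N oldest remaining .zip files by renaming them.
    sweep() {
        dir=$1; n=$2
        ( cd "$dir" || exit 1
          rm -f -- *.trash 2>/dev/null       # purge last week's staged files
          for f in $(ls -tr -- *.zip 2>/dev/null | head -"$n"); do
              mv -- "$f" "$f.trash"          # stage, don't delete yet
          done )
    }
    ```

    A mis-staged file can then be recovered any time before the next cron run simply by renaming it back.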

