LinuxQuestions.org

nocnoc 11-29-2006 11:42 AM

script to auto delete files older than X days
 
I'm looking for a script that will delete all files older than X days from every directory under /home, including subdirectories, leaving the directories themselves untouched.

For starters I've been toying with this script that I got from a friend.

#!/bin/bash

# wipe the contents of each listed directory (\rm bypasses any rm alias)
for dirs in A B C X Y Z ; do
    \rm -rf "$dirs"/* && sleep 1
    echo "removing the contents from => $dirs"
done

I also had a modified version with -mtime in it, but that one pretty much deleted everything, including the subdirectories.

I have a list of about 136 folders I want to purge on a weekly basis and would like to run a script against the /home directory.

Thanks

matthewg42 11-29-2006 12:06 PM

The find command is probably a better approach, combined with the -mtime option. It will output a list of files which you can then pass to "xargs rm -f" to delete them. Consider using the -print0 option to find and the -0 option to xargs. I'm not going to post code, because I suspect this is homework.

nocnoc 11-29-2006 12:35 PM

Thanks for the reply. I'll try the find with xargs.

Quote:

Originally Posted by matthewg42
I'm not going to post code, because I suspect this is homework.

LOL that was funny. Check the link in my sig. You'll probably think otherwise.

matthewg42 11-29-2006 12:53 PM

I don't know why I'm being so up-tight really. There are examples of find and xargs all over the forums anyhow. Too much coffee probably. :o

nocnoc 11-29-2006 02:56 PM

No worries ;)

chrism01 11-29-2006 09:57 PM

You can use the -type f option of 'find' to only action files, not dirs etc.
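For example, a dry run that just lists what would match, before deleting anything (the +7 cutoff is only an example):

Code:

find /home -type f -mtime +7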

igorc 11-30-2006 04:59 AM

# find /dir_name -type f -mtime +7 -exec rm -f {} \;

to delete files older than 7 days.

matthewg42 11-30-2006 05:15 AM

Passing the list to xargs and using rm that way is much faster for large file lists. Chrism01's solution spawns one rm process per file. Consider a case where 1000 files need deleting, or 100000... that's a lot of processes.
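Something like this, say (a sketch; -print0 and -0 assume GNU find and xargs, and /dir_name is just the directory from the post above):

Code:

find /dir_name -type f -mtime +7 -print0 | xargs -0 rm -f

-print0/-0 keep filenames containing spaces or newlines from being mangled, and xargs packs as many names as fit onto each rm invocation.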

jschiwal 11-30-2006 05:24 AM

chrism01's post only dealt with adding an option to the find command. So you can still use -print0 with find and -0 with xargs.

I have a one-line script that extracts the file information from a .k3b file and then removes the files that I had backed up. The first part uses sed to filter out the xml tags and convert "&amp;" -> "&", etc., to handle reserved xml characters. I then pipe this through "tr '\n' '\000'", which does the same thing as find's -print0 option. Then this is piped to the same xargs rm command as I would have used with find -print0.
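Roughly along these lines (a sketch only; the maindata.xml filename and the <url> tag are illustrative guesses, not the actual .k3b layout):

Code:

# pull the file paths out of the xml, decode the reserved characters,
# NUL-terminate each name, then hand the lot to rm via xargs
sed -n 's|.*<url>\(.*\)</url>.*|\1|p' maindata.xml \
    | sed 's/&amp;/\&/g; s/&lt;/</g; s/&gt;/>/g' \
    | tr '\n' '\000' \
    | xargs -0 rm -f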

nocnoc 12-01-2006 07:19 AM

Thanks guys, I will try these. We have an FTP site that gets hammered hard and our users NEVER clean it out. Even though we stress that the FTP site is strictly for file transfer and not storage, they still try to use it as such. So I really want to automate the process of cleaning it out, and let my userbase know about it.
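The plan is probably a weekly cron job along these lines (a sketch; the schedule, path and 7-day cutoff are illustrative):

Code:

# crontab entry: every Sunday at 03:00, purge files older than 7 days
0 3 * * 0  find /home -type f -mtime +7 -print0 | xargs -0 rm -f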

Thanks
Noc

matthewg42 12-01-2006 07:29 AM

nocnoc: The first time you do it, maybe it's best to back up their files instead of nuking them. There's no wrath like that of the user whose files are nuked because they didn't do what they were told. If they /really/ need them back, don't give them what they want too quickly or easily... let 'em sweat first :)
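For the backup itself, something like this could work (a sketch; it assumes GNU tar, and the paths are illustrative):

Code:

# archive everything older than 7 days, then delete only what was archived
find /home -type f -mtime +7 -print0 \
    | tar --null --files-from=- --remove-files -czf /backup/ftp-$(date +%F).tar.gz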

chrism01 12-03-2006 05:03 PM

concur with backup & let 'em sweat.
Also, get a manager to endorse an email notice that nuking will occur. ... you're gonna need it ;)

nocnoc 12-04-2006 02:08 PM

We definitely back up the files first, but we don't let them know that. We also keep the previous week's data. We have sent out emails in the past about our FTP site being for transfers only: it isn't backed up and there is no redundancy on any of the hardware. Like all IT-related emails, though, they get ignored until something happens and you ask "did you read the email that was sent out?" The answer is usually no :scratch:

bigearsbilly 12-05-2006 08:11 AM

Quote:

Originally Posted by matthewg42
Passing the list to xargs and using rm that way is much faster for large file lists. Chrism01's solution spawns one rm process per file. Consider a case where 1000 files need deleting, or 100000... that's a lot of processes.

FYI, I did a little test with 4000+ files (I have work, but can't do it).

data:
find PRT-reports -type f | wc -l
4093

Code:

time find PRT-reports -type f -exec /usr/bin/rm {} \;

real    1m32.19s
user    0m13.76s
sys    0m28.04s

time find XARGS  -type f | xargs /usr/bin/rm       

real    1m35.41s
user    0m0.46s
sys    0m1.95s


time find xargs_small  -type f | xargs -n29  /usr/bin/rm

real    1m14.10s
user    0m0.78s
sys    0m2.36s

Conclusion: it doesn't matter that much (the real time is disk-bound either way, though the CPU time drops from ~42s to under 3s with xargs).

chrism01 12-05-2006 04:44 PM

Exactly. All -type f does is tell find (which runs first) to only action regular files, i.e. ignore dirs, which is what the OP asked for.

