LinuxQuestions.org
Old 11-29-2006, 11:42 AM   #1
nocnoc
LQ Newbie
 
Registered: Nov 2006
Posts: 8

Rep: Reputation: 0
script to auto delete files older than X days


I'm looking for a script that will delete all files older than X days from every directory under /home, including subdirectories, while leaving the directories themselves untouched.

For starters I've been toying with this script that I got from a friend.

#!/bin/bash

for dir in A B C X Y Z ; do
    echo "removing the contents from => $dir"
    rm -rf "$dir"/* && sleep 1
done

I also had a modified version that used -mtime, but that one pretty much deleted everything, including the subdirectories.

I have a list of about 136 folders I want to purge on a weekly basis and would like to run a script against the /home directory.

Thanks
 
Old 11-29-2006, 12:06 PM   #2
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 63
The find command is probably a better approach, combined with the -mtime option. It will output a list of files which you can then pass to "xargs rm -f" to delete them. Consider using the -print0 option to find and the -0 option to xargs, so filenames containing whitespace are handled safely. I'm not going to post code, because I suspect this is homework.
 
Old 11-29-2006, 12:35 PM   #3
nocnoc
LQ Newbie
 
Registered: Nov 2006
Posts: 8

Original Poster
Rep: Reputation: 0
Thanks for the reply. I'll try the find with xargs.

Quote:
Originally Posted by matthewg42
I'm not going to post code, because I suspect this is homework.
LOL that was funny. Check the link in my sig. You'll probably think otherwise.
 
Old 11-29-2006, 12:53 PM   #4
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 63
I don't know why I'm being so up-tight really. There are examples of find and xargs all over the forums anyhow. Too much coffee probably.
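For reference, such a find/xargs invocation typically looks something like the sketch below. This is illustrative only: the function name, the /home example, and the 7-day default are placeholders, and the -r flag is a GNU xargs extension.

```shell
#!/bin/bash
# Sketch only: names and defaults here are placeholders, not the OP's real values.
purge_old_files() {
    local target=$1      # tree to purge, e.g. /home
    local days=${2:-7}   # delete regular files not modified in this many days
    # -type f skips directories; -print0 / -0 keep whitespace-laden names safe;
    # -r (GNU xargs) skips running rm entirely when find matches nothing.
    find "$target" -type f -mtime +"$days" -print0 | xargs -0 -r rm -f
}
```

Called as, say, `purge_old_files /home 7` from a weekly cron job.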
 
Old 11-29-2006, 02:56 PM   #5
nocnoc
LQ Newbie
 
Registered: Nov 2006
Posts: 8

Original Poster
Rep: Reputation: 0
No worries
 
Old 11-29-2006, 09:57 PM   #6
chrism01
Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.5, Centos 5.10
Posts: 16,269

Rep: Reputation: 2028
You can use the -type f option of 'find' to act only on regular files, not directories etc.
 
Old 11-30-2006, 04:59 AM   #7
igorc
Member
 
Registered: May 2005
Location: Sydney, Australia
Distribution: Ubuntu 5.04, Debian 3.1
Posts: 74

Rep: Reputation: 15
To delete files older than 7 days:

# find /dir_name -type f -mtime +7 -exec rm -f {} \;
 
Old 11-30-2006, 05:15 AM   #8
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 63
Passing the list to xargs and using rm that way is much faster for large file lists. Chrism01's solution spawns one rm process per file. Consider a case where 1000 files need deleting, or 100000... that's a lot of processes.
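Worth noting: recent GNU findutils (4.2.12 and later) can batch arguments itself with `-exec ... {} +`, which behaves much like piping to xargs. A throwaway demonstration, with made-up file names, in a scratch directory:

```shell
#!/bin/bash
# Demo in a scratch directory; the file names are invented for the demo.
dir=$(mktemp -d)
touch -d '10 days ago' "$dir/old.log"   # GNU touch: backdate the mtime
touch "$dir/new.log"

# Batched form: one rm per batch of names, instead of one rm process
# per file as with '-exec rm -f {} \;'
find "$dir" -type f -mtime +7 -exec rm -f {} +

ls "$dir"          # only new.log remains
rm -rf "$dir"
```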
 
Old 11-30-2006, 05:24 AM   #9
jschiwal
Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 654
chrism01's post only dealt with adding an option to the find command, so you can still use -print0 with find and -0 with xargs.

I have a one-line script that extracts the file information from a .k3b file and then removes the files that I had backed up. The first part uses sed to filter out the XML tags and convert entities such as "&amp;" -> "&" to handle reserved XML characters. I then pipe this through "tr '\n' '\000'", which does the same thing as find's -print0 option. Then this is piped to the same xargs rm command as I would have used with find -print0.
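A rough sketch of that pipeline. This is heavily hypothetical: the `<file>` tag, the entity list, and the file name backup.k3b are all invented for illustration, and a real .k3b project file would need its own sed filters.

```shell
#!/bin/sh
# Hypothetical sketch: assumes one <file>...</file> entry per line,
# which a real .k3b file may not guarantee.
sed -n 's|.*<file>\(.*\)</file>.*|\1|p' backup.k3b |
    sed 's/&amp;/\&/g; s/&lt;/</g; s/&gt;/>/g' |  # undo XML entity escaping
    tr '\n' '\000' |                              # newline list -> NUL list
    xargs -0 -r rm -f                             # same consumer as find -print0
```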
 
Old 12-01-2006, 07:19 AM   #10
nocnoc
LQ Newbie
 
Registered: Nov 2006
Posts: 8

Original Poster
Rep: Reputation: 0
Thanks guys, I will try these. We have an FTP site that gets hammered hard and our users NEVER clean it out. Even though we stress that the FTP site is strictly for file transfer and not storage, they still try to use it as such. So I really want to automate the process of cleaning it out, and let my userbase know about it.

Thanks
Noc
 
Old 12-01-2006, 07:29 AM   #11
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 63
nocnoc: The first time you do it, maybe it's best to back up their files instead of nuking them. There's no wrath like that of the user whose files are nuked because they didn't do what they were told. If they /really/ need them back, don't give them what they want too quickly or easily... let 'em sweat first
 
Old 12-03-2006, 05:03 PM   #12
chrism01
Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.5, Centos 5.10
Posts: 16,269

Rep: Reputation: 2028
Concur with backup & let 'em sweat.
Also, get a manager to endorse an email notice that nuking will occur... you're gonna need it
 
Old 12-04-2006, 02:08 PM   #13
nocnoc
LQ Newbie
 
Registered: Nov 2006
Posts: 8

Original Poster
Rep: Reputation: 0
We definitely back up the files first, but we don't let them know that. We also keep the previous week's data. We have sent out emails in the past about our FTP site being for transfers only: it isn't backed up and there is no redundancy on any of the hardware. Like all IT-related emails, though, they get ignored until something happens and you ask "did you read the email that was sent out?" The answer is usually no.
 
Old 12-05-2006, 08:11 AM   #14
bigearsbilly
Senior Member
 
Registered: Mar 2004
Location: england
Distribution: FreeBSD, Debian, Mint, Puppy
Posts: 3,287

Rep: Reputation: 173
Quote:
Passing the list to xargs and using rm that way is much faster for large file lists. Chrism01's solution spawns one rm process per file. Consider a case where 1000 files need deleting, or 100000... that's a lot of processes.
FYI, I did a little test with 4000+ files (I have work but can't do it):

data:
find PRT-reports -type f | wc -l
4093

Code:
time find PRT-reports -type f -exec /usr/bin/rm {} \;

real    1m32.19s
user    0m13.76s
sys     0m28.04s

time find XARGS  -type f | xargs /usr/bin/rm         

real    1m35.41s
user    0m0.46s
sys     0m1.95s


time find xargs_small  -type f | xargs -n29  /usr/bin/rm

real    1m14.10s
user    0m0.78s
sys     0m2.36s
Conclusion: it doesn't matter that much.
 
Old 12-05-2006, 04:44 PM   #15
chrism01
Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.5, Centos 5.10
Posts: 16,269

Rep: Reputation: 2028Reputation: 2028Reputation: 2028Reputation: 2028Reputation: 2028Reputation: 2028Reputation: 2028Reputation: 2028Reputation: 2028Reputation: 2028Reputation: 2028
Exactly. All -type f does is tell find (which runs first) to act only on regular files, i.e. ignore directories, which is what the OP asked for.
 
  

