find exclude directory
Hi All,
I have a small script that uses the find command to delete certain files within a given path. It all works, but now I need to exclude a directory (or directories) and can't seem to get it to work. Can anyone help? This is the one line that does the delete:
find $path -type f \( -iname \*.csv -o -iname \*.xml -o -iname \*.zip -o -iname \*.gz \) -mtime $olderthan -delete |
I think you want -path with -prune.
From man find: Code:
-path pattern Code:
-path ./some/directory* -prune |
Classic use case for prune - but then you can't use -delete.
Pipe it to xargs for rm. This also allows you to use echo first to test the list you generate - a very prudent step IMHO. |
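A minimal sketch of that echo-first workflow, assuming GNU find and xargs (the directory and file names are placeholders, and the null-separated form is used to stay safe with odd file names):

```shell
#!/bin/sh
# Sketch: build the deletion list with find, inspect it with echo,
# then hand it to rm via xargs. Patterns are placeholders.
tmp=$(mktemp -d)
mkdir -p "$tmp/logs"
touch "$tmp/logs/a.csv" "$tmp/logs/b.xml" "$tmp/logs/keep.txt"
# Dry run: echo the candidates instead of deleting them.
find "$tmp" -type f \( -iname '*.csv' -o -iname '*.xml' \) -print0 |
    xargs -0 echo rm --
# Looks right? Swap "echo rm" for "rm".
find "$tmp" -type f \( -iname '*.csv' -o -iname '*.xml' \) -print0 |
    xargs -0 rm -f --
```

The dry-run line prints the exact rm command it would run, which is the prudent-check step mentioned above.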
Good point, but plain xargs creates a problem (and even a risk) if there are space characters in file names.
Rather than -print0 | xargs -0 rm, I suggest doing everything within find: -exec rm {} + Have the -prune first and continue with -o (otherwise)! |
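A minimal sketch of that single-find approach, with a hypothetical directory name "skipdir" standing in for the directory to exclude:

```shell
#!/bin/sh
# Sketch: exclude one directory with -prune, delete the rest with
# -exec rm {} + in a single find. All names here are placeholders.
tmp=$(mktemp -d)
mkdir -p "$tmp/skipdir" "$tmp/keepme"
touch "$tmp/old.csv" "$tmp/skipdir/old.csv" "$tmp/keepme/old.xml"
# -prune stops descent into skipdir; everything after -o is the
# "otherwise" branch that does the real matching and deleting.
find "$tmp" -name skipdir -prune -o \
    -type f \( -iname '*.csv' -o -iname '*.xml' \) -exec rm -f {} +
```

After this runs, the matching files outside skipdir are gone, while skipdir/old.csv survives untouched.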
Quote:
Code:
find $path -name some_directory -prune -o \
If multiple directories need to be pruned, then: Code:
find $path \( -name dir1 -o -name dir2 -o -name dir3 \) -prune -o \
Or, equivalently: Code:
find $path ! \( \( -name dir1 -o -name dir2 -o -name dir3 \) -prune \) \ |
lol - stoppit ....
|
IMHO a questionable feature
GNU find's rule that "The -delete action automatically turns on -depth" is a bad idea IMHO.
With -depth in effect, -prune is silently ignored, so find can still delete files (and maybe even empty directories) you meant to spare. |
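A small sketch of that gotcha, assuming GNU find (directory names are illustrative only):

```shell
#!/bin/sh
# Sketch: -delete turns on -depth, which defeats -prune, so the
# "skip" directory is not actually spared. Names are placeholders.
tmp=$(mktemp -d)
mkdir -p "$tmp/skip"
touch "$tmp/skip/old.csv" "$tmp/old.csv"
# Intent: spare everything under "skip". Because -delete implies
# -depth, -prune is ineffective and skip/old.csv is deleted too.
find "$tmp" -name skip -prune -o -name '*.csv' -delete 2>/dev/null
```

GNU find even prints a warning about this combination; the same command with -exec rm {} + instead of -delete would honor the prune.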
Thanks all for the help. prune seems to have done the job.
Is there a way to log the files that are about to be deleted, along with their creation date? If I use the -print option, only the processed files are recorded. I could run find with ls before the find-and-delete, something like -exec ls -ltr {} + | awk '{print $6,$7,$8,$9}' > $log, but this doubles the time the script takes to run. -print was perfect as it does it all in one pass, so there is no discrepancy between what was deleted and what was logged. But having the creation date along with the file name would be ideal: the script works out the 'older than' date, so anyone can browse the list to ensure that no files later than that date have been deleted. Any ideas? |
You can have -exec ls ... and -exec rm ... in sequence
Code:
find ... -exec ls -ldUo --file-type --time-style='+%Y-%m-%d %H:%M' {} + -exec rm -f {} + | |
You can use "-printf '%t %p\n'", which also accepts a very complete list of format directives if you want other information or specific formatting for the timestamp.
Note that none of these suggestions is going to give you "creation time". The best you can do is "modification time". Some filesystems do support "creation time", but support for that is lacking in the tools. I should add that the disadvantage of using "-printf ..." is that the output is not sorted. The order will be that in which the files were found, and for many filesystems that is essentially unpredictable. |
Nice! So it boils down to something like
Code:
find $path -path $excludepath -prune -o -type f \( -iname \*.csv -o -iname \*.xml -o -iname \*.zip -o -iname \*.gz \) -mtime $olderthan -printf "%TY-%Tm-%Td %TH:%TM %p\n" -exec rm -f {} + > $log |
Thanks all for taking time out to help a newb. I read about printf in the man page but it didn't make much sense. It all works a treat - Thanks!
|