Removing a directory that is too large for find and xargs
I have a server that uses files to handle sessions under PHP, with quite a long session lifetime (about a month). I didn't look at it for a while (a couple of months), and then sessions stopped working properly.
The session folder was over 1 GB. I tried a couple of ways of dealing with it:

rm -rf session (from the directory above) - doesn't seem to do anything after 10 hours sitting there with httpd off. Should I wait longer?

Within the directory: find . -name * | xargs rm -f gives the error "argument list too long" (note: it's find that's failing, not rm).

This is a CentOS installation, and unlink as superuser isn't allowed. The naming convention for the sessions is pretty predictable, so I could write a looping script to handle the deletion in smaller batches, but I was wondering if there is a simpler way to do this.
Your find command isn't correct. The shell expands the unquoted asterisk into every filename in the directory before find even runs, which is why the "argument list too long" error comes from find rather than rm. You need to put double quotes around the asterisk. Since you want to match every file, "find ./ -type f" would be better. You may still want to limit the number of arguments that xargs passes per invocation; use the -n option for that. Also, separate the filenames with nulls - that eliminates the problem of a filename containing evil characters.
find ./ -type f -print0 | xargs -0 -n 1000 rm -f
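Putting that together, here is a minimal, self-contained sketch you can try safely first. The directory path and file count are made up for the demo; substitute your real session directory. On GNU find (which CentOS ships), "-delete" is also available and avoids spawning rm at all.

```shell
#!/bin/sh
# Demo: delete many files in batches without "argument list too long".
# Creates a throwaway directory of dummy session files (names invented
# for the demo - not the real PHP session naming scheme).
mkdir -p /tmp/sess_demo
i=1
while [ "$i" -le 500 ]; do
    : > "/tmp/sess_demo/sess_$i"
    i=$((i + 1))
done

# -print0 / -0 pass filenames null-terminated, so whitespace and odd
# characters in names are safe; -n 1000 caps the arguments per rm call,
# so no single command line can grow too long.
find /tmp/sess_demo -type f -print0 | xargs -0 -n 1000 rm -f

# GNU find alternative, no rm processes spawned:
#   find /tmp/sess_demo -type f -delete

# The directory should now be empty.
find /tmp/sess_demo -type f | wc -l
rmdir /tmp/sess_demo
```

Because find streams filenames one at a time instead of putting them all on a single command line, this never hits the kernel's argument-length limit, no matter how many session files have accumulated.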