Removing a directory that is too large for find and xargs
I have a server that uses files to handle PHP sessions, with quite a long session lifetime (about a month). I didn't look at it for a couple of months, and then sessions stopped working properly.
The session folder had grown to over 1 GB. I tried a couple of ways of dealing with it:
rm -rf session (from the directory above) - this didn't seem to do anything after sitting there for 10 hours with httpd off. Should I just wait longer?
within the directory:
find . -name * | xargs rm -f
This fails with "Argument list too long" - and as far as I can tell it's the find command line that blows up, not rm. (Presumably because the unquoted * is expanded by the shell into every filename in the directory before find even runs.)
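For what it's worth, quoting the pattern keeps the shell from expanding it, and GNU find's -delete skips the pipe to xargs entirely. A sketch against a throwaway scratch directory (the real session path will obviously differ):

```shell
# Scratch directory standing in for the real session folder.
tmpdir=$(mktemp -d)
touch "$tmpdir/sess_aaa" "$tmpdir/sess_bbb"

# Quoted pattern: find matches the names itself instead of the
# shell expanding '*' into thousands of arguments.
find "$tmpdir" -type f -name 'sess_*' -print0 | xargs -0 rm -f

# Equivalent one-step form with GNU find (no argument list at all):
# find "$tmpdir" -type f -delete
```

-print0/-0 also keeps filenames with odd characters from breaking the pipeline.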
This is a CentOS installation, and running unlink as superuser isn't allowed.
The naming convention for the session files is predictable, so I could write a loop to handle the deletion in smaller batches, but I was wondering whether there is a simpler way to do this.
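Something like the following is what I had in mind - a sketch using a throwaway directory in place of the real session folder, and assuming the default PHP naming of sess_<hex id>:

```shell
# Scratch directory with a few dummy session files.
sessdir=$(mktemp -d)
for f in sess_0a sess_1b sess_ac sess_fd; do touch "$sessdir/$f"; done

# Batch by the first hex character of the session id, so each
# find pass only touches a fraction of the directory at a time.
for prefix in 0 1 2 3 4 5 6 7 8 9 a b c d e f; do
    find "$sessdir" -maxdepth 1 -type f -name "sess_${prefix}*" -delete
done
```

Each pass deletes as it scans, so nothing ever has to build a giant argument list.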