LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - General (https://www.linuxquestions.org/questions/linux-general-1/)
-   -   rm command is choking on large amounts of data? (https://www.linuxquestions.org/questions/linux-general-1/rm-command-is-choking-on-large-amounts-of-data-46915/)

Jello 02-24-2003 11:13 AM

rm command is choking on large amounts of data?
 
Has anyone ever had a problem with the rm command? Does it have some sort of limitation on how much it can delete at one time? We use it in a script to delete old mail that has been archived, but since the amount of mail is so large, rm returns an error. Basically it chokes in the middle of the operation and tells us that it couldn't delete that amount of data (not due to any write protection, if that's what you're thinking).

If anyone has any suggestions on how to fix this, or knows of an alternative to rm, please let me know.


Thanks,


Mark

fsbooks 02-24-2003 11:56 AM

Is the problem the number of files? What is the *exact* error message? If I have too many files in a directory for rm to handle, something like this works...

$ find . -name <filespec> -print |xargs rm -f

Read the man page for rm.

nxny 02-24-2003 11:58 AM

Welcome to LQ, Jello!

What volume of data are we talking about? And what was the error that rm gave you?

Jello 02-24-2003 12:02 PM

I'll have to let you know once the other admin gets back from lunch. I don't have root access to that server =(

But from what I recall, it has to do with the number of files, not their size.

*update*

Ok, here's the error

List of arguments too long.


Mean anything to ya?

nxny 02-24-2003 05:42 PM

Ohh. I see.

Yes, there is a limit to the number of arguments that you can pass to the rm command at a time. On my system, this limit was 12806 when I tried it this morning. I don't know what imposes this limit yet, but I have a feeling I may be able to find out.

The way I would circumvent this limitation is by using
Code:

find . -name 'file*' | xargs rm
where I would normally use
Code:

rm file*
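If any of the file names contain spaces or other odd characters, a null-delimited variant should be safer (this assumes GNU find and xargs, which support -print0 and -0):
Code:

find . -name 'file*' -print0 | xargs -0 rm -f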

Crashed_Again 02-24-2003 06:23 PM

What about the most dangerous Linux command of all:

rm -rf /bye/bye/data

???? Have you tried this?

nxny 02-24-2003 07:15 PM

Quote:

Originally posted by Crashed_Again
What about the most dangerous Linux command of all:

rm -rf /bye/bye/data

???? Have you tried this?

I have. If you have more arguments than rm will take, 'being forcefully recursive' wouldn't help.
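
Passing the directory itself is a different story, though. Then rm gets a single argument, so the limit never comes into play; the catch is that the directory has to be recreated and its ownership restored afterwards. A rough sketch with a placeholder path and user name (not taken from anyone's actual setup):
Code:

rm -rf /path/to/Maildir/new
mkdir /path/to/Maildir/new
chown someuser /path/to/Maildir/new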

Jello 02-25-2003 08:12 AM

Well, here's the script that contains the commands. One thing we were thinking of doing is using rm to remove the directory itself instead of all the files in it. Not sure if that's going to work, but it's worth a shot. Then we'd add some lines to the script to recreate the directory and give ownership back to each user.




#!/usr/bin/perl
#doc
#----------------------------------------------------
# Script for creating zipped tarballs of the "archive"
# mail directories in /home/e-smith/files/users/.
# These directories tend to get pretty filled up, so
# it's worth setting this as a cron job.
#
# What this does:
# 1. Tar/zips all files in the user's Maildir/new/
# directory for the archive [directories are usually
# "/home/mail/files/users/username.arc" although
# if 'username' is very long, it will be shortened
# but still similar]. The resulting .tgz will be
# placed in the user's directory with the date appended.
#
# 2. Clears out the entirety of the
# "users/username.arc/Maildir/new" directory.
#
# 3. Repeats the process for all user directories
#
#----------------------------------------------------
# Revisions
# 0.0.1 - 2001/12/20 -- mf. - started script. 2002/04/09 -- dc. - added users
#
#----------------------------------------------------
#

# variables

# user directories located in /home/mail/users/

@users = ("jdoe");

# path to users directories
$mailpath = "/home/mail/files/users/";

#saved for testing -- $mailpath = "/root/test/";
$arc_dir = "/Maildir/new/*";

# repeat for each user -- 1st tarzip, then clear directory.
foreach $userdir (@users) {

    $userdir_arc = $userdir.".arc";
    print $userdir_arc;
    print "\n";

    # get the date to append to tarzip
    $date = `date +%y%m%d`;

    # pull out the \n
    chomp($date);

    # tarzip the contents of each userdir
    system "tar -czf ".$mailpath.$userdir.$date.".tgz ".$mailpath.$userdir_arc;

    # clear the contents of the mail directory
    system "rm -f ".$mailpath.$userdir_arc.$arc_dir;
    print "\n";
}


print "Done\n";

nxny 02-25-2003 01:28 PM

The script looks good. I don't see why it wouldn't work.

Regarding the original issue, it sounds like xargs is the way to go. I didn't think about it much before I suggested it to you, but in case you're interested, you may look at the thread I posted on the Ext3 users list and the thoughts it has generated. Comments from SCT (one of the ext3 core developers) are especially worth noting.

http://www.redhat.com/mailing-lists/...ers/index.html

georgew 02-25-2003 02:22 PM

Guess this is a question as much as a reply, but at the jobs I've worked, we never rm anything, just mv to /dev/null. Are there any limits to mv?

nxny 02-25-2003 04:13 PM

My conclusion is that this is a shell limitation. All we need to do is let something other than the shell handle the wildcard expansion and pass the results to rm.

BTW, is there a specific reason why you would move something to /dev/null instead of using rm?

sn9ke_eyes 02-25-2003 06:01 PM

whenever rm isn't big enough to do the job, find usually is.

find . -name filename -exec rm -f {} \; -print

do a man on find if you need more options.

nxny 02-25-2003 07:39 PM

Quote:

Originally posted by sn9ke_eyes
whenever rm isn't big enough to do the job, find usually is.

find . -name filename -exec rm -f {} \; -print

do a man on find if you need more options.

No offense, but the find command has already been suggested in a previous post in this thread. I'd suggest reading the entire thread (especially when it comes to threads as small as this one) before posting, just so your post doesn't appear redundant.

Jello 02-26-2003 08:33 AM

There you have it. My email to the fileutils bug-reporting address got a response.

Other than that, the solution using find works fine. Thanks for your help, everyone.


> Just a quick note that it has been verified here that the rm
> command has a limitation of how many files it can delete in a
> single command.

(I took out the rest of the message, as it was useless.)

Thanks for your report. But that is not a limitation of the rm
command but a limitation of your operating system kernel. If your
operating system could handle it then rm could too.


See the FAQ entry here. Look for "Argument list too long".

http://www.gnu.org/software/fileutils/doc/faq/

In a nutshell, you are exceeding the space needed to stage those file names so that the command can see them in the argument list. Use find and xargs as shown in the referenced FAQ to avoid the limit.
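
For anyone curious, the limit itself can be checked from the shell. ARG_MAX is the number of bytes the kernel allows for the argument list plus environment of a new process; on a typical 2.4 kernel it comes out to 131072 (128 KB):
Code:

getconf ARG_MAX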

nxny 02-26-2003 12:50 PM

Kicks butt!! Answers all questions. And I blamed bash :(

Thanks for the update, Jello. Learned something valuable.

