Old 02-24-2003, 11:13 AM   #1
Jello
LQ Newbie
 
rm command is choking on large amounts of data?


Has anyone ever had a problem with the rm command? Does it have some sort of limitation on how much it can delete at one time? We use it in a script to delete old mail that has been archived, but since the amount of mail is so large, rm returns an error: basically, it chokes in the middle of the operation and tells us it couldn't delete that amount of data (not due to any write protection, if that's what you're thinking).

If anyone has any suggestions on how to fix this, or knows an alternative to rm, please let me know.


Thanks,


Mark
 
Old 02-24-2003, 11:56 AM   #2
fsbooks
Member
 
Is the problem the number of files? What is the *exact* error message? If I have too many files in a directory for rm to handle, something like this works (quote the pattern so the shell doesn't expand it before find sees it)...

Code:
find . -name '<filespec>' -print | xargs rm -f

Read the man page for rm.
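One caveat with that pipeline: xargs splits its input on whitespace, so file names containing spaces can get mangled. Assuming GNU find and xargs, the NUL-delimited variants avoid that:

Code:
# -print0 and -0 use NUL as the separator, so odd file
# names (spaces, even newlines) survive the trip to rm
find . -name '<filespec>' -print0 | xargs -0 rm -f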
 
Old 02-24-2003, 11:58 AM   #3
nxny
Member
 
Welcome to LQ, Jello!

What volume of data are we talking about? And what was the error that rm gave you?
 
Old 02-24-2003, 12:02 PM   #4
Jello
LQ Newbie
 
Original Poster
I'll have to let you know once the other admin gets back from lunch. I don't have root access to that server =(

But from what I recall, it has to do with the number of files, not their size.

*update*

Ok, here's the error

List of arguments too long.


Mean anything to ya?

Last edited by Jello; 02-24-2003 at 12:18 PM.
 
Old 02-24-2003, 05:42 PM   #5
nxny
Member
 
Ohh. I see.

Yes, there is a limit to the number of arguments you can pass to the rm command at one time. On my system the limit came to 12806 files when I tried it this morning. I don't know yet what imposes this limit, but I have a feeling I can find out.

The way I would circumvent this limitation is by using
Code:
find . -name 'file*' | xargs rm
where I would normally use
Code:
rm file*
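One clue: the limit seems tied to the total bytes of the expanded argument list rather than to a fixed number of files, so the 12806 figure will vary with name length. Assuming getconf is available on your system, you can query the ceiling directly:

Code:
# ARG_MAX is the kernel's byte budget for argv + environment
getconf ARG_MAX
# rough size of what "rm file*" would have to pass (echo is
# a shell builtin, so it doesn't hit the exec limit itself)
echo file* | wc -c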
 
Old 02-24-2003, 06:23 PM   #6
Crashed_Again
Senior Member
 
What about the most dangerous Linux command of all:

rm -rf /bye/bye/data

Have you tried this?
 
Old 02-24-2003, 07:15 PM   #7
nxny
Member
 
Quote:
Originally posted by Crashed_Again
What about the most dangerous Linux command of all:

rm -rf /bye/bye/data

Have you tried this?
I have. If you have more arguments than rm will take, 'being forcefully recursive' wouldn't help.
 
Old 02-25-2003, 08:12 AM   #8
Jello
LQ Newbie
 
Original Poster
Well, here's the script that contains the commands. One thing we were thinking of doing is using rm to remove the directory itself instead of all the files in it. Not sure that will work, but it's worth a shot. Then we'd add some lines to the script to recreate the directory and give ownership back to each user.




#!/usr/bin/perl
#doc
#----------------------------------------------------
# Script for creating zipped tarballs of the "archive"
# mail directories in /home/e-smith/files/users/.
# These directories tend to get pretty filled up, so
# it's worth setting this as a cron job.
#
# What this does:
# 1. Tar/zips all files in the user's Maildir/new/
# directory for the archive [directories are usually
# "/home/mail/files/users/username.arc" although
# if 'username' is very long, it will be shortened
# but still similar]. The outputted .tgz will be
# placed in the users directory with the date appended.
#
# 2. Clears out the entirety of the
# "users/username.arc/Maildir/new" directory.
#
# 3. Repeats the process for all user directories
#
#----------------------------------------------------
# Revisions
# 0.0.1 - 2001/12/20 -- mf. - started script. 2002/04/09 -- dc. - added users
#
#----------------------------------------------------
#

# variables

# user directories located in /home/mail/files/users/

@users = ("jdoe");

# path to users directories
$mailpath = "/home/mail/files/users/";

#saved for testing -- $mailpath = "/root/test/";
$arc_dir = "/Maildir/new/*";

# repeat for each user -- 1st tarzip, then clear directory.
foreach $userdir (@users) {

    $userdir_arc = $userdir.".arc";
    print $userdir_arc;
    print "\n";

    # get the date to append to tarzip
    $date = `date +%y%m%d`;

    # pull out the \n
    chomp($date);

    # tarzip the contents of each userdir
    system "tar -czf ".$mailpath.$userdir.$date.".tgz ".$mailpath.$userdir_arc;

    # clear the contents of the mail directory
    system "rm -f ".$mailpath.$userdir_arc.$arc_dir;
    print "\n";
}


print "Done\n";
 
Old 02-25-2003, 01:28 PM   #9
nxny
Member
 
The script looks good. I don't see why it wouldn't work.

Regarding the original issue, it sounds like xargs is the way to go. I didn't think about it much before I suggested it, but in case you're interested, you can look at the thread I posted on the ext3-users list and the discussion it generated. Comments from SCT (one of the ext3 core developers) are especially worth noting.

http://www.redhat.com/mailing-lists/...ers/index.html
 
Old 02-25-2003, 02:22 PM   #10
georgew
LQ Newbie
 
Guess this is a question as much as a reply, but at work we never rm anything; we just mv to /dev/null. Are there any limits to mv?
 
Old 02-25-2003, 04:13 PM   #11
nxny
Member
 
My conclusion is that this is a shell limitation. All we need to do is let something other than the shell expand the wildcard and pass the results to rm.

BTW, is there a specific reason why you would move something to /dev/null instead of using rm?

Last edited by nxny; 02-25-2003 at 05:32 PM.
 
Old 02-25-2003, 06:01 PM   #12
sn9ke_eyes
Member
 
Whenever rm isn't big enough to do the job, find usually is.

find . -name filename -exec rm -f {} \; -print

Do a man on find if you need more options.
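A side note on that form: -exec ... \; forks one rm per file, which gets slow with thousands of files. If your find supports the POSIX + terminator (older versions may not), it batches arguments the way xargs does:

Code:
# + gathers many file names per rm invocation instead
# of running rm once for every single file
find . -name filename -exec rm -f {} +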
 
Old 02-25-2003, 07:39 PM   #13
nxny
Member
 
Quote:
Originally posted by sn9ke_eyes
Whenever rm isn't big enough to do the job, find usually is.

find . -name filename -exec rm -f {} \; -print

Do a man on find if you need more options.
No offense, but the find command was already suggested earlier in this thread. I'd suggest reading the entire thread (especially one as small as this) before posting, so your post doesn't come across as redundant.
 
Old 02-26-2003, 08:33 AM   #14
Jello
LQ Newbie
 
Original Poster
There you have it. My email to the fileutils bug-report address got a response.

Other than that, the solution using find works fine. Thanks for your help, everyone.

> Just a quick note that it has been verified here that the rm
> command has a limitation on how many files it can delete in a
> single command. (I took out the rest of the message, as it was useless.)

Thanks for your report. But that is not a limitation of the rm
command but a limitation of your operating system kernel. If your
operating system could handle it, then rm could too.

See the FAQ entry here. Look for "Argument list too long".

http://www.gnu.org/software/fileutils/doc/faq/

In a nutshell, you are exceeding the space needed to stage those files so that the command can see them in the arg list. Use find and xargs as shown in the referenced FAQ to avoid the limit.
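If anyone wants to see the failure for themselves without touching real mail, here's a throwaway sketch for a scratch directory (assuming GNU seq; the file count needed depends on your kernel's ARG_MAX):

Code:
mkdir /tmp/argtest && cd /tmp/argtest
# create enough files that the shell's expansion of "file*"
# overflows the kernel's argument-list limit
for i in $(seq 1 100000); do touch "file$i"; done
rm file*                              # fails: Argument list too long
find . -name 'file*' | xargs rm -f    # works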
 
Old 02-26-2003, 12:50 PM   #15
nxny
Member
 
Kicks butt!! That answers all the questions. And here I was blaming bash...

Thanks for the update, Jello. Learned something valuable.
 
  

