Old 12-12-2016, 02:09 AM   #1
Ujio
LQ Newbie
 
Registered: Sep 2011
Posts: 17

Rep: Reputation: Disabled
NFS problem on multiple file delete operation


Hello

I have a NetApp NFS drive, and 6 RHEL Linux servers currently access the same share.

My problem is related to multiple-file delete operations on the NFS drive.
I can create files on the NFS mount with specific users; there is no problem with this step.
I have housekeeping scripts, for example for deleting each day's files. The number of daily files is about 30000-40000 (total size about 3-4 TB).

My problem starts here. When I delete these files from a cron script with a single command like "rm -rf *201612080*" or "rm *201612080*", sometimes no files are deleted and sometimes all of them are. When the command fails there is no error; it behaves as if the OS executed the request without any problem. So the deletion needs additional checking day by day, after the cron script runs.
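A rough sketch of the kind of extra check I mean; the path, pattern and log file here are just examples:

Code:
#!/bin/bash
# Example post-cleanup check: count leftover files for the day and log if any remain.
# /nfs/share, the date pattern and the log path are placeholders.
leftover=$(find /nfs/share -maxdepth 1 -type f -name "*201612080*" | wc -l)
if [ "$leftover" -gt 0 ]; then
    echo "$(date): cleanup incomplete, $leftover files still present" >> /var/log/nfs_cleanup.log
fi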

Another strange issue:
Code:
rm <file1>; rm <file2>; rm <file3> ....
works every time, but the following small script only works sometimes:

Code:
for i in *
do
rm $i
done
So why does deleting multiple files sometimes fail?

Thanks
 
Old 12-12-2016, 07:31 AM   #2
michaelk
Moderator
 
Registered: Aug 2002
Posts: 25,698

Rep: Reputation: 5895
With that many files you're most likely getting an "argument list too long" error. The wildcard actually expands, so rm *xxxx* becomes rm file1 file2 file3 ... file40000. A better way to delete that many files is to use find.

Code:
find /dir/to/files -type f -name "*201612080*" -exec rm -f {} \;
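Side note: with tens of thousands of files, the "-exec rm -f {} \;" form forks one rm per file. GNU find (the default on RHEL) can also batch the arguments or unlink the files itself, which is usually much faster:

Code:
# Batch many filenames into each rm invocation (POSIX find):
find /dir/to/files -type f -name "*201612080*" -exec rm -f {} +

# Or let find delete the files directly (GNU find):
find /dir/to/files -type f -name "*201612080*" -delete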
 
Old 12-12-2016, 06:18 PM   #3
MadeInGermany
Senior Member
 
Registered: Dec 2011
Location: Simplicity
Posts: 2,789

Rep: Reputation: 1201
If files are created (or deleted) by other processes during the deletion, it becomes a race condition. If several NFS clients are involved, the NFS caches make the problem worse.
The list generated for the for loop is another buffer.
To mitigate this, buffer less, as with the suggested find.
Make sure your deletion does not run on several NFS clients simultaneously.
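One rough way to enforce that, sketched with placeholder paths: take an mkdir-based lock directory on the shared export before cleaning up (mkdir is atomic, so only one client wins the lock):

Code:
#!/bin/bash
# Hypothetical cron wrapper: only one NFS client at a time runs the cleanup.
# /nfs/share and the date pattern are placeholders for the real export and files.
LOCKDIR=/nfs/share/.cleanup.lock

if mkdir "$LOCKDIR" 2>/dev/null; then
    trap 'rmdir "$LOCKDIR"' EXIT
    find /nfs/share -type f -name "*201612080*" -exec rm -f {} +
else
    echo "cleanup already running on another host, skipping" >&2
fi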
 
  

