LinuxQuestions.org > Forums > Enterprise Linux Forums > Linux - Enterprise
Old 07-17-2015, 02:18 AM   #1
RHCE_ran
Member
 
Registered: Oct 2013
Posts: 90

Rep: Reputation: Disabled
Files older than 1 day not deleted with -mtime +1


I have set a cron job which is something like:

find <path> -iname '*.dmp' -mtime +1 -exec rm -vf {} \;

This job ran at 10 AM today (17th July) but did not delete files created around 2 AM on 16th July, even though those files are more than 24 hours old. In short, my query is: why are files older than 1 day not being picked up?

Requesting a reply to my query.

Regards
 
Old 07-17-2015, 05:02 AM   #2
cliffordw
Member
 
Registered: Jan 2012
Location: South Africa
Posts: 506

Rep: Reputation: 194
Hi,

Are you sure the job ran? If so, do you have a log of its output? It might be in your email on that machine if you didn't explicitly redirect it somewhere.

Can you confirm that the "find" is finding all the files, by running:

Code:
find <path> -iname '*.dmp' -mtime +1 -ls
Which user is running this command - root or a regular user? Could there be a problem with file permissions?

Regards,

Clifford
 
1 member found this post helpful.
Old 07-17-2015, 07:34 AM   #3
RHCE_ran
Member
 
Registered: Oct 2013
Posts: 90

Original Poster
Rep: Reputation: Disabled
Thanks for your answer. From my testing,

find <path> -iname '*.dmp' -mtime +0 -ls

finds the required files accessed more than 24 hours ago, but +1 does not.

A non-root user is running the command; there is no issue with file permissions.

Should it be

find <path> -iname '*.dmp' -mtime +0 -ls then?

Regards
 
Old 07-17-2015, 11:07 AM   #4
rknichols
Senior Member
 
Registered: Aug 2009
Distribution: CentOS
Posts: 4,535

Rep: Reputation: 2077
I guess you can be forgiven for not digging through the rather lengthy find manpage, but the answer is there.
-atime n
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
-mtime n
File’s data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
So yes, you would need to use "+0".
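To see that rounding concretely, here is a quick sketch (assuming GNU find and touch, using a throwaway directory): a file modified 30 hours ago has had one full 24-hour period elapse, so it matches "-mtime +0" (more than zero periods) but not "-mtime +1" (more than one period, i.e. 48+ hours).

```shell
# Demonstrate find's -mtime truncation: 30 hours -> 1 full 24-hour period.
dir=$(mktemp -d)
touch -d '30 hours ago' "$dir/old.dmp"

echo "matched by -mtime +0: $(find "$dir" -iname '*.dmp' -mtime +0 | wc -l)"  # 1
echo "matched by -mtime +1: $(find "$dir" -iname '*.dmp' -mtime +1 | wc -l)"  # 0

rm -rf "$dir"
```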
 
1 member found this post helpful.
Old 07-17-2015, 11:08 AM   #5
cliffordw
Member
 
Registered: Jan 2012
Location: South Africa
Posts: 506

Rep: Reputation: 194
Hi again,

You mention files accessed more than 24 hours ago. Are you interested in access time, or modification time ("mtime")?

The "+1" is the correct usage, and should work (assuming mtime is what you want to check).

Could you possibly run the "stat" command for one of these files, and post the output?
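For reference, a minimal sketch of what that would look like (assuming GNU stat on a sample file): the "Modify" line is the timestamp "-mtime" checks, while "Access" is what "-atime" checks.

```shell
# Show the three timestamps stat reports; Modify is what find -mtime tests.
dir=$(mktemp -d)
touch "$dir/sample.dmp"
stat "$dir/sample.dmp" | grep -E '^(Access|Modify|Change)'
rm -rf "$dir"
```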
 
Old 07-17-2015, 11:12 AM   #6
cliffordw
Member
 
Registered: Jan 2012
Location: South Africa
Posts: 506

Rep: Reputation: 194
Quote:
Originally Posted by rknichols View Post
I guess you can be forgiven for not digging through the rather lengthy find manpage, but the answer is there.
-atime n
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
-mtime n
File’s data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
So yes, you would need to use "+0".
Ah, thanks for that! I misunderstood then, and learnt something new today :-)
 
Old 07-17-2015, 11:15 AM   #7
cliffordw
Member
 
Registered: Jan 2012
Location: South Africa
Posts: 506

Rep: Reputation: 194
RHCE_ran, if you're looking for files older than exactly 24 hours (which is what I thought "-mtime +1" did), as opposed to files modified before yesterday, try:
Code:
find <path> -iname '*.dmp' -mmin +1440 -ls
This finds files older than 1440 minutes, which is 24 hours, and doesn't do the rounding that -mtime does.
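A quick sketch of the difference (assuming GNU find and touch, on a throwaway directory): a file modified 25 hours ago slips past "-mtime +1" because 25 hours truncates to one full day, but "-mmin +1440" catches it since 1500 minutes is more than 1440.

```shell
# Compare -mtime +1 (day-granularity, truncating) with -mmin +1440 (minute-granularity).
dir=$(mktemp -d)
touch -d '25 hours ago' "$dir/export.dmp"

find "$dir" -iname '*.dmp' -mtime +1    # prints nothing: 25h truncates to 1 day
find "$dir" -iname '*.dmp' -mmin +1440  # prints the file: 1500 min > 1440 min

rm -rf "$dir"
```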
 
1 member found this post helpful.
  

