LinuxQuestions.org
Old 10-14-2013, 09:39 AM   #1
SINEKT
LQ Newbie
 
Registered: Oct 2013
Posts: 6

Rep: Reputation: Disabled
Crontab weird behavior on CentOS 5.5


Hello there!

This is my crontab:

Code:
0,30 * * * * /etc/init.d/mc backup
0 4 * * * find /var/spool/clientmqueue -ctime 1 -exec rm -rf {} \;
10 4 * * * find /backup/minecraft.backup -ctime 1 -exec rm -f {} \;
20 4 * * * cd /backup/minecraft.backup/ && ls -lt * | head -1 | awk '{print "cp " $9 " /backup/minecraft.backup/daily/"$9}' | sh
#
Explanations:

Line 1 - backup script for the game server
Line 2 - clears out useless files generated by the OS (sendmail's clientmqueue)
Line 3 - deletes backups older than a day from /backup/minecraft.backup
Line 4 - copies the newest file from /backup/minecraft.backup to another folder, so that folder 1 holds the 30-minute backups from the last day, folder 2 holds one file per day for a week, and so on. That way I can go back to older backups if I need to, without using a lot of space.

The weird thing is this: the last command is supposed to copy the latest file from /backup/minecraft.backup/ to the daily subfolder AND JUST THAT, and that is exactly what it does when I run it manually. When it runs through crontab, however, it also deletes files older than a day from the daily subfolder, which is not what I want; I only want it to copy files there. The funny part is that I tried running all the commands manually, in the same order as in the crontab, and nothing gets deleted, so it only happens when cron runs them.

Does anybody know what's going on here?
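For reference, the "copy the newest file" step from line 4 can be sketched without parsing `ls -l` columns through awk, by using `ls -t` (newest first) instead. This is just a minimal demo with throwaway temp directories standing in for the real backup paths:

```shell
# Sketch of the copy-newest-file step; $src/$dst stand in for the real paths.
set -eu
src=$(mktemp -d)   # stand-in for /backup/minecraft.backup
dst=$(mktemp -d)   # stand-in for /backup/minecraft.backup/daily
touch "$src/old.zip"
sleep 1
touch "$src/new.zip"
# ls -t sorts by mtime, newest first; copy only the first entry.
newest=$(ls -t "$src" | head -n 1)
cp "$src/$newest" "$dst/"
ls "$dst"    # prints: new.zip
```

Note this still breaks on filenames with newlines, but it is less fragile than picking column $9 out of `ls -l`.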
 
Old 10-14-2013, 07:48 PM   #2
kbp
Senior Member
 
Registered: Aug 2009
Posts: 3,790

Rep: Reputation: 653
I'm guessing that:
Code:
10 4 * * * find /backup/minecraft.backup -ctime 1 -exec rm -f {} \;
... is doing it.

If you don't want it to descend into subdirectories then try adding 'maxdepth':
Code:
10 4 * * * find /backup/minecraft.backup -ctime 1 -maxdepth 0 -exec rm -f {} \;
 
Old 10-15-2013, 02:19 AM   #3
SINEKT
LQ Newbie
 
Registered: Oct 2013
Posts: 6

Original Poster
Rep: Reputation: Disabled
Aha, well I changed it and now I'll wait another day to see the result.
 
Old 10-16-2013, 05:52 AM   #4
SINEKT
LQ Newbie
 
Registered: Oct 2013
Posts: 6

Original Poster
Rep: Reputation: Disabled
-maxdepth 0 doesn't work: it doesn't delete anything. I tried a find with -maxdepth 0 and it only returns the directory itself, not the files in it, so I think it should be -maxdepth 1. I modified the command and I'm waiting to see what happens.
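That matches how the GNU find man page describes -maxdepth: 0 restricts find to the starting point itself, while 1 covers the starting point plus its direct children. A quick throwaway demo (temp directory, made-up file names):

```shell
# Demo of -maxdepth semantics on a scratch directory.
set -eu
d=$(mktemp -d)
touch "$d/a" "$d/b"
find "$d" -maxdepth 0            # prints only the directory itself
find "$d" -maxdepth 1 -type f    # prints the two files inside it
```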
 
Old 10-16-2013, 07:46 AM   #5
kbp
Senior Member
 
Registered: Aug 2009
Posts: 3,790

Rep: Reputation: 653
Sorry, I thought that would work for directories... don't forget you can use the 'touch' command to create a test file with any timestamp you want, which would avoid having to wait 24 hours.
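For example, something like this (scratch directory, hypothetical file names). One caveat: touch -d can back-date mtime and atime, but not ctime (the kernel always sets ctime to "now"), so for a quick test it's easiest to match on -mtime:

```shell
# Back-date a test file so find matches it without waiting a day.
# touch -d cannot set ctime, so the test uses -mtime instead of -ctime.
set -eu
d=$(mktemp -d)
touch -d "2 days ago" "$d/stale.zip"
touch "$d/fresh.zip"
find "$d" -maxdepth 1 -mtime +1   # lists only stale.zip
```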
 
Old 10-17-2013, 08:08 AM   #6
SINEKT
LQ Newbie
 
Registered: Oct 2013
Posts: 6

Original Poster
Rep: Reputation: Disabled
Yeah, I know; that was just for final testing, once it's in its final form.

I don't know why though, I still can't make it work.

I'll just tell you what I want to do, and maybe you can come up with something that works. I won't have much time these days, I'll be out of town for the weekend, and I have no idea if I'll have internet access.

1. A script makes zip backups every half an hour in the folder /backup; that part is in place and works.
2. Every night at 4 AM delete files older than 1 day from /backup
3. Every night at 4:10 AM copy newest file from /backup to /daily

I'm testing something now, maybe I'll do it in the end...
 
Old 10-17-2013, 08:21 AM   #7
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
Code:
find /backup/minecraft.backup -ctime +1 -delete
?
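A minimal sketch of how the -delete form behaves (GNU find; all names here are made up for the test). -delete removes the matches itself, so no rm or -exec is needed; the demo matches on -mtime because touch -d can back-date mtime but not ctime:

```shell
# Demo of find's -delete action on a scratch directory.
set -eu
d=$(mktemp -d)
touch -d "2 days ago" "$d/old.zip"
touch "$d/new.zip"
find "$d" -maxdepth 1 -type f -mtime +1 -delete
ls "$d"    # only new.zip remains
```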
 
Old 10-17-2013, 08:50 AM   #8
SINEKT
LQ Newbie
 
Registered: Oct 2013
Posts: 6

Original Poster
Rep: Reputation: Disabled
I think I found something: I don't think it works without the +. I pasted the text into crontab using Joe, and for some reason it removed the + (probably treating it as a sign on a positive number), and I failed to see that.

Is there really a big difference if I use mtime or ctime?
 
Old 10-17-2013, 09:07 AM   #9
kbp
Senior Member
 
Registered: Aug 2009
Posts: 3,790

Rep: Reputation: 653
No, the files are backups and are never modified after creation.
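The general difference is that mtime tracks content changes while ctime tracks inode/metadata changes, so a metadata-only operation like chmod bumps ctime but leaves mtime alone. A quick throwaway demo:

```shell
# chmod changes metadata only: ctime advances, mtime stays at creation time.
set -eu
f=$(mktemp)
sleep 1
chmod 600 "$f"
stat -c 'mtime=%Y ctime=%Z' "$f"   # ctime is now newer than mtime
```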
 
Old 10-21-2013, 02:40 AM   #10
SINEKT
LQ Newbie
 
Registered: Oct 2013
Posts: 6

Original Poster
Rep: Reputation: Disabled
I just moved daily into a separate folder, so now I have /backup/minecraft.backup and /backup/minecraft.daily, and it works just fine now.
 