Old 11-16-2008, 06:08 PM   #1
SupermanInNY
Member
 
Registered: Jan 2006
Distribution: CentOS
Posts: 30

Rep: Reputation: 15
need to delete a file if >500MB


Hi All,

I need a short script that will be run from cron.
It needs to check the size of a file. If the file is bigger than 500MB, delete it.


Any pointers?

thanks,

-Sup.
 
Old 11-16-2008, 06:26 PM   #2
Sergei Steshenko
Senior Member
 
Registered: May 2005
Posts: 4,481

Rep: Reputation: 454
Quote:
Originally Posted by SupermanInNY
Hi All,

I need a short script that will be run from cron.
It needs to check the size of a file. If the file is bigger than 500MB, delete it.


Any pointers?

thanks,

-Sup.
'man find' ?
 
Old 11-16-2008, 06:38 PM   #3
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1983
Quote:
Originally Posted by Sergei Steshenko
'man find' ?
I think the problem here is not to find files bigger than a certain size, but to see whether a particular file has grown to that size, right? The find command would be quite inefficient if it has to walk a whole directory tree just to check one known file. So, why not simply something like this?
Code:
test $(stat -c \%s /path/to/file) -gt 500000000 && rm /path/to/file
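Since it has to run from cron anyway, the same test could be wrapped in a tiny script, something along these lines (just a sketch; the path, threshold and variable names are placeholders):
Code:
#!/bin/bash
# Delete the file once it grows past the limit.
# FILE and LIMIT are placeholders - point them at the real log and threshold.
FILE=/path/to/file
LIMIT=500000000    # bytes, roughly 500MB

# stat -c %s prints the size in bytes; exit quietly if the file is missing
size=$(stat -c %s "$FILE" 2>/dev/null) || exit 0
[ "$size" -gt "$LIMIT" ] && rm -f "$FILE"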
 
Old 11-16-2008, 06:38 PM   #4
SupermanInNY
Member
 
Registered: Jan 2006
Distribution: CentOS
Posts: 30

Original Poster
Rep: Reputation: 15
I have the file name. It is a log file that every other week grows out of control because of a runaway process.

I need something like:

PHP Code:
rm -f error_log if ((ll -l error_log |grep <size>) > 5120000000
This script will then be run via cron every 10 minutes.

I just need the right syntax for the code.

thanks,

-Sup.
 
Old 11-16-2008, 06:43 PM   #5
billymayday
LQ Guru
 
Registered: Mar 2006
Location: Sydney, Australia
Distribution: Fedora, CentOS, OpenSuse, Slack, Gentoo, Debian, Arch, PCBSD
Posts: 6,678

Rep: Reputation: 122
Quote:
Originally Posted by Sergei Steshenko
'man find' ?
Except that man find is really hard to work through.

Try something like

find /path/to/start/looking/from -size +500M -exec rm {} \;

Try it without the -exec part first, just to make sure it shows the files you actually want to delete (i.e. "find /path/to/start/looking/from -size +500M").

Then read "man find".

Edit - you may want to change that to "... rm -f {} ..." depending on the user.

Last edited by billymayday; 11-16-2008 at 06:45 PM.
 
Old 11-16-2008, 06:44 PM   #6
SupermanInNY
Member
 
Registered: Jan 2006
Distribution: CentOS
Posts: 30

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by colucix
I think the problem here is not to find files bigger than a certain size, but to see whether a particular file has grown to that size, right? The find command would be quite inefficient if it has to walk a whole directory tree just to check one known file. So, why not simply something like this?
Code:
test $(stat -c \%s /path/to/file) -gt 500000000 && rm /path/to/file
I wrote my bit of code at the same time you wrote yours :) so I only saw your post after I replied.

Yes, I think this will work out great!

test $(stat -c \%s /var/log/httpd/error_log) -gt 500000000 && rm -Rf /var/log/httpd/error_log


The only addition I made was the -Rf, so it doesn't ask me questions when it runs.

I now just need to set it up in cron and restart Apache.

Thanks much!

-Sup.
 
Old 11-16-2008, 06:48 PM   #7
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1983
Well, if it is a log file you have to stop the process that has it open before removing it. Otherwise the file is not actually removed: the inode is kept alive (because the file is still in use) and the disk space is not freed until the process closes it.
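One quick way to see this in action (just an illustration; it assumes lsof is installed) is to list files that have been removed but are still held open:
Code:
# Open files whose link count is 0, i.e. unlinked but still held open by a
# process - their blocks stay allocated until that process closes them.
lsof +L1 | grep /var/log/httpd

# df will only report the space as free after httpd has been stopped or restarted.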
 
Old 11-16-2008, 06:49 PM   #8
billymayday
LQ Guru
 
Registered: Mar 2006
Location: Sydney, Australia
Distribution: Fedora, CentOS, OpenSuse, Slack, Gentoo, Debian, Arch, PCBSD
Posts: 6,678

Rep: Reputation: 122
Or just

find /var/log/http -maxdepth 1 -name 'error_log' -size +500M -exec rm -f {} \;

Edit - notwithstanding comment above.

Wouldn't logrotate be good for this too (not every 10 minutes perhaps)?
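If the rotation interval were acceptable, a per-file snippet in /etc/logrotate.d might look roughly like this (a sketch only, untested; the file name, path and restart command are assumptions, and the size test is only applied whenever logrotate itself runs):
Code:
# /etc/logrotate.d/httpd-errorlog  (sketch, untested)
/var/log/httpd/error_log {
    size 500M          # only rotate once the file exceeds 500MB
    rotate 0           # keep no old copies, just drop the data
    missingok
    postrotate
        /usr/sbin/apachectl graceful > /dev/null 2>&1 || true
    endscript
}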

Last edited by billymayday; 11-16-2008 at 06:50 PM.
 
Old 11-16-2008, 06:52 PM   #9
SupermanInNY
Member
 
Registered: Jan 2006
Distribution: CentOS
Posts: 30

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by colucix
Well, if it is a log file you have to stop the process that has it open before removing it. Otherwise the file is not actually removed: the inode is kept alive (because the file is still in use) and the disk space is not freed until the process closes it.
PHP Code:
test $(stat -c \%s /var/log/httpd/error_log) -gt 500000000 && service httpd stop && rm -Rf /var/log/httpd/error_log && service httpd restart
crontab -e
*/10 * * * * /etc/cleanApacheErrorLogFile.sh

Does this get me closer to my target?

Thanks,

-Sup.
 
Old 11-16-2008, 06:55 PM   #10
SupermanInNY
Member
 
Registered: Jan 2006
Distribution: CentOS
Posts: 30

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by billymayday
Or just

find /var/log/http -maxdepth 1 -name 'error_log' -size +500M -exec rm -f {} \;

Edit - notwithstanding comment above.

Wouldn't logrotate be good for this too (not every 10 minutes perhaps)?

Your solution also sounds good.

logrotate is currently configured to run its normal rotation for this log as well as for the other logs in the same path, and I don't think you can make logrotate run for just a single file in that path. Also, logrotate runs from cron.daily, and I need this to run at 10-minute intervals.

Thanks,

-Sup.
 
Old 11-28-2008, 06:45 PM   #11
SupermanInNY
Member
 
Registered: Jan 2006
Distribution: CentOS
Posts: 30

Original Poster
Rep: Reputation: 15
Hi All,

I don't think I'm running this correctly and need your input on it:

I need the job to delete the error_log file once it is > 500MB.
When it does, it needs to stop httpd, delete the file, then restart httpd.

I tried to run this in a cron:


server.myserver.com:/etc # crontab -e

*/10 * * * * /usr/bin/rdate time-a.nist.gov >date -s
0 5 * * 0 /usr/local/sysbk/sysbk -q
*/10 * * * * /etc/cleanApacheErrorLogFile.sh


server.myserver.com:/etc # less cleanApacheErrorLogFile.sh
test $(stat -c \%s /var/log/httpd/error_log) -gt 500000000 && service httpd stop && rm -Rf /var/log/httpd/error_log && service httpd restart


But the script refuses to run from crontab.

If I run ./cleanApacheErrorLogFile.sh by hand it works fine and does what it should, but when crond is supposed to run it, nothing happens.


Any pointers on how to make this work correctly?

Thanks,

-Sup.

Last edited by SupermanInNY; 11-28-2008 at 07:54 PM.
 
Old 11-29-2008, 03:42 AM   #12
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1983
Cron has a very limited environment; for example, the PATH is usually just /bin:/usr/bin. Therefore it is a good rule to use the full path for every command in your script. Most likely the service command, which usually lives in /sbin, is not being found. Also check root's mail: any standard output and standard error not explicitly redirected to a file is mailed to the owner of the crontab (use the mail command to read it).
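For instance, the script could set PATH itself and keep a simple trace of what it did. A rough sketch (the paths and the trace file name are assumptions, adjust to your system):
Code:
#!/bin/bash
# Cron starts scripts with a minimal environment (PATH is often just
# /bin:/usr/bin), so either set PATH here or use absolute paths everywhere.
PATH=/sbin:/usr/sbin:/bin:/usr/bin
export PATH

# Cron mails any output to the crontab owner; keep a local trace instead.
exec >> /var/log/cleanApacheErrorLogFile.out 2>&1

LOG=/var/log/httpd/error_log
size=$(stat -c %s "$LOG" 2>/dev/null) || exit 0

if [ "$size" -gt 500000000 ]; then
    echo "$(date): $LOG is $size bytes, removing and restarting httpd"
    service httpd stop
    rm -f "$LOG"
    service httpd start
fi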
 
Old 11-29-2008, 10:31 AM   #13
SupermanInNY
Member
 
Registered: Jan 2006
Distribution: CentOS
Posts: 30

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by colucix
Cron has a very limited environment; for example, the PATH is usually just /bin:/usr/bin. Therefore it is a good rule to use the full path for every command in your script. Most likely the service command, which usually lives in /sbin, is not being found. Also check root's mail: any standard output and standard error not explicitly redirected to a file is mailed to the owner of the crontab (use the mail command to read it).

Quote:
/usr/bin # vi cleanApacheErrorLogFile.sh

#!/bin/bash
test $(stat -c \%s /var/log/httpd/test.log1.tar) -gt 500000000 && /usr/sbin/apachectl stop && rm -Rf /var/log/httpd/test.log1.tar && wall wow


/usr/bin # crontab -e

*/10 * * * * /usr/bin/rdate time-a.nist.gov >date -s
0 5 * * 0 /usr/local/sysbk/sysbk -q
* * * * * root /usr/bin/cleanApacheErrorLogFile.sh

And so, this is with explicit paths, and still nothing.
I'm confused.
As for tracing the mail:

less /var/log/exim/mainlog

Quote:
2008-11-29 18:19:02 1L6SX8-0006Dl-1W ** root@server.myserver.com: retry timeout exceeded
2008-11-29 18:19:02 1L6SX8-0006Dl-1W root@server.myserver.com: error ignored
2008-11-29 18:19:02 1L6SX8-0006Dl-1W Completed
2008-11-29 18:19:02 1L6SX7-0006Dj-Uj Completed
2008-11-29 18:20:05 1L6SY5-0006GL-2i <= root@server.myserver.com U=root P=local S=652 T="Cron <root@server> root /usr/bin/cleanApacheErrorLogFile.sh" from <root@server.myserver.com> for root
2008-11-29 18:20:05 1L6SY5-0006GL-2i User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:20:05 1L6SY5-0006GL-2i == root@server.myserver.com R=localuser T=local_delivery defer (-29): User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:20:05 1L6SY5-0006GL-2i ** root@server.myserver.com: retry timeout exceeded
2008-11-29 18:20:05 1L6SY9-0006HV-Du <= <> R=1L6SY5-0006GL-2i U=mail P=local S=1473 T="Mail delivery failed: returning message to sender" from <> for root@server.myserver.com
2008-11-29 18:20:05 1L6SY9-0006HV-Du User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:20:05 1L6SY9-0006HV-Du == root@server.myserver.com R=localuser T=local_delivery defer (-29): User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:20:05 1L6SY9-0006HV-Du ** root@server.myserver.com: retry timeout exceeded
2008-11-29 18:20:05 1L6SY9-0006HV-Du root@server.myserver.com: error ignored
2008-11-29 18:20:06 1L6SY9-0006HV-Du Completed
2008-11-29 18:20:06 1L6SY5-0006GL-2i Completed
2008-11-29 18:21:01 1L6SZ3-0006KD-Gp <= root@server.myserver.com U=root P=local S=652 T="Cron <root@server> root /usr/bin/cleanApacheErrorLogFile.sh" from <root@server.myserver.com> for root
2008-11-29 18:21:01 1L6SZ3-0006KD-Gp User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:21:01 1L6SZ3-0006KD-Gp == root@server.myserver.com R=localuser T=local_delivery defer (-29): User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:21:01 1L6SZ3-0006KD-Gp ** root@server.myserver.com: retry timeout exceeded
2008-11-29 18:21:01 1L6SZ3-0006KQ-KB <= <> R=1L6SZ3-0006KD-Gp U=mail P=local S=1473 T="Mail delivery failed: returning message to sender" from <> for root@server.myserver.com
2008-11-29 18:21:01 1L6SZ3-0006KQ-KB User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:21:01 1L6SZ3-0006KQ-KB == root@server.myserver.com R=localuser T=local_delivery defer (-29): User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:21:01 1L6SZ3-0006KQ-KB ** root@server.myserver.com: retry timeout exceeded
2008-11-29 18:21:01 1L6SZ3-0006KQ-KB root@server.myserver.com: error ignored
2008-11-29 18:21:01 1L6SZ3-0006KQ-KB Completed
2008-11-29 18:21:01 1L6SZ3-0006KD-Gp Completed
2008-11-29 18:22:02 1L6Sa1-0006MJ-LX <= root@server.myserver.com U=root P=local S=652 T="Cron <root@server> root /usr/bin/cleanApacheErrorLogFile.sh" from <root@server.myserver.com> for root
2008-11-29 18:22:02 1L6Sa1-0006MJ-LX User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:22:02 1L6Sa1-0006MJ-LX == root@server.myserver.com R=localuser T=local_delivery defer (-29): User 0 set for local_delivery transport is on the never_users list
2008-11-29 18:22:02 1L6Sa1-0006MJ-LX ** root@server.myserver.com: retry timeout exceeded

Any pointers on how to make this more useful?

Thanks,

-Sup.
 
Old 11-29-2008, 07:28 PM   #14
SupermanInNY
Member
 
Registered: Jan 2006
Distribution: CentOS
Posts: 30

Original Poster
Rep: Reputation: 15
Finally, the solution that works, with some logging added.

The solution was provided by tillo in the DirectAdmin support forums:

http://www.directadmin.com/forum/sho...d=1#post144619

###########

Placed in /etc/cron.d/error_log:


MAILTO=email@mydomain.com
* * * * * root find /var/log/httpd/error_log -size +500M -exec rm '{}' \; -exec /usr/sbin/apachectl restart \; -exec sh -c 'echo $(date): $1 erased and httpd restarted |tee -a /var/log/httpd/CleanApacheRestarts.log' {} \; 2>/dev/null


This seems to work fine; it also sends an email for each event with a timestamp and logs it on the server.
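For anyone who prefers a plain script over the long cron.d one-liner, a roughly equivalent standalone version might look like this (a sketch only, reusing the paths and threshold from the thread; naming the file explicitly also sidesteps the fact that with sh -c the {} argument arrives as $0 rather than $1):
Code:
#!/bin/bash
# Standalone equivalent of the cron.d one-liner above (sketch only).
LOG=/var/log/httpd/error_log
TRACE=/var/log/httpd/CleanApacheRestarts.log

# 524288000 bytes = 500M in find's units (500 * 1024 * 1024)
if [ -f "$LOG" ] && [ "$(stat -c %s "$LOG")" -gt 524288000 ]; then
    rm -f "$LOG"
    /usr/sbin/apachectl restart
    echo "$(date): $LOG erased and httpd restarted" | tee -a "$TRACE"
fi
Called from /etc/cron.d with the same schedule and MAILTO, it should behave the same way.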

Thanks for everyone's help.


-Sup.
 
  

