
shivaa 11-23-2012 11:04 PM

Script - Remove old log file
I have a script which runs every 10 min. and updates a log file named /home/jack/logsfile.

if [ condition true ]; then
    echo "Working fine." >> /home/jack/logsfile
fi

Since this script runs every 10 minutes and keeps appending its result to /home/jack/logsfile, the log file grows larger day by day, and I have to remove it manually every day. I want to add a condition to my script so that once a day (at the end of the day, say 11:50 PM, or at the start of the day, at midnight) it removes the previous day's logsfile; then, when it runs again, it generates a new logsfile, keeps appending results for the whole day, and so on.

I have no clue how to do this. If I add a

rm /home/jack/logsfile
at the beginning of the script, it will remove logsfile every time the script runs. But I want to keep the logs for at least a day.
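One common pattern (a sketch only; the path is from the question, but the date-suffix naming is an assumption) is to write to a per-day file and delete older ones at the top of the script:

```shell
#!/bin/sh
# Append today's entry to a date-stamped log, e.g. /home/jack/logsfile.20121123
LOG=/home/jack/logsfile.$(date +%Y%m%d)
echo "Working fine." >> "$LOG"

# Delete date-stamped logs last modified more than a day ago
find /home/jack -name 'logsfile.*' -mtime +1 -exec rm '{}' \;
```

Because each day gets its own file, the delete step never touches the file currently being written.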

jsaravana87 11-23-2012 11:27 PM

This is the script I'm running once every day to back up my logs:


# Script - Purpose: Backup script
# Exits with zero if no error.

# Step 1: Timestamp used to name the archive directory
date=`/bin/date "+%Y.%m.%d.%H.%M.%S"`

# Step 2: Create a folder under /root/logs with the timestamp
mkdir -p /root/logs/$date

# Step 3: Find logs older than 30 days under /data and move them to /root/logs
find /data/ -type f -iname '*.logs' -mtime +30 -print | xargs -I {} mv {} /root/logs/$date

# Step 4: Compress the log backup content
mkdir -p /root/logarchive
tar -czvf /root/logarchive/$date.tar.gz /root/logs/$date

# Step 5: Remove the uncompressed copies
rm -rf /root/logs/$date

# Step 6: Record the log backup status
echo "$(date) log backed up successfully" >> /root/logs/logbackup-status-$date.log

# Step 7: Trigger mail to the user with the backup status
# ($mailid must be set earlier in the script)
mail -s "$(date) Logs Moved Successfully on $(hostname) - Successful" $mailid < /dev/null

HaydeezPluto 11-24-2012 10:37 AM

You can just redirect /dev/null (the "black hole") into the log file. It won't delete the file, but it will clear its contents.

cat /dev/null > filename.log
Add a condition to run the above command when the file size grows beyond some limit, when the number of lines goes above x, or every day at midnight.
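A size-based version of that condition might look like this (the file name and the 1 MB limit are assumptions, not from the thread):

```shell
#!/bin/sh
LOG=filename.log
MAX=1048576   # truncate once the log exceeds ~1 MB

# wc -c prints the file size in bytes
size=$(wc -c < "$LOG")
if [ "$size" -gt "$MAX" ]; then
    cat /dev/null > "$LOG"   # empty the file without deleting it
fi
```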

linosaurusroot 11-24-2012 01:16 PM


#!/usr/bin/perl -w
use strict;

# Rotate a log: keep file, file.old, file.old1, file.old2
my $f = $ARGV[0];
die("Usage: $0 filename-in-cwd") unless (defined($f));
die("filename contains improper chars") unless ($f =~ /^\w[\w\.-]+$/);

# Optional second argument overrides the default size threshold (bytes)
my $max_size = 5000;
if ( (defined($ARGV[1])) && ($ARGV[1] =~ /^\d+$/) && ($ARGV[1] > 5000) ) {
    $max_size = $ARGV[1];
}

my $size = -s $f;
if ($size > $max_size) {
    rename("$f.old1", "$f.old2");
    rename("$f.old", "$f.old1");
    rename($f, "$f.old") or die("could not rename file $f to $f.old");
    open(F, ">$f") or die("could not create new $f");  # start a fresh, empty log
    close(F);
}

chrism01 11-26-2012 02:15 AM

Why not use the built-in tool 'logrotate' ?
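For the original /home/jack/logsfile, a minimal logrotate snippet dropped into /etc/logrotate.d/ might look like this (the path is from the question; the daily schedule and rotation count are assumptions):

```
/home/jack/logsfile {
    daily
    rotate 7
    missingok
    notifempty
    compress
}
```

logrotate is normally already run once a day from cron, so no extra scheduling is needed.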

shivaa 12-07-2012 01:03 PM

I used a simple workaround: created one more script alongside and added it to cron, which moves such log files (although I have more scripts like this) once a day at a fixed time.

mv /home/jack/archive_logs/logfile /home/jack/archived_logs/logfile.$(date +%Y%m%d)
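A matching cron entry for that move could look like the following (the 11:50 PM schedule is an assumption; note that `%` must be escaped as `\%` in a crontab):

```
# crontab -e: run the move at 23:50 every day
50 23 * * * mv /home/jack/archive_logs/logfile /home/jack/archived_logs/logfile.$(date +\%Y\%m\%d)
```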

lleb 12-08-2012 02:22 PM

use the find command paired with -mtime:



find /path/to/logfile.log -mtime +1 -exec rm '{}' \;
Put that as the last line, right before the exit. It will look for any log older than 24 hours and remove it from the system. As mentioned above, setting a date timestamp in the log name is also a great idea. If you only want to keep one day's worth, you might use something like this in the log name:


### Setting up variables (the path is a placeholder).
dow=`date +%A`
LOG=/path/to/logfile.${dow}.log

### find command to clear log files older than one day

find ${LOG} -mtime +1 -exec rm '{}' \;

slowerogue 12-25-2012 10:55 PM

Hi, but this is an sh file.
Does that mean that once I run it, it won't quit even after it has done the job?
Or should I put it in cron?

descendant_command 12-26-2012 12:00 AM

At top of script you could:

tail -n 23 /home/jack/logsfile > tmpfile && mv tmpfile /home/jack/logsfile
Then you always have the most recent portion of the log.
