LinuxQuestions.org


Twelveone 06-04-2008 05:14 AM

logRotate on a directory
 
Hi all,

I am working on an application that generates its own log files. These logs are stored under /var/log/<application_name>, in subdirectories named after the current date. A custom application running under cron compresses all log files more than 24 hours old into a single archive each day and deletes the originals. This leaves /var/log/<application_name> full of compressed files that will continue to grow, and after a while they will fill the /var/log/ partition. So I am looking for a tool that can either monitor the size of a directory and delete the oldest files once the directory passes a specified size, or monitor the directory and delete files older than a specified age (e.g. 7 days). Can anyone help?

Many thanks

unSpawn 06-04-2008 05:49 AM

Quote:

Originally Posted by Twelveone (Post 3174240)
I am looking for a tool that can monitor the size of a directory and delete (..) files older than a specified age (e.g. 7 days).

Size is more of a job for 'logrotate'; for aging, see 'tmpwatch'.
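
For example (the directory name is just a placeholder for your application's log directory), something along the lines of

tmpwatch --mtime 168 /var/log/myapp

should remove files that have not been modified in the last 168 hours (7 days). Check the tmpwatch man page for the exact options your version supports.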

jschiwal 06-04-2008 06:56 AM

You could use the find command to locate older files and then delete them.
find /var/log/backups -daystart -mtime +2 -execdir rm '{}' \; 2>/dev/null

However, instead of what you are doing, you could use logrotate to compress the rotated files and use "rotate <n>" or "maxage <age>" to limit the number of backup logs kept. Check if you have files in /etc/logrotate.d/. There is even an option to run a post rotate script, which could move the rotated file to /var/log/backups/ and delete old log backups.
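
As a rough sketch (the names and paths are only placeholders, not taken from your setup), an entry in /etc/logrotate.d/ could look like:

/var/log/myapp/*.log {
    daily
    rotate 7
    maxage 7
    compress
    missingok
    notifempty
}

That keeps at most 7 rotated copies, removes anything older than 7 days, and compresses the rotations for you.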

Just wondering why you would use zip. bzip2 or gzip seem better choices, and you can use tar with the -z or -j options to produce compressed archives.
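
For example (made-up paths), either of

tar -czf 2008-06-03.tar.gz /var/log/myapp/2008-06-03/
tar -cjf 2008-06-03.tar.bz2 /var/log/myapp/2008-06-03/

gives you a gzip- or bzip2-compressed archive of one day's directory.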

Twelveone 06-04-2008 08:38 AM

Hi,

Thanks for the replies.

unSpawn:
I will need to have a look at tmpwatch and see if it will do what I'm after.

jschiwal:
This is a long story, but I've been brought in to a company to test their system on behalf of a third party (the customer). I will need to ask the developers why they are not using logRotate to achieve what they want: they use it to manage a single system log, but for these data logs they have written their own routine to archive them every day. As for the zip :-) I should probably have said compress or archive; the archive routine does, as you suggest, use tar -z.

Thanks Again

Twelveone 06-04-2008 10:56 AM

Quote:

Originally Posted by jschiwal (Post 3174311)
You could use the find command to locate older files and then delete them.
find /var/log/backups -daystart -mtime +2 -execdir rm '{}' \; 2>/dev/null

Can you explain what the \; does after the rm '{}' and what the 2>/dev/null does?

I think I understand the rest of the find command, but I cannot make it work. I created a number of files, used touch to change the modification date, then ran the find command as above, but it did nothing. Am I doing something wrong?

Thanks

jschiwal 06-04-2008 08:03 PM

The "\;" delineates the end of the -exec argument of the find command. The 2>/dev/null will redirect stderr to /dev/null. You want to do this in a cron job because it doesn't have access to a console.

I haven't really studied logrotate that much, but using a postrotate script might be useful for your situation. If you want to copy the compressed rotated files to a different location (maybe even a remote share), you can do that in such a script and even delete aged files there. If the files are copied to a remote computer, it may be more "proper" to let that remote host take care of deleting the aged logs.
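
Something like this in the logrotate entry would at least handle the ageing part (the backup directory is made up for the example, and the copy to a remote share is left out):

postrotate
    find /var/log/backups -daystart -mtime +7 -type f -execdir rm '{}' \; 2>/dev/null
endscript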

