Linux - Software: This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
I am working on an application that generates its own log files. These logs are stored under /var/log/<application_name>, in unique subdirectories named after the current date. A custom application running under cron zips all log files more than 24 hours old into one file each day and deletes the originals. This leaves /var/log/<application_name> full of compressed files that keep accumulating, and after a while they will fill the /var/log/ partition. I am looking for a tool that can either monitor the size of a directory and delete the oldest files once it passes a specified size, or delete files older than a specified age (e.g. 7 days). Can anyone help?
You could use the find command to locate the older files and then delete them.
find /var/log/backups -daystart -mtime +2 -execdir rm '{}' \; 2>/dev/null
However, instead of what you are doing, you could use logrotate to compress the rotated files, and use "rotate <n>" or "maxage <age>" to limit the number of rotated logs that are kept. Check whether you have files in /etc/logrotate.d/. There is even a postrotate option to run a script after rotation, which could move the rotated file to /var/log/backups/ and delete old log backups.
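A minimal logrotate fragment along those lines might look like this (the application name and paths are placeholders, not your actual setup):

```
# Hypothetical /etc/logrotate.d/myapp
/var/log/myapp/*.log {
    daily
    rotate 7          # keep at most 7 rotated files
    maxage 7          # delete rotated files older than 7 days
    compress
    missingok
    notifempty
}
```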
Just wondering: why would you use zip? bzip2 or gzip seem better choices, and you can use tar with the -z or -j options to produce compressed archives.
unSpawn:
I will need to have a look at tmpwatch and see if it will do what I'm after.
jschiwal:
This is a long story, but I've been brought in to a company to test their system for a third party (customer). I will need to ask the developers why they are not using logrotate to achieve what they want, because they use it to manage a single system log, but not for these data logs, for which they have written their own routine to archive the logs every day. As for the zip :-) I should probably have said compress or archive. The archive routine does, as you suggest, use tar -z.
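Roughly, the daily archive pass works like this (the date-named layout and the directory names here are my own sketch, not the actual routine):

```shell
#!/bin/sh
# archive_old_dirs: tar+gzip date-named subdirectories older than a day,
# then remove the originals. The layout is a guess at the real setup.
archive_old_dirs() {
    logdir=$1
    ( cd "$logdir" || exit 1
      find . -maxdepth 1 -type d -name '20*' -daystart -mtime +1 -print |
      while read -r d; do
          name=${d#./}
          tar -czf "$name.tar.gz" "$name" && rm -rf "$name"
      done )
}

# Example run against a throwaway directory:
tmp=$(mktemp -d)
mkdir "$tmp/2024-01-01"
touch -d "4 days ago" "$tmp/2024-01-01"   # backdate so find matches it
archive_old_dirs "$tmp"
```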
You could use the find command to locate the older files and then delete them.
find /var/log/backups -daystart -mtime +2 -execdir rm '{}' \; 2>/dev/null
Can you explain what the \; does after the rm '{}' and what the 2>/dev/null does?
I think I understand the rest of the find command, but I cannot make it work. I created a number of files, used touch to change the modification date, and then ran the find command as above, but it did nothing. Am I doing something wrong?
The "\;" marks the end of the -exec (here, -execdir) argument of the find command. The 2>/dev/null redirects stderr to /dev/null; you want that in a cron job because it doesn't have access to a console.
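As for it "doing nothing": plain touch sets the mtime to right now, which may be why nothing matched; you need touch -d (or -t) to backdate a file when testing. A quick sketch against throwaway files (the paths here are scratch, not your real /var/log/backups):

```shell
# Throwaway test of the find command above.
tmpdir=$(mktemp -d)
touch -d "4 days ago" "$tmpdir/old.log"   # backdate; plain touch sets mtime to "now"
touch "$tmpdir/new.log"
# -daystart goes before -mtime so it affects that test
find "$tmpdir" -daystart -type f -mtime +2 -execdir rm '{}' \; 2>/dev/null
ls "$tmpdir"    # only new.log should remain
```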
I haven't really studied logrotate that much, but using a postrotate script might be useful in your situation. If you copy the compressed rotated files to a different location (maybe even a remote share), you can do that in such a script, and even delete the aged files there. If the files are copied to a remote computer, it may be more "proper" to let that remote host take care of deleting its aged logs.