Need Scripting Help To Manage Old Logs
I need to purge old WebSphere log files that are a certain number of days old (or older) and that have the date appended to the end of the filenames, e.g.,
WASAppName.log.2012-03-20
WASApp2_log4j.log.2012-02-15

I know in a C program I could use something like an IsDate() function to test the last 10 characters of each filename to make sure it is a date before deleting the file. I know I can use the find command to filter out files that are older than 10 days:
Code:
find /logdir -mtime +10 -exec ls -l {} \;
I have piped the output of this find command to a file, which gives me a nice list of all of the log files (including their fully qualified paths) in the specified directory that are more than 10 days old. Unfortunately, that list also contains the current log file that WAS has open and is writing to, and I sure don't want to try to delete that one.

At this point, I am thinking that perhaps awk and/or sed might be used to:
(1) parse this file
(2) read each entry
(3) verify that the last 10 characters constitute a valid date
(4) delete the file if (3) is true
(5) go get the next entry

I am not a sed- or awk-knowledgeable person and would appreciate any help or suggestions anyone might have to offer (including alternative techniques for purging all archived logs that are older than 10 days).
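One way to avoid the open log entirely is to make find itself match only names that end in a date suffix; the live log (e.g. WASAppName.log) has no date appended, so it never shows up. A minimal sketch, assuming GNU find; the function name and example filenames are mine, not from the thread:

```shell
#!/bin/bash
# purge_dated_logs DIR DAYS - list rotated logs under DIR that are older
# than DAYS days AND whose names end in a YYYY-MM-DD suffix. The glob
# deliberately misses the live log, which carries no date suffix.
purge_dated_logs() {
    local dir=$1 days=$2
    find "$dir" -type f \
         -name '*.log.[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]' \
         -mtime +"$days" -print    # swap -print for -delete after a dry run
}
```

Running with `-print` first is a cheap dry run; only switch to `-delete` once the listed files are exactly the ones you expect.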
Sorry - should have said RHEL AS 6 and SuSE Enterprise 9.2 (some of both).
I am new here, and the folks who were here long before I arrived tell me they tried logrotate but ran into all kinds of problems and inconsistencies. Also, WAS is doing some of the log rotation through its built-in mechanisms, and many logs are created and backed up by multiple apps that simply close the current log, rename it each day, and open a new one (yeah, they should purge old ones from the app, but that's a can o' worms the new guy does not want to open as yet). Wouldn't "wires be crossed" if I were to simply unleash logrotate on these same log files? I guess I figured writing a purge command and executing it via cron would be the most friendly approach for this environment.
Yeah, I'd go with logrotate; it's a very solid tool and you shouldn't have any 'issues'.
The only thing you need to worry about (and this is true even if you hand-code something) is not interfering with another process (even the app itself) if it's already doing something with those logs. You just need to be clear what (if anything) that is.

FYI: logrotate is part of the default install of RHEL, and probably SuSE too. The easiest thing is to look at the current logrotate settings (RHEL):
/etc/logrotate.conf
/etc/logrotate.d
and check the man page here: http://linux.die.net/man/8/logrotate
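For comparison, a drop-in under /etc/logrotate.d for one of these logs might look like the stanza below. The path and retention are assumptions for illustration; the directives are standard logrotate options, and copytruncate is the one that matters when an app (like WAS) keeps its current log open:

```
# Hypothetical /etc/logrotate.d/was-app -- only for logs WAS itself
# is NOT already rotating.
/logdir/WASAppName.log {
    daily
    rotate 10          # keep 10 rotations (~10 days at daily)
    missingok
    notifempty
    compress
    delaycompress      # leave the newest rotation uncompressed
    copytruncate       # rotate without forcing the app to reopen the file
    dateext            # suffix rotated files with the date
}
```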
What the others said: logrotate is a great tool, and fairly customizable. And as mentioned above, the only potential problem is long-running processes that don't release/rotate/chunk their own logs. You mentioned a CURRENT log that had an mtime of 10+ days (which is kind of weird; I've never known WAS not to be garrulous), so that may not be a good candidate for rotation via logrotate.

Cheers,
Tink
I wrote this. Works for me and for any file location. I don't know if logrotate could do the same thing, but it was more fun to write this :D Hope it helps!
Code:
#!/bin/bash
# (rest of script body lost from the original post)
Usage:
Code:
file-purge.sh 93 [File directory to examine] [a temporary directory to use, like $HOME/tmp]
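Since the body of file-purge.sh didn't survive above, here is a sketch in the same spirit, implementing the original poster's IsDate() idea: only delete a file if its last 10 characters round-trip through GNU `date -d` as a real YYYY-MM-DD date. The function name and layout are mine, not the original script's:

```shell
#!/bin/bash
# purge_dated DIR [DAYS] - echo (or delete) files under DIR older than
# DAYS days (default 10) whose last 10 characters form a real calendar
# date. Illustrative sketch; requires bash and GNU date/find.
purge_dated() {
    local dir=$1 days=${2:-10} f suffix
    find "$dir" -type f -mtime +"$days" -print0 |
    while IFS= read -r -d '' f; do
        suffix=${f: -10}                    # last 10 chars of the path
        # GNU date -d both parses and round-trips a valid YYYY-MM-DD;
        # anything else fails the comparison and the file is left alone.
        if [ "$(date -d "$suffix" '+%Y-%m-%d' 2>/dev/null)" = "$suffix" ]; then
            echo "purging $f"               # dry run: swap echo for rm -f
        fi
    done
}
```

The round-trip comparison is the key trick: `date -d` will happily parse partial strings, but only a genuine YYYY-MM-DD suffix formats back to exactly itself, so the open log (no date suffix) is never touched even if its mtime is old.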
Anytime anyone writes code, for whatever reason, they are NOT wasting their time. They are exercising their brain and helping to stave off Alzheimer's in the future!

Kudos.
Indeed. But the caveat about the open log file with an mtime of 10+ days remains with that script, just as it does with logrotate.