LinuxQuestions.org

oldyankee 03-20-2012 10:27 AM

Need Scripting Help To Manage Old Logs
 
I need to purge old WebSphere log files that are a certain number of days old (or older) and that have the date appended to the end of the filenames, e.g.,
WASAppName.log.2012-03-20
WASApp2_log4j.log.2012-02-15
.
.
I know in a C program I could use something like the IsDate() function to test the last 10 characters of each filename to make sure they form a valid date before deleting the file.

I know I can use the find command to filter out files that are older than 10 days:

find /logdir -mtime +10 -exec ls -l {} \;

I have piped the output of this find command to a file, which gives me a nice list of all the log files (including their fully qualified paths) in the specified directory that are more than 10 days old. Unfortunately, that list also contains the current log file that WAS has open and is writing to, and I certainly don't want to try to delete that one.

At this point, I am thinking that perhaps awk and/or sed might be used to
(1) parse this file
(2) read each entry
(3) verify that the last 10 characters constitute a valid date
(4) delete the file if (3) is true
(5) go get next entry

I am not a sed or awk knowledgeable person and would appreciate any help or suggestions anyone might have to offer. (Including alternative techniques for purging all archived logs that are older than 10 days.)
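
To show the sort of thing I'm picturing (completely untested on our servers, and /logdir is just a stand-in for the real path), something like this would only match names that end in a YYYY-MM-DD style suffix, so the live log that WAS has open should never be touched:

find /logdir -type f -name '*.log.[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]' -mtime +10 -exec rm -f {} \;

Of course that pattern only checks the shape of the suffix, not that it's a real calendar date, so I don't know if it's good enough.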

TB0ne 03-20-2012 10:40 AM

Quote:

Originally Posted by oldyankee (Post 4631649)
I need to purge old WebSphere log files that are a certain number of days old (or older) and that have the date appended to the end of the filenames, e.g.,
WASAppName.log.2012-03-20
WASApp2_log4j.log.2012-02-15
.
.
I know in a C program I could use something like the IsDate() function to test the last 10 characters of each filename to make sure they form a valid date before deleting the file. I know I can use the find command to filter out files that are older than 10 days:

find /logdir -mtime +10 -exec ls -l {} \;

I have piped the output of this find command to a file, which gives me a nice list of all the log files (including their fully qualified paths) in the specified directory that are more than 10 days old. Unfortunately, that list also contains the current log file that WAS has open and is writing to, and I certainly don't want to try to delete that one.

At this point, I am thinking that perhaps awk and/or sed might be used to
(1) parse this file
(2) read each entry
(3) verify that the last 10 characters constitute a valid date
(4) delete the file if (3) is true
(5) go get next entry

I am not a sed or awk knowledgeable person and would appreciate any help or suggestions anyone might have to offer. (Including alternative techniques for purging all archived logs that are older than 10 days.)

...or you can just use logrotate and/or logwatch, and have them do what you want. No need to reinvent the wheel. You don't say what version/distro of Linux you're using, but if those utilities aren't installed already, you can probably install them from your online repositories.

oldyankee 03-20-2012 03:10 PM

Sorry - should have said RHEL AS 6 and SuSe Ent 9.2 (some of both).

I am new here and the folks here from long before I arrived tell me they tried logrotate but ran into all kinds of problems and inconsistencies. Also WAS is doing some of the log rotations through its built-in mechanisms and many logs are created and backed up by multiple apps that simply close the current log and rename it each day and open a new one (yeah - they should purge old ones from the app, but that's a can 'o worms that the new guy does not want to open as yet). Wouldn't "wires be crossed" if I were to simply unleash logrotate on these same log files? I guess I figured writing a purge command and executing it via cron would be the most friendly approach for this environment.
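
The kind of crontab entry I had in mind would be something along these lines (purge-was-logs.sh is just a made-up name for whatever the purge script ends up being called):

# run the purge at 02:00 every day and keep a record of what it did
0 2 * * * /usr/local/bin/purge-was-logs.sh >> /var/log/purge-was-logs.out 2>&1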

TB0ne 03-20-2012 03:35 PM

Quote:

Originally Posted by oldyankee (Post 4631876)
Sorry - should have said RHEL AS 6 and SuSe Ent 9.2 (some of both).

I am new here and the folks here from long before I arrived tell me they tried logrotate but ran into all kinds of problems and inconsistencies. Also WAS is doing some of the log rotations through its built-in mechanisms and many logs are created and backed up by multiple apps that simply close the current log and rename it each day and open a new one (yeah - they should purge old ones from the app, but that's a can 'o worms that the new guy does not want to open as yet). Wouldn't "wires be crossed" if I were to simply unleash logrotate on these same log files? I guess I figured writing a purge command and executing it via cron would be the most friendly approach for this environment.

Well, I'd find out first what "they" tried in the past, and what problems they encountered. Could very well be something wasn't configured correctly, and they put the blame on the software. Logrotate can be simple or VERY complex, but it sounds like it should do exactly what you want.

chrism01 03-20-2012 05:53 PM

Yeah, I'd go with logrotate; it's a very solid tool and you shouldn't have any 'issues'.
The only thing you need to worry about (and this is true even if you hand-code something) is not interfering with another process (even the APP itself) if it's already doing something with those logs.
You just need to be clear what (if anything) that is.

FYI: logrotate is part of the default install of RHEL; probably Suse too ...
Easiest thing is to look at the current logrotate settings (RHEL):

/etc/logrotate.conf
/etc/logrotate.d

and check the man page here http://linux.die.net/man/8/logrotate
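
As a very rough idea only (the path and counts are made up, and you'd want to be sure it doesn't fight WAS's own rotation), a drop-in under /etc/logrotate.d might look something like:

/logdir/*.log {
    daily
    rotate 10
    missingok
    notifempty
    compress
    # remove rotated copies older than 10 days
    maxage 10
}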

Tinkster 03-20-2012 06:37 PM

What the others said: logrotate is a great tool, and fairly customizable; and as
mentioned above, the only potential problem is long-running processes that don't
release/rotate/chunk their own logs.

You mentioned a CURRENT log that had an mtime of 10+ days (which is kind of weird;
I've never known WAS not to be garrulous), so that may not be a good candidate for
rotation via logrotate.
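
If you do end up scripting the purge, one belt-and-braces check (assuming lsof is installed on those boxes) is to skip anything a process still has open, e.g.:

# delete the file only if nothing currently has it open
lsof "$f" > /dev/null 2>&1 || rm -f "$f"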


Cheers,
Tink

arashi256 03-20-2012 07:01 PM

I wrote this. Works for me and for any file location. I don't know if logrotate could do the same thing, but it was more fun to write this :D Hope it helps!

Code:

#!/bin/bash
# Delete files under a given directory that are older than a given number of days.
# Usage: file-purge.sh <days-to-keep> <file-directory> <temp-directory>
DAYS_TO_KEEP=$1
FILE_DIRECTORY=$2
TEMP_DIRECTORY=$3
echo -e "--------------------------------------------------\n"
echo "Days of files to keep: $DAYS_TO_KEEP day(s)"
echo "File directory location: $FILE_DIRECTORY"
echo "Temp directory: $TEMP_DIRECTORY"
# Build the list of files older than the threshold (one path per line)
find "${FILE_DIRECTORY}" -type f -mtime +"${DAYS_TO_KEEP}" > "${TEMP_DIRECTORY}/OLDER-FILES.tmp"
FILE_COUNT=$(wc -l < "${TEMP_DIRECTORY}/OLDER-FILES.tmp")
echo "File count over threshold: ${FILE_COUNT}"
DELETED_COUNT=0
if [ "$FILE_COUNT" -gt 0 ]; then
    echo "Old log files to delete: ${FILE_COUNT}"
    # Read the list line by line so filenames with spaces survive
    while IFS= read -r filename
    do
        rm -f "${filename}"
        DELETED_COUNT=$((DELETED_COUNT + 1))
        echo "Old log file ${DELETED_COUNT}/${FILE_COUNT} ${filename} - deleted."
    done < "${TEMP_DIRECTORY}/OLDER-FILES.tmp"
else
    echo "No log files older than ${DAYS_TO_KEEP} days. Exiting..."
fi
rm -f "${TEMP_DIRECTORY}/OLDER-FILES.tmp"
echo "Done - ${DELETED_COUNT} file(s) were deleted"
echo -e "--------------------------------------------------\n"

It's very simple to use. Assuming I've saved the above code as "file-purge.sh" and I want to delete files that are over 93 days old, I'd call it with the following parameters:

Code:

file-purge.sh 93 [File directory to examine] [a temporary directory to use like $HOME/tmp]

Obviously, if logrotate can do this for any arbitrary location of files, I've probably wasted my time but hey...:)

oldyankee 03-22-2012 05:46 PM

Anytime anyone writes code for whatever reason - they are NOT wasting their time. They are exercising their brain and helping to stave off Alzheimer's in the future!
Kudos.

Tinkster 03-22-2012 07:00 PM

Indeed. But the caveat re the open log file with an mtime of 10+ days remains with
that script, just as it does with logrotate.

