Linux - Newbie: This Linux forum is for members who are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-tos, this is the place!
I have a server where a program logs the access count in one log format, creating a new log file each day.
The task is to run a script that extracts the hit count from each day's log.
audit.log
audit.log_date.gz
audit.log_2ndday.gz
These are the naming patterns of the log files. From each log (for the last 7 days) I need to extract the count and combine the results.
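The task described above can be sketched roughly as follows. This is only a sketch under assumptions: the file names follow the audit.log / audit.log_*.gz patterns from the post, and every line in a log counts as one hit.

```shell
#!/bin/sh
# Sum the line counts of the current log and the rotated .gz logs.
# Assumes the name patterns from the post; adjust the glob as needed.
total=0
for f in audit.log audit.log_*.gz; do
    [ -e "$f" ] || continue
    case "$f" in
        *.gz) count=$(zcat "$f" | wc -l) ;;   # compressed logs
        *)    count=$(wc -l < "$f") ;;        # plain-text log
    esac
    echo "$f: $count"
    total=$((total + count))
done
echo "Total hits: $total"
```

Limiting it to the last seven days would then be a matter of selecting which files the loop sees, as discussed further down the thread.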
Ok. The urgency is on your side, not on the rest of us who volunteer here.
Either perl or awk will be a good match for your task.
However, what have you tried so far and where are you stuck? We can walk you through the hard parts. Please provide samples of your code and a few lines of sample input.
Why zcat? Have you actually tried that? audit.log would not appear to be zipped (or archived).
And what's wrong with that approach? If that works, you just have to count the lines, no?
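If not every line in the log is a hit, counting lines with wc -l overcounts; grep -c with a pattern counts only matching lines. The pattern "GET" below is a placeholder assumption, since the post never shows the log format:

```shell
# Count only lines matching a pattern; "GET" is a hypothetical
# example, not known from the post.
zcat audit.log_date.gz | grep -c 'GET'
```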
Yes, understood. It would be helpful to get an answer ASAP; that is what I meant.
I am not sure how to write the script because of how each day's log is generated, as shown above.
Thanks for your response.
Some of the files are .gz; zcat is used for monthly report generation. Any log file more than 10 days old is converted to .gz.
You can only use zcat on gzipped files and cat on plain text.
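One detail worth knowing here: GNU zcat with the -f option passes non-gzipped files through unchanged, so a single command can cover both the plain audit.log and the rotated .gz files:

```shell
# zcat -f (equivalent to gzip -cdf) decompresses .gz files and
# copies plain-text files through as-is, so the mix doesn't matter.
zcat -f audit.log audit.log_*.gz | wc -l
```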
But about the main question: do you have a list of files that include a date in the name, from which you would like to get the latest seven?
If each file's time stamp matches its name, the utility ls has the options -t and -r, which might be useful combined with head.
Otherwise you will have to parse the name, in awk or perl.
Again, what have you tried and where are you stuck?
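Assuming the files' modification times do match their dates, the ls -t / head idea above can be sketched like this:

```shell
# List matching logs newest-first by modification time and keep
# the seven most recent. Caveat: parsing ls output breaks on
# filenames containing newlines, which log names normally lack.
ls -t audit.log* | head -n 7
```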
Last edited by Turbocapitalist; 02-18-2019 at 04:42 AM.