Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!


Old 11-18-2008, 06:25 AM   #1
Registered: Aug 2006
Posts: 168

Rep: Reputation: 15
walking a file path and compressing files (except for today)

How would I go about using a bash script to walk a few file paths and compress any files that are not already compressed? Maybe run a daily cron job and archive all logs except for today's.

file structure as follows:


I can walk the path (listing every subdirectory) as follows:

find /var/log/hosts -mindepth 1 -type d

Old 11-18-2008, 06:37 AM   #2
Gentoo support team
Registered: May 2008
Location: Lucena, Córdoba (Spain)
Distribution: Gentoo
Posts: 4,074

Rep: Reputation: 387
You can check whether a file is compressed in a number of ways, ranging from the easiest check on the file extension to inspecting the output of the command file, and many others. You can also instruct find to find only files newer (or older) than a given date. Check -mtime, -ctime and -atime.
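For example, a minimal sketch (assuming GNU find, and that compressed logs carry a .gz extension) that lists files last modified more than a day ago which aren't already compressed:

```shell
#!/bin/sh
# List regular files modified more than 24 hours ago
# that do not already have a .gz extension.
find /var/log/hosts -type f -mtime +0 ! -name '*.gz'

# Alternatively, inspect the contents with file(1) instead of the name:
find /var/log/hosts -type f -mtime +0 | while read -r f; do
    file "$f" | grep -q 'gzip compressed' || echo "$f"
done
```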

I would also check logrotate. It's what most people use and it's designed exactly for this task.
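For reference, a minimal logrotate sketch (the paths and rotation count here are hypothetical; adjust to your layout). Note that delaycompress keeps the most recently rotated log uncompressed, which covers the "except for today" requirement:

```
# /etc/logrotate.d/hosts  (hypothetical example)
/var/log/hosts/*/*.log {
    daily
    rotate 30
    compress
    delaycompress
    missingok
    notifempty
}
```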
Old 11-27-2008, 12:17 AM   #3
Registered: Aug 2006
Posts: 168

Original Poster
Rep: Reputation: 15
Thanks for the response.

The following will produce a list of all files:

find /var/log/hosts/* -type f > files.tmp

I guess I could send this to a tmp file as files.tmp, then pipe each name to the gzip command?

gzip "$filename"

I just need each file replaced in the same location with the compressed version. Nothing fancy.
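Since gzip already replaces each file in place (foo becomes foo.gz in the same directory), the temp file isn't strictly needed; a minimal sketch, assuming GNU find:

```shell
#!/bin/sh
# Compress every regular file that isn't already gzipped,
# replacing each one in place with its .gz version.
find /var/log/hosts -type f ! -name '*.gz' -exec gzip {} +
```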

Please forgive me as I am a newbie.

Old 12-08-2008, 12:04 AM   #4
Registered: Aug 2006
Posts: 168

Original Poster
Rep: Reputation: 15
I have managed to generate a file list of paths to the files I need to compress. Now I need to step through each path and filename, and compress the file.

The 2nd loop does not work yet, so it's just an outline of what I am trying to do.

mkdir -p ./tmp

# Build a list of files to compress
for path in `find /var/log/hosts/*/2008/11/* -type f`; do
        echo "$path" >> ./tmp/path.tmp
done

# Extract directory path
cut -d"/" -f1-8 ./tmp/path.tmp > ./tmp/path1.tmp
# Extract filename
cut -d"/" -f9 ./tmp/path.tmp > ./tmp/file.tmp

# Loop over each directory/filename pair
paste ./tmp/path1.tmp ./tmp/file.tmp | while read -r path1 file; do
        # Change into directory
        cd "$path1" || continue
        # Tar file and gzip
        tar cf - "$file" | gzip -c > "$file.tgz"
        # Change back to working directory
        cd /home/hanzo/bin/
done


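For what it's worth, the temp files can be dropped entirely; a safer sketch (assuming GNU find's -print0) that does the dirname/basename split in a single loop and survives paths containing spaces:

```shell
#!/bin/bash
# Walk the November 2008 tree and tar+gzip each file next to the
# original; -print0 with read -d '' keeps odd filenames intact.
find /var/log/hosts/*/2008/11 -type f ! -name '*.tgz' -print0 |
while IFS= read -r -d '' path; do
    dir=$(dirname "$path")
    file=$(basename "$path")
    (cd "$dir" && tar cf - "$file" | gzip -c > "$file.tgz")
done
```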

