Hi,
I have a root server (Debian Lenny, 64-bit) with a few domains running on it. Every Sunday I run a full backup of all domain files on the hard disk.
But why is the "compressed" tar.bz2 file bigger than its contents?
The compressed tar.bz2 file from my backup script:
Code:
3.8G Jan 16 04:12 backup-20110116030001-full.tar.bz2
The extracted directory:
Code:
# du -sh backup-20110116030001-full
2.8G backup-20110116030001-full
Now I create a new tar.bz2 by archiving the extracted folder again. It is the same folder extracted from the 3.8G archive, but the new archive is only 1.9G!:
Code:
# tar -cjpf backup-20110116030001-full_NEW.tar.bz2 backup-20110116030001-full
1.9G Jan 18 10:30 backup-20110116030001-full_NEW.tar.bz2
The only difference is that my backup script uses the -T (or --files-from) parameter to read all files from a list. So first I collect the files with "find -mtime" and then I create the archive from that list (roughly as sketched below).
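For reference, a minimal sketch of what the script does; the path /var/www, the -mtime value and the list file name are placeholders here, not the exact values from my script:
Code:
#!/bin/sh
# Collect everything modified in the last 7 days into a file list
# (placeholder path and mtime value, my real script uses other values)
find /var/www -mtime -7 > /tmp/backup-filelist.txt

# Archive exactly the entries on that list: -c create, -j bzip2,
# -p permissions, -f output file, -T read names from the list
tar -cjpf backup-$(date +%Y%m%d%H%M%S)-full.tar.bz2 -T /tmp/backup-filelist.txt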
Does the -T parameter not compress? Is there an alternative solution?
thx!
kim