I'd recommend Dar. It consists of a binary component plus scripts to control the backups, and it handles both full and differential backups.
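To give you an idea, a full backup followed later by a differential looks roughly like this (option letters are from the version I've used, and the paths are just placeholders, so check the man page for your version):

  # full backup of / into /mnt/backup/full_backup.1.dar
  dar -c /mnt/backup/full_backup -R / -z

  # differential backup: -A points at the full archive, so only changed files get saved
  dar -c /mnt/backup/diff_backup -R / -z -A /mnt/backup/full_backup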
Like tar, it will make one large backup if you want it to, or cut the backup into slices for writing to Zip, Jaz, CD, DVD, whatever. The 'whatever' is controlled via the scripts.
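As far as I recall, -s sets the slice size, so something like this cuts the archive into CD-sized pieces (the size and paths are only examples):

  # produces mybackup.1.dar, mybackup.2.dar, ... each around 650 MB
  dar -c /mnt/backup/mybackup -R / -s 650M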
Unlike tar, it doesn't make one large compressed backup; it makes one large backup of individually compressed files. If one file is corrupt, you risk losing only that one. In some cases, Dar's skip-ahead feature can skip over the corrupt portion of a file and restore the major portion (hopefully you can restore the missing parts from other sources). I've backed up a 7.7 GB installation to a 2.7 GB backup.
Dar allows you to specify which files to NOT compress, because they're already compressed. Such files are flagged by file extension.
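From memory, -z turns on per-file compression and -Z takes a filename mask to leave out of compression; the extensions below are only examples:

  # compress everything except files that are already compressed
  dar -c /mnt/backup/mybackup -R / -z -Z "*.gz" -Z "*.bz2" -Z "*.zip" -Z "*.jpg"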
Dar also has a list feature to show the contents of a backup. Files flagged as WORSE came out larger after compression than before; you can then edit the script to flag those extensions as do-not-compress.
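Listing is just the -l option against the archive basename (the path is a placeholder):

  # show the contents of an existing backup, including the compression results
  dar -l /mnt/backup/mybackup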
Directories whose contents you don't want backed up (such as /proc and /devfs) can be pruned: i.e., the directory structure is preserved, but the files in them aren't backed up.
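If memory serves, -P prunes a subtree (given relative to the -R root) and -D stores the pruned directories as empty entries, which gives exactly that behaviour; the directory names here are just examples:

  # keep /proc and /dev in the tree, but don't save anything inside them
  dar -c /mnt/backup/mybackup -R / -D -P proc -P dev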
If you decide to try it and have problems, feel free to PM me, and I can send you my copy of the TUTORIAL, which I've edited to include comments from the author, and commands I've tried that worked.
Dar includes a static version of itself (statically linked, so it needs no external library files) which can be included in a backup. In the event of some catastrophe, the static version can be used to run the restore.
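The static binary is normally installed as dar_static; from a rescue environment the restore would look something like this (the destination path is a placeholder):

  # extract the whole archive under /mnt/restored using the statically linked binary
  ./dar_static -x /mnt/backup/full_backup -R /mnt/restored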
You have the option of using the .duc (dar user control) script to control what gets backed up and how, or of writing your own bash script with all the options on one long line (with a \ at the end of each line to continue it).
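A home-grown script really can be that simple; here's a rough sketch, with all paths, sizes and masks as placeholders to adapt:

  #!/bin/bash
  # nightly backup: one long dar command, continued with backslashes
  dar -c /mnt/backup/nightly_$(date +%Y%m%d) \
      -R / \
      -s 650M -z \
      -Z "*.gz" -Z "*.bz2" -Z "*.jpg" \
      -D -P proc -P dev -P mnt/backup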