You could use find's -print0 option and pipe the output to `xargs -0`. Look at the options for xargs: -n (and -L) limit how many arguments go into each invocation's argv array, which keeps the command line under the kernel's size limit and prevents "Argument list too long" errors.
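A minimal sketch of that pipeline (the throwaway source and target directories here are assumptions for illustration; substitute your real paths):

```shell
# Create a disposable source tree and target directory for the demo.
src=$(mktemp -d)
dst=$(mktemp -d)
printf 'hello\n' > "$src/file with spaces.txt"
printf 'world\n' > "$src/plain.txt"

# -print0 / -0 keep filenames with spaces (or even newlines) intact;
# -n 1000 caps the number of arguments handed to each cp invocation.
find "$src" -type f -print0 | xargs -0 -n 1000 cp --target-directory="$dst"
```

Note that --target-directory is a GNU cp option; on BSD systems you would reorder the arguments with `xargs -0 -J %` instead.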
Another option, if you want to back up from a list of files, is to use the `tr` command to replace newlines with nulls and pipe the output to `xargs -0` (this assumes no filename in the list itself contains a newline):
tr '\n' '\0' <filelist | xargs -0 -n 1000 cp --target-directory="${target_directory}"
---
Also consider using tar or rsync to replicate files to a backup location. With the -g (--listed-incremental) option, tar uses a snapshot file so that only new or changed files are copied.
BACKUPDIR=/mnt/lacie/backups/
tar -C /home/sambashare/ -g timestamp.snar -cf - . | tar -C "${BACKUPDIR}" -xvf - >logfile
Suppose you have a fileserver that you save backups to. You can back up to it via ssh even if it is somewhere out on the Internet:
eval $(ssh-agent)
ssh-add
(enter your key's passphrase when prompted)
tar -C /home/sambashare/ -g timestamp.snar -cf - . | ssh user@mrbackup tar -C "${BACKUPDIR}/" -xvf - >logfile
You want to use public key authentication for this.
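Setting that up is a one-time step: generate a key pair, then install the public half on the server. A sketch (the key path is a throwaway for illustration, and `mrbackup` is the hypothetical backup host from above):

```shell
# Generate an RSA key pair in a scratch directory.
# -N '' makes it passphrase-less for the demo; for real use, set a
# passphrase and load the key with ssh-agent/ssh-add as shown above.
keydir=$(mktemp -d)
ssh-keygen -t rsa -f "$keydir/backup_key" -N '' -q

# Install the public key on the server (prompts for the account
# password one last time):
# ssh-copy-id -i "$keydir/backup_key.pub" user@mrbackup
```

After that, the tar-over-ssh pipeline above runs without any password prompt.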
Last edited by jschiwal; 07-17-2009 at 09:58 AM.