LinuxQuestions.org > Forums > Linux Forums > Linux - Newbie
Old 09-24-2015, 10:15 PM   #1
Defested
LQ Newbie
 
Registered: Mar 2015
Posts: 9

Rep: Reputation: Disabled
Backup script with built in compression while maintaining folder structure


I am trying to script a backup of some of my media drives. I want this backup to maintain the same folder structure as the source, but gzip any files that are larger than 50MB.

The method I am using now is something along the lines of
Code:
 rsync -r --progress --update /test/ /test2/
find /test2/ -type f -size +50M -exec gzip {} +
However, this requires the full uncompressed size of the files on the destination up front, with the space only reclaimed after compression. I am looking for a way for the files to be compressed in transit, while at the same time having rsync only copy the updated files.
 
Old 09-25-2015, 05:32 AM   #2
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by Defested View Post
I am looking for a way for the files to be compressed in transit, while at the same time having rsync only copy the updated files.
That's not how rsync works, or is supposed to work. If you still want to use rsync then IMHO your only option is to use a remote file system that performs transparent compression, like Btrfs, or something FUSE-based like zfs-fuse or FuseCompress. Or, better performance- and support-wise: a remote OS that provides ZFS natively.
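For what it's worth, the transparent-compression route looks roughly like this. This is a sketch only: /dev/sdX and the mount point are placeholders, and mkfs.btrfs destroys whatever is on the device.

Code:
 # Format a spare partition as Btrfs (DESTROYS existing data on /dev/sdX)
mkfs.btrfs /dev/sdX
# Mount with transparent compression (zlib or lzo on kernels of this era)
mount -o compress=zlib /dev/sdX /mnt/backup
# rsync exactly as before; the filesystem compresses blocks as they land
rsync -r --progress --update /test/ /mnt/backup/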
 
Old 09-25-2015, 09:30 PM   #3
Defested
LQ Newbie
 
Registered: Mar 2015
Posts: 9

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by unSpawn View Post
That's not how rsync works, or is supposed to work. If you still want to use rsync then IMHO your only option is to use a remote file system that performs transparent compression, like Btrfs, or something FUSE-based like zfs-fuse or FuseCompress. Or, better performance- and support-wise: a remote OS that provides ZFS natively.
If I didn't use rsync, would there be another way?
 
Old 09-26-2015, 01:13 AM   #4
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by Defested View Post
If I didn't use rsync, would there be another way?
Possibly, but not one I know of or could even advise you about.
 
Old 09-26-2015, 02:16 AM   #5
syg00
LQ Veteran
 
Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 21,128

Rep: Reputation: 4120
Have a read of https://wiki.archlinux.org/index.php/Backup_programs - more than you probably want to know.
 
Old 09-26-2015, 05:25 PM   #6
jmgibson1981
Senior Member
 
Registered: Jun 2015
Location: Tucson, AZ USA
Distribution: Debian
Posts: 1,141

Rep: Reputation: 392
Can I suggest incremental backups instead of trying to gzip large files? rsync's --link-dest uses hard links, so each file is only backed up once unless it changes. This is from my manager script; I run it every hour on a number of directories on my laptop, backing up to my server. With wifi at both ends it takes about 10 seconds to do the "backup" of my 2GB /home/$USER folder.

Code:
data_backup() { # versioned backups: $1 = name of backed-up folder | $2 = /path/to/folder being backed up
	# Expects these globals: SERVER (backup host), DATABACKUP (remote backup
	# root), RSYNCOPT (rsync options, e.g. "a"), NOW (timestamp), DATADAYS
	# (days of snapshots to keep)
	if ping -c 1 "$SERVER" ; then	# only run if the server is reachable
		if [[ -e "$2" ]] ; then
			if ssh "$USER"@"$SERVER" "[[ -e ${DATABACKUP}/${1}/current ]]" ; then
				# hard-link unchanged files against the previous snapshot
				rsync -"$RSYNCOPT" --link-dest="$DATABACKUP"/"$1"/current "$2" "$USER"@"$SERVER":"$DATABACKUP"/"$1"/"$1"."$NOW"
				ssh "$USER"@"$SERVER" "rm -f ${DATABACKUP}/${1}/current"
			else
				# first run: full copy
				rsync -"$RSYNCOPT" "$2" "$USER"@"$SERVER":"$DATABACKUP"/"$1"/"$1"."$NOW"
			fi
			# point "current" at the snapshot just taken
			ssh "$USER"@"$SERVER" "ln -s ${DATABACKUP}/${1}/${1}.${NOW} ${DATABACKUP}/${1}/current"
			# prune snapshots older than DATADAYS days
			ssh "$USER"@"$SERVER" "find ${DATABACKUP}/${1}/ -type d -mtime +${DATADAYS} -exec rm -fr {} \;"
		fi
	fi
}
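The --link-dest trick above can be seen locally, without a server. A throwaway sketch (the temp paths are made up on the spot):

```shell
# Two snapshots of an unchanged tree: the second stores hard links,
# not copies, so it costs (almost) no extra space.
src=$(mktemp -d); bk=$(mktemp -d)
echo v1 > "$src/file.txt"
rsync -a "$src/" "$bk/snap1/"
# Unchanged files in snap2 become hard links into snap1
rsync -a --link-dest="$bk/snap1" "$src/" "$bk/snap2/"
# The same inode number in both snapshots confirms the hard link
stat -c '%i' "$bk/snap1/file.txt" "$bk/snap2/file.txt"
```

The -a flag matters here: --link-dest only hard-links files whose size and timestamp match, so attributes must be preserved on the first copy.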

Last edited by jmgibson1981; 09-26-2015 at 05:26 PM.
 
  



Tags
backup, gzip, rsync






