LinuxQuestions.org
Old 02-02-2005, 11:48 PM   #1
hk_linux
Member
 
Registered: Nov 2004
Location: India
Distribution: RedHat, PCQLinux, Fedora
Posts: 95

Rep: Reputation: 15
which is better- rsyncing number of files OR taking a tar and rsyncing it


hi all,
I need to back up a large number of files (nearly 100) from a server. The total size of the files will be around 150 MB.
Which would be the better method: rsyncing the files individually, or taking a zip of the files at the source and then rsyncing that?
Will there be any performance issues when a large number of files are rsynced? I believe that during the initial handshake the list of files is exchanged first, and only then are the actual files transferred.
Please note that this is not an incremental backup. Every time, I need to get all of those files to a new location.

Any help is appreciated.

TIA.
 
Old 02-03-2005, 02:20 AM   #2
korozion
Member
 
Registered: Apr 2004
Location: Canada
Distribution: Debian
Posts: 124

Rep: Reputation: 15
I (personally) would just gzip them (or tar and gzip) and scp them to where I wanted them. You can copy them however you like, but compressing them first makes the copy faster.
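A minimal sketch of that approach; the directory, archive name, and remote host below are placeholders, not from the thread:

```shell
# Stand-in source directory with one file, so the sketch is self-contained
mkdir -p /tmp/demo_src
printf 'log line\n' > /tmp/demo_src/app.log

# Bundle everything into a single compressed archive
tar czf /tmp/demo_backup.tar.gz -C /tmp demo_src

# One transfer instead of ~100 (hypothetical host and path):
# scp /tmp/demo_backup.tar.gz user@backuphost:/backups/
```

The point of the single archive is to pay the per-file connection overhead once instead of a hundred times.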
 
Old 02-03-2005, 02:52 AM   #3
hk_linux
Member
 
Registered: Nov 2004
Location: India
Distribution: RedHat, PCQLinux, Fedora
Posts: 95

Original Poster
Rep: Reputation: 15
hi korozion,
scp'ing after gzipping is probably the better option for my application. But out of curiosity, does rsync perform better if it has to transfer multiple small files or a single large file?
I should add that zipping will not reduce the size, because the files I am talking about are already zip files!
 
Old 02-03-2005, 07:23 AM   #4
looseCannon
Member
 
Registered: Dec 2003
Location: Little Rock, AR
Distribution: Fedora Core 2, AIX, HP-UX, Solaris, Whitebox
Posts: 193

Rep: Reputation: 31
Are you going to be doing this through SSH, or just through rsync by itself? The reason I ask is that if you do it through SSH, then encryption and decryption are going on, which will slow down the transfer somewhat. This may not be an issue if you are on the same network as the machine you are pulling the file(s) from.
 
Old 02-03-2005, 07:56 AM   #5
trickykid
LQ Guru
 
Registered: Jan 2001
Posts: 24,149

Rep: Reputation: 269
Use scp if you are sending just one file; rsync is good for multiple files or folders. Both, I think, have a compression option when sending, but if you use scp on an already compressed file, compression probably won't make any difference.
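On that last point, a quick local check (filenames are made up) shows why in-transit compression such as `scp -C` or `rsync -z` won't help with `.gz` files: compressing them a second time doesn't shrink them:

```shell
# Make a small text file and compress it once
printf '%s\n' "line one" "line one" "line one" > /tmp/demo.txt
gzip -c /tmp/demo.txt > /tmp/demo.txt.gz

# Compress the compressed file again: the result is no smaller
# (gzip adds header overhead to data it can no longer compress)
gzip -c /tmp/demo.txt.gz > /tmp/demo.txt.gz.gz

wc -c /tmp/demo.txt.gz /tmp/demo.txt.gz.gz
```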
 
Old 02-03-2005, 09:34 PM   #6
Jerre Cope
Member
 
Registered: Oct 2003
Location: Texas (central)
Distribution: ubuntu,Slackware,knoppix
Posts: 323

Rep: Reputation: 37
new location?

When you say "new location each time", do you really mean a new location each instance, or the same location?

The whole point of rsync is that it only transfers the portion of each file that has changed. This is not the same as an incremental backup to tape, where you need several tapes for a complete restore. Rsync simply avoids copying a file that already exists and is identical. After the initial copy, subsequent rsyncs are much faster and require far less bandwidth than copying all the files each time. You can explicitly invoke ssh as the transport for rsync, although I think the current version defaults to ssh.

I use an old 100 MHz Dell with an 80 GB IDE drive to back up my server every 6 minutes. I also maintain a weekly copy and a monthly copy. This same Dell also backs up the user directories of 3 other Linux boxes every 15 minutes.

Don't keep copying the same data over and over again--even compressed.

Don't do it.
 
Old 02-03-2005, 11:52 PM   #7
hk_linux
Member
 
Registered: Nov 2004
Location: India
Distribution: RedHat, PCQLinux, Fedora
Posts: 95

Original Poster
Rep: Reputation: 15
Jerre Cope,
I thought of rsyncing the files, burning them to a CD, and removing the files. Then the rsync process starts all over again. That's what I meant by a new location each time.
As you point out, rsync works best for incremental backups. But the files I am planning to back up are log files, which get logrotated as ".gz" files. I retain 10 such files. I am planning to take these backups every month, so there will almost always be a different set of log files each month. So it doesn't really matter whether I use rsync or scp. There are 8-9 such log files, so I will have 80-90 compressed files to back up.
So my problem boils down to whether to get those 80-90 files through rsync, or to zip those files and rsync a single file. I am asking this out of curiosity. I have already decided to transfer the 80-90 files, because zipping them would again take an equal amount of space on the server.

Loosecannon,
The machines are in the same network. I am planning to use rsync over ssh.
 
Old 02-04-2005, 09:17 AM   #8
Jerre Cope
Member
 
Registered: Oct 2003
Location: Texas (central)
Distribution: ubuntu,Slackware,knoppix
Posts: 323

Rep: Reputation: 37
time

I don't think the difference in time would be humanly discernible.

If you have user traffic on the network 24/7, where the slight delay of copying a large file would be noticed, then I would suggest using rsync daily (without the --delete option) so that you pull less across the network at a time. When you get close to 600 MB, burn the CD and remove all but the most recent files (so you don't rsync them again).
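That routine might look something like the sketch below; the paths, host, and the 600 MB threshold are illustrative, and the rsync pull is shown commented out:

```shell
STAGE=/tmp/demo_stage
mkdir -p "$STAGE"
echo "rotated log" > "$STAGE/app.1.gz"   # stand-in for a pulled file

# Daily pull, without --delete, so files removed on the server survive here:
# rsync -a user@server:/var/log/myapp/ "$STAGE/"

# Once the staging area nears CD capacity, burn it and prune old files
used_kb=$(du -sk "$STAGE" | awk '{print $1}')
if [ "$used_kb" -gt 614400 ]; then       # ~600 MB expressed in KB
    echo "time to burn a CD and remove all but the newest files"
fi
```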
 
  


