LinuxQuestions.org
Old 03-06-2010, 03:40 AM   #1
salimshahzad
Member
 
Registered: Dec 2009
Posts: 200

Rep: Reputation: 15
Test data transfer from Linux box to Linux box


Dear gurus,

I have built a new test machine and need to bring data over from the live machine. The data is mostly flat files, plus data from a proprietary application (the Axigen mail server).

What I am struggling with now is which commands to use. There is a 1 Mbps wireless link between the live and test machines at the remote site, and approximately 14 GB of .tar.gz files need to be moved daily.

I have found scp, rcp, rsync, sftp, etc. Which is the fastest way to replicate or copy to the remote machine?

The data is in /var/opt/application on the live machine and in the same directory, /var/opt/application, on the remote one.

I tried scp; it takes approximately 8-10 hours to copy a single 14 GB file.

If possible, where can I see the logs/results of such commands, in case an error or disconnection interrupts the copy?

Regards,
Salim
 
Old 03-06-2010, 11:30 AM   #2
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack, CentOS
Posts: 18,811

Rep: Reputation: 4190
Quote:
Originally Posted by salimshahzad View Post
Dear gurus,

I have built a new test machine and need to bring data over from the live machine. The data is mostly flat files, plus data from a proprietary application (the Axigen mail server).
Again, your posts are very hard to read and understand.

First, what kind of data are you moving? All we know is "mostly flat files" and "some proprietary" data (assuming you mean the data, and not a copy of the whole Axigen mail server itself, as you say).
Quote:
What I am struggling with now is which commands to use. There is a 1 Mbps wireless link between the live and test machines at the remote site, and approximately 14 GB of .tar.gz files need to be moved daily.
Which commands to use? Well, again, it depends on the data type. If it compresses tightly, I'd pipe the tar/gzip output straight over ssh, so you're transferring the compressed data rather than the full 14 GB. Good compression can cut transfer time considerably. If the data won't compress well, sending it uncompressed takes less effort.
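A minimal sketch of that "compress while sending" pipeline, using a throwaway directory instead of /var/opt/application so it runs anywhere; the host name "testbox" in the comment is a placeholder:

```shell
#!/bin/sh
# Stream a gzip-compressed tar to stdout instead of writing a 14 GB
# archive first and copying it afterwards.
set -e
src=$(mktemp -d)
echo "sample data" > "$src/file.txt"

# Locally we capture the stream in a file; over the network you would
# pipe it into ssh instead, e.g.:
#   tar czf - -C "$src" . | ssh testbox 'cat > /tmp/daily.tar.gz'
tar czf - -C "$src" . > /tmp/demo.tar.gz

# List the archive contents to confirm the stream arrived intact.
tar tzf /tmp/demo.tar.gz
```

The point of `tar czf -` is that nothing uncompressed ever touches the wire or needs staging space on either side.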
Quote:
I have found scp, rcp, rsync, sftp, etc. Which is the fastest way to replicate or copy to the remote machine?
The best way to find out which is fastest is to try them all and time them. Get a test data set and transfer it with each method.
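A sketch of such a timing harness. A local cp stands in for the network copy so the harness itself is runnable; for each real run you would swap in e.g. `scp /tmp/lq-demo.dat testbox:/tmp/` ("testbox" and the file names are placeholders):

```shell
#!/bin/sh
# Create one fixed test file, then wrap each candidate transfer in the
# same wall-clock timer so the numbers are comparable.
set -e
dd if=/dev/zero of=/tmp/lq-demo.dat bs=1M count=10 2>/dev/null

start=$(date +%s)
cp /tmp/lq-demo.dat /tmp/lq-demo.copy    # swap in scp / rsync / sftp here
end=$(date +%s)
echo "transfer took $((end - start)) seconds"
```

Use the same file, the same hosts, and a similar time of day for every method, otherwise the comparison measures the network's mood rather than the tool.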
Quote:
The data is in /var/opt/application on the live machine and in the same directory, /var/opt/application, on the remote one.
The directory it's in doesn't matter a bit.
Quote:
I tried scp; it takes approximately 8-10 hours to copy a single 14 GB file.
With no information about the network between the boxes, that's either good or bad. If you've got a calculator, you can easily figure out which it is.
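The back-of-envelope arithmetic, using the figures from the thread (and taking 1 GB as 1000 MB for roundness): 14 GB over a theoretical 1 Mbps link needs about 31 hours, so an 8-10 hour scp actually implies the link is delivering more than its nominal 1 Mbps.

```shell
#!/bin/sh
# Theoretical best-case transfer time for 14 GB over a 1 Mbps link.
size_gb=14
link_mbps=1
megabits=$((size_gb * 1000 * 8))   # 14 GB -> 112000 megabits
seconds=$((megabits / link_mbps))
hours=$((seconds / 3600))
echo "theoretical best case: ~${hours} hours"   # ~31 hours
```

Real throughput will be lower still once protocol overhead and wireless retransmits are counted, which is why measuring (as above) beats estimating.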
Quote:
If possible, where can I see the logs/results of such commands, in case an error or disconnection interrupts the copy?

Regards,
Salim
That depends on how you run the copy job. If it's via cron, are you redirecting the output to a specific file? You will only see any disconnect or error messages if you redirect everything, stderr included. If you're running it on a console, you'll see the errors there.
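A hypothetical crontab entry illustrating that redirection (the script name and log path are examples, not anything from the thread):

```shell
# Run the nightly copy at 01:00 and append both stdout and stderr
# (2>&1) to a log file you can inspect the next morning.
0 1 * * * /usr/local/bin/nightly-sync.sh >> /var/log/nightly-sync.log 2>&1
```

Without the `2>&1`, error messages go to stderr and are mailed to the cron owner (or lost) instead of landing in the log.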
 
Old 03-06-2010, 02:06 PM   #3
schneidz
LQ Guru
 
Registered: May 2005
Location: boston, usa
Distribution: fc-15/ fc-20-live-usb/ aix
Posts: 5,111

Rep: Reputation: 874
Quote:
Originally Posted by salimshahzad View Post
... there is a 1 Mbps wireless link between the live and test machines at the remote site, and approximately 14 GB of .tar.gz files need to be moved daily.

I have found scp, rcp, rsync, sftp, etc. Which is the fastest way to replicate or copy to the remote machine...
Won't 1 Mbps be 1 Mbps no matter what protocol you use?
 
Old 03-06-2010, 05:08 PM   #4
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack, CentOS
Posts: 18,811

Rep: Reputation: 4190
Quote:
Originally Posted by schneidz View Post
Won't 1 Mbps be 1 Mbps no matter what protocol you use?
Yes, but some protocols come closer to using that full limit than others; some have more overhead.
 