Old 02-17-2016, 11:16 PM   #1
Perseus
Member
 
Registered: Oct 2011
Posts: 179

Rep: Reputation: Disabled
Help to get files from last 24 hours automatically from another server


Hello to all,

I have two GNU/Linux servers, Server A and Server B.
I want to copy the files from the last 24 hours in Server A's "/Files" directory (the source server) into Server B's "/Last24hours" directory (the destination server).

I'd like to run the copy script automatically from Server B (the destination server); I want to avoid running scripts on Server A (the source server).

Is there a way to have a script on Server B that connects via SSH to Server A and copies the last 24 hours' files from "Server A:/Files"?

Manually, I can connect to Server A by running this command on Server B:

Code:
ssh user@192.168.x.x
and after that it asks for the password.

Thanks for any help with this.

Last edited by Perseus; 02-17-2016 at 11:18 PM.
 
Old 02-18-2016, 02:29 AM   #2
TenTenths
Senior Member
 
Registered: Aug 2011
Location: Dublin
Distribution: Centos 5 / 6 / 7
Posts: 3,475

Rep: Reputation: 1553
find, rsync, and ssh with key-pair authentication are all your friends.

What have you written so far? Or is this a homework question? It looks like one.
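
For the key-pair part, a minimal sketch, assuming the script will run as root on Server B and that Server A is 192.168.x.x (the key file name is just an example):

Code:
# On Server B: generate a key pair with no passphrase
ssh-keygen -t rsa -N "" -f /root/.ssh/thekey
# Install the public key on Server A so this key logs in without a password
ssh-copy-id -i /root/.ssh/thekey.pub root@192.168.x.x
# Test: this should print Server A's hostname without prompting
ssh -i /root/.ssh/thekey root@192.168.x.x 'hostname'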
 
Old 02-18-2016, 01:10 PM   #3
Perseus
Member
 
Registered: Oct 2011
Posts: 179

Original Poster
Rep: Reputation: Disabled
Hello TenTenths,

Thanks for the suggestion; it's not related to homework at all, I'm not a student.
I didn't know about the rsync command. I've been trying the commands separately, and I think I'm close.

Maybe someone could help me with my issues.

(1) ssh and find commands
With this combination of "ssh" and "find" I'm able to get the list of files from the last 24 hours on the remote server (it works):
Code:
ssh root@192.168.X.X 'find /SourceFilesInRemoteServer/ -mtime -1 -type f'
(2) rsync and ssh commands
With "rsync" in combination with "ssh" I can copy the files from the remote server's "/SourceFilesInRemoteServer" directory to the local server like this (it works):
Code:
rsync -avzhe ssh root@192.168.X.X:/SourceFilesInRemoteServer/ "/DestinationFolderInLocalServer/"
Now my two issues are:

1) How do I combine the two commands above so that "rsync" (command #2) uses the list of last-24-hour files produced by "find" (command #1)?
2) How do I make the script supply the password automatically when the ssh command asks for it?

Thanks for any help

Last edited by Perseus; 02-18-2016 at 01:11 PM.
 
Old 02-18-2016, 01:15 PM   #4
Emerson
LQ Sage
 
Registered: Nov 2004
Location: Saint Amant, Acadiana
Distribution: Gentoo ~amd64
Posts: 7,661

Rep: Reputation: Disabled
With both machines on the same LAN, just mount the source (or the destination) over NFS; then it comes down to copying from one directory to another. NFSv4 is said to be secure enough to use even over the internet, though I haven't tried that. Another option is to run an rsync server on one of the machines.
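
A minimal sketch of the NFS route, assuming Server A already exports /Files (the mount point is an example):

Code:
# On Server B: mount Server A's /Files export
mount -t nfs 192.168.x.x:/Files /mnt/serverA
# Copy files modified in the last 24 hours (this flattens any subdirectories)
find /mnt/serverA -mtime -1 -type f -exec cp {} /Last24hours/ \;
umount /mnt/serverA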
 
Old 02-18-2016, 06:32 PM   #5
Perseus
Member
 
Registered: Oct 2011
Posts: 179

Original Poster
Rep: Reputation: Disabled
Hello Emerson,

I'm not able to install NFS on the servers. I think the solution could be to use ssh, find, and rsync, given the successful tests I mentioned previously.

Maybe somebody could help me work out how to use ssh, find, and rsync to do this task.

Thanks again
 
Old 02-18-2016, 06:35 PM   #6
Emerson
LQ Sage
 
Registered: Nov 2004
Location: Saint Amant, Acadiana
Distribution: Gentoo ~amd64
Posts: 7,661

Rep: Reputation: Disabled
Usually NFS is part of the default installation; are you sure you don't have it? Anyhow, the easiest option is probably to run an rsync server; Google has many good tutorials on it.
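
A minimal sketch of the rsync server route (the module name and paths are examples, not anything from this thread):

Code:
# /etc/rsyncd.conf on Server A: expose /Files as a read-only module
[files]
    path = /Files
    read only = yes

# Start the daemon on Server A
rsync --daemon

# On Server B: pull from the module (rsync:// means daemon mode, no ssh)
rsync -avzh rsync://192.168.x.x/files/ /Last24hours/

Note that plain daemon mode is unencrypted, so it is best kept to a trusted LAN.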
 
Old 02-18-2016, 11:00 PM   #7
Perseus
Member
 
Registered: Oct 2011
Posts: 179

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by Emerson View Post
Usually NFS is part of the default installation; are you sure you don't have it? Anyhow, the easiest option is probably to run an rsync server; Google has many good tutorials on it.
Yes, NFS is not installed on the servers.

I searched the internet and did tests with rsync, and it works for me, but my problem is that I cannot combine find and rsync so that only the files from the last 24 hours are transferred from the remote server.

Best regards
 
Old 02-19-2016, 02:52 AM   #8
Stéphane Ascoët
Member
 
Registered: Feb 2004
Location: Fleury-les-Aubrais, 120 km south of Paris
Distribution: Devuan, Debian, Mandrake, Freeduc (the one I used to work on), Slackware, MacOS X
Posts: 251

Rep: Reputation: 49
A simple way, but not very clean: have find copy the files it finds into another directory, then use that directory as the source in your rsync command. Don't forget to empty it when done.
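
Something like this, driven entirely from Server B (the staging path is just an example):

Code:
# On Server A (via ssh): copy the last-24-hour files into a staging directory
ssh root@192.168.x.x 'mkdir -p /tmp/staging && find /Files -mtime -1 -type f -exec cp {} /tmp/staging/ \;'
# Pull the staging directory to Server B, then empty it
rsync -avzh -e ssh root@192.168.x.x:/tmp/staging/ /Last24hours/
ssh root@192.168.x.x 'rm -rf /tmp/staging'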
 
Old 02-19-2016, 05:15 AM   #9
TenTenths
Senior Member
 
Registered: Aug 2011
Location: Dublin
Distribution: Centos 5 / 6 / 7
Posts: 3,475

Rep: Reputation: 1553
Quote:
Originally Posted by Perseus View Post
1) How do I combine the two commands above so that "rsync" (command #2) uses the list of last-24-hour files produced by "find" (command #1)?
2) How do I make the script supply the password automatically when the ssh command asks for it?
Approach this the other way round: sort out passwordless SSH first (there's a link in my sig to a post on my blog on how to do it).

Then consider something like this:

Code:
#!/bin/bash
FILELIST=$( ssh -i /root/.ssh/thekey root@serverA 'find /path/to/source/ -mtime -1 -type f' )
for FILE in ${FILELIST} ; do
  # One rsync call per file; word splitting means names with spaces break
  /usr/bin/rsync -azvh -e "ssh -i /root/.ssh/thekey" root@serverA:${FILE} /dest/path
done

The above script is inefficient as it makes a separate rsync connection for each file in the list.


Funkier way using scp:
Code:
#!/bin/bash
FILELIST=$( ssh -i /root/.ssh/thekey root@serverA 'find /path/to/files/ -mtime -1 -type f -printf "%p "')
scp -i /root/.ssh/thekey root@serverA:"${FILELIST}" /dest/path
 
Old 02-19-2016, 05:19 AM   #10
TenTenths
Senior Member
 
Registered: Aug 2011
Location: Dublin
Distribution: Centos 5 / 6 / 7
Posts: 3,475

Rep: Reputation: 1553Reputation: 1553Reputation: 1553Reputation: 1553Reputation: 1553Reputation: 1553Reputation: 1553Reputation: 1553Reputation: 1553Reputation: 1553Reputation: 1553
Or if you really want to torture yourself:

Code:
#!/bin/bash
scp -i /root/.ssh/thekey root@serverA:"$( ssh -i /root/.ssh/thekey root@serverA 'find /path/to/files/ -mtime -1 -type f -printf "%p "')" /dest/path
Now, I'm not sure how well these will work with your source files; there's zero escaping of problematic characters in file names (spaces, etc.), but it should certainly give you a couple of things to try.
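
If spaces in file names do turn out to matter, a null-safe variant is possible; a sketch, reusing the paths and key from above:

Code:
#!/bin/bash
# Build a NUL-separated list of last-24-hour files, relative to the source dir
ssh -i /root/.ssh/thekey root@serverA \
  'cd /path/to/files && find . -mtime -1 -type f -print0' > /tmp/filelist0
# --from0 tells rsync the list entries are NUL-terminated
rsync -avzh --from0 --files-from=/tmp/filelist0 \
  -e "ssh -i /root/.ssh/thekey" root@serverA:/path/to/files/ /dest/path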
 
Old 02-23-2016, 12:21 PM   #11
Perseus
Member
 
Registered: Oct 2011
Posts: 179

Original Poster
Rep: Reputation: Disabled
Hello again TenTenths,

I was able to make it work by following your suggestion: I set up passwordless SSH login first and used your "for FILE in ${FILELIST}..." solution. It works when the files to transfer are a few MB, but the real files I need to transfer are about 5 GB, and rsync begins the transfer but stops after 2 or 3 seconds. I think it could be because of the file size.

Now, if rsync has issues with big files, I don't know whether I need to switch to an FTP script or another command-line utility that doesn't get stuck on big files.

Could you or somebody else suggest what I should do?

Thanks again for the help so far.

Regards
 
Old 02-23-2016, 12:26 PM   #12
pan64
LQ Addict
 
Registered: Mar 2012
Location: Hungary
Distribution: debian/ubuntu/suse ...
Posts: 21,842

Rep: Reputation: 7308
rsync is specifically designed to transfer "big" files, and it also works on unstable/low-quality networks.
rsync has a backup/update feature of its own that will let you transfer only the changes automatically, so you don't need to use find (if I understand it well).
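
If plain rsync fits your case, a minimal sketch (note it mirrors everything new or changed since the last run, which is not quite the same as "the last 24 hours"):

Code:
# rsync compares source and destination and only transfers files
# that are new or changed since the previous run
rsync -avzh -e ssh root@192.168.X.X:/Files/ /Last24hours/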
 
Old 02-23-2016, 12:41 PM   #13
Perseus
Member
 
Registered: Oct 2011
Posts: 179

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by pan64 View Post
rsync is specifically designed to transfer "big" files, and it also works on unstable/low-quality networks.
rsync has a backup/update feature of its own that will let you transfer only the changes automatically, so you don't need to use find (if I understand it well).
Hello Pan64,

I need to use find because I only want to copy the files from the last 24 hours, one day a week. The file names contain nothing that would tell me the date of the file, so I need to check the modification time using "find -mtime -1".

I set the --progress option on rsync, and it gets stuck after transferring less than 20 MB. I even set --max-size='10g' and it still gets stuck.

Regards
 
Old 02-23-2016, 12:53 PM   #14
pan64
LQ Addict
 
Registered: Mar 2012
Location: Hungary
Distribution: debian/ubuntu/suse ...
Posts: 21,842

Rep: Reputation: 7308
So first execute find and save the result, then tell rsync to use that file (see --files-from=) to specify what to transfer. Next, use debug/verbose flags to find out what's happening (-v, -vv, --log-file=, and --timeout).
Also check /var/log for low-level (connection-related) events on both sides. You may also try -i.
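
A minimal sketch of that sequence, run from Server B (key path and addresses reuse the examples from earlier in the thread):

Code:
#!/bin/bash
# Save the list of files modified in the last 24 hours, relative to /Files
ssh -i /root/.ssh/thekey root@192.168.X.X \
  'cd /Files && find . -mtime -1 -type f' > /tmp/filelist.txt
# Transfer exactly the listed files in one rsync connection, with a log file
rsync -avzh --files-from=/tmp/filelist.txt --log-file=/tmp/rsync.log \
  -e "ssh -i /root/.ssh/thekey" root@192.168.X.X:/Files/ /Last24hours/

Unlike the per-file loop earlier in the thread, this makes only one connection for the whole batch.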
 
Old 02-23-2016, 01:25 PM   #15
Perseus
Member
 
Registered: Oct 2011
Posts: 179

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by pan64 View Post
So first execute find and save the result, then tell rsync to use that file (see --files-from=) to specify what to transfer. Next, use debug/verbose flags to find out what's happening (-v, -vv, --log-file=, and --timeout).
Also check /var/log for low-level (connection-related) events on both sides. You may also try -i.
Hi pan64,

I've tried your suggestions, and I ran rsync for just a single file. It stopped at 127 MB (28%). I sent it this way:

Code:
rsync -avzh --progress --max-size='10G' --log-file=/tmp/TransferLog.log  --timeout=30 --rsh=ssh iu@192.168.X.X:/SourceDirectory/filexyz.log /Destination/

rhes-5.5_64-ig_v3.3.2
receiving file list ...
1 file to consider
filexyz.log
     127.22M  28%   19.44MB/s    0:00:16
io timeout after 30 seconds -- exiting
rsync error: timeout in data send/receive (code 30) at io.c(200) [receiver=3.0.6]
rsync: connection unexpectedly closed (65 bytes received so far) [generator]
rsync error: error in rsync protocol data stream (code 12) at io.c(600) [generator=3.0.6]
and the log generated contains this:
Code:
2016/02/23 13:18:25 [14597] receiving file list
2016/02/23 13:18:25 [14597] 1 file to consider
2016/02/23 13:19:05 [14599] io timeout after 30 seconds -- exiting
2016/02/23 13:19:05 [14599] rsync error: timeout in data send/receive (code 30) at io.c(200) [receiver=3.0.6]
2016/02/23 13:19:05 [14597] rsync: connection unexpectedly closed (65 bytes received so far) [generator]
2016/02/23 13:19:05 [14597] rsync error: error in rsync protocol data stream (code 12) at io.c(600) [generator=3.0.6]
Thanks for the help.
 
  

