Help to get files from last 24 hours automatically from another server
Programming: This forum is for all programming questions.
The question does not have to be directly related to Linux, and any language is fair game.
Hello to all,
I have 2 GNU Linux Servers, Server A and Server B.
I want to copy the files from the last 24 hours from Server A (the source server), directory "/Files", to Server B (the destination server), directory "/Last24hours".
I'd like to run the script that does the copy task automatically from Server B (the destination server); I want to avoid running scripts on Server A (the source server).
Is there a way to have a script on Server B that connects via SSH to Server A and copies the files from the last 24 hours from "Server A:/Files"?
Manually, I can connect to Server A by sending a command from Server B.
Thanks for the suggestion, but this is not related to homework at all; I'm not a student.
I didn't know about the rsync command. I've been trying it, and using the commands separately I think I'm close.
Maybe someone could help me with my remaining issues.
(1) ssh and find command
With this combination of "ssh" and "find" I'm able to get the list of files from the last 24 hours on the remote server (it works):
Code:
ssh root@192.168.X.X 'find /SourceFilesInRemoteServer/ -mtime -1 -type f'
(2) rsync and ssh command
With "rsync" in combination with "ssh" I can copy the files in directory "/SourceFiles" from the remote server to the local server (it works).
1) How do I combine the two previous commands so that "rsync" (command #2) uses the list of last-24-hours files produced by "find" (command #1)?
2) How do I make the script supply the password automatically when the ssh command asks for it?
With both machines on the same LAN, just mount the source (or destination) over NFS; then it comes down to copying from one directory to another. NFSv4 is said to be secure even over the internet, though I haven't tried that. Another option is to run an rsync server on one of the machines.
I'm not able to install NFS on the servers. I think the solution could be to use ssh, find and rsync, given the successful tests I mentioned previously.
Maybe somebody could help me figure out how to combine ssh, find and rsync for this task.
Usually NFS is part of the default installation; are you sure you don't have it? Anyhow, the easiest is probably to run an rsync server; Google has many good tutorials on it.
Yes, NFS is not installed on the servers.
I searched the internet and did tests with rsync, and it works for me, but my problem is that I cannot combine find and rsync so as to transfer only the last-24-hours files from the remote server.
A simple way, though not very clean: ask find to copy the files it finds into another directory, then use that directory as the source in the rsync command. Don't forget to empty it when done.
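A hypothetical local sketch of that staging-directory idea (all paths are made up for the demo; on Server A the find/cp step would run over ssh):

```shell
# Create a demo source tree with one recent and one old file (example paths)
mkdir -p /tmp/demo/Files /tmp/demo/staging
touch /tmp/demo/Files/recent.log                  # modified now
touch -d "3 days ago" /tmp/demo/Files/old.log     # modified 3 days ago
# Copy only files modified in the last 24 hours into the staging directory
find /tmp/demo/Files -mtime -1 -type f -exec cp -t /tmp/demo/staging {} +
ls /tmp/demo/staging    # only recent.log
# From Server B you would then rsync the staging directory and empty it, e.g.:
#   rsync -avz -e ssh root@serverA:/tmp/demo/staging/ /Last24hours/
#   ssh root@serverA 'rm -rf /tmp/demo/staging'
```

The `-exec ... {} +` form batches the copies into few cp invocations instead of one per file.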
1) How do I combine the two previous commands so that "rsync" (command #2) uses the list of last-24-hours files produced by "find" (command #1)?
2) How do I make the script supply the password automatically when the ssh command asks for it?
Approach this the other way round: sort out passwordless SSH first (there's a link in my sig to a post on my blog about how to do it).
Then consider something like this:
Code:
#!/bin/bash
# Get the list of files modified in the last 24 hours from Server A
FILELIST=$( ssh -i /root/.ssh/thekey root@serverA 'find /path/to/source/ -mtime -1 -type f' )
# Pull each file over with its own rsync connection
for FILE in ${FILELIST} ; do
    /usr/bin/rsync -azvh -e "ssh -i /root/.ssh/thekey" root@serverA:${FILE} /dest/path
done
The above script is inefficient, as it makes a separate rsync connection for each file in the list.
I'm also not sure how well it will work with your source files; there is zero escaping of problematic characters such as spaces in file names. But it should certainly give you a couple of things to try.
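For reference, a minimal sketch of the key-based SSH setup mentioned above (the key path is an example, not taken from the blog post):

```shell
# Generate a key pair with no passphrase (example path)
ssh-keygen -t ed25519 -f /tmp/thekey -N '' -q
# Install the public key on Server A (run once, entering the password interactively):
#   ssh-copy-id -i /tmp/thekey.pub root@serverA
# Scripts can then authenticate without a password prompt:
#   ssh -i /tmp/thekey root@serverA 'hostname'
ls /tmp/thekey /tmp/thekey.pub
```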
I was able to make it work following your suggestions: password-less SSH login first, then your "for FILE in ${FILELIST}..." solution. It works when the files to transfer are a few MB, but the real files I need to transfer are about 5 GB, and rsync begins the transfer but stops after 2 or 3 seconds. I think it could be because of the file size.
If rsync has issues with big files, I don't know whether I need to switch to an FTP script or some other utility that doesn't get stuck on big files.
rsync is specifically designed to transfer "big" files and also to work on unstable/low-quality networks.
rsync has a backup/update feature of its own that will let you transfer only the changes automatically; you don't need find (if I understand you correctly).
Hello pan64,
I need to use find because I only want to copy the files from the last 24 hours, one day a week. The file names contain nothing that tells me the date of a file, so I need to check the modification time using "find -mtime -1".
I set the --progress option on rsync, and it gets stuck after transferring less than 20 MB. I even set --max-size='10g' and it still gets stuck.
So first execute find, save the result, and tell rsync to use that file (see --files-from=) to specify what to transfer. Next, use debug/verbose flags to find out what's happening (-v, -vv, --log-file=, and --timeout).
Also check /var/log on both sides for low-level (connection-related) events. You may also try -i.
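A hedged sketch of that find + --files-from combination, demonstrated locally (hypothetical paths; over SSH the find would run on Server A):

```shell
mkdir -p /tmp/fdemo/src /tmp/fdemo/dst
touch /tmp/fdemo/src/new.dat
touch -d "2 days ago" /tmp/fdemo/src/old.dat
# --files-from expects paths relative to the rsync source directory,
# so run find from inside the source
( cd /tmp/fdemo/src && find . -mtime -1 -type f ) > /tmp/fdemo/list.txt
rsync -a --files-from=/tmp/fdemo/list.txt /tmp/fdemo/src/ /tmp/fdemo/dst/
ls /tmp/fdemo/dst   # only new.dat
# Remote variant (example key and paths from earlier in the thread):
#   ssh -i /root/.ssh/thekey root@serverA 'cd /SourceDirectory && find . -mtime -1 -type f' > /tmp/list.txt
#   rsync -azvh -e "ssh -i /root/.ssh/thekey" --files-from=/tmp/list.txt root@serverA:/SourceDirectory/ /Destination/
```

This makes a single rsync connection for the whole list, avoiding the one-connection-per-file inefficiency of the earlier loop.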
Hi pan64,
I've tried your suggestions and ran rsync for a single file. It stopped at 127 MB (28%). I ran it this way:
Code:
rsync -avzh --progress --max-size='10G' --log-file=/tmp/TransferLog.log --timeout=30 --rsh=ssh iu@192.168.X.X:/SourceDirectory/filexyz.log /Destination/
rhes-5.5_64-ig_v3.3.2
receiving file list ...
1 file to consider
filexyz.log
127.22M 28% 19.44MB/s 0:00:16
io timeout after 30 seconds -- exiting
rsync error: timeout in data send/receive (code 30) at io.c(200) [receiver=3.0.6]
rsync: connection unexpectedly closed (65 bytes received so far) [generator]
rsync error: error in rsync protocol data stream (code 12) at io.c(600) [generator=3.0.6]
and the generated log contains this:
Code:
2016/02/23 13:18:25 [14597] receiving file list
2016/02/23 13:18:25 [14597] 1 file to consider
2016/02/23 13:19:05 [14599] io timeout after 30 seconds -- exiting
2016/02/23 13:19:05 [14599] rsync error: timeout in data send/receive (code 30) at io.c(200) [receiver=3.0.6]
2016/02/23 13:19:05 [14597] rsync: connection unexpectedly closed (65 bytes received so far) [generator]
2016/02/23 13:19:05 [14597] rsync error: error in rsync protocol data stream (code 12) at io.c(600) [generator=3.0.6]