backing up data via ssh
Hi,
I have two computers on a network, say 192.168.1.10 and 192.168.1.20. Using Fedora's "Connect to Server" over SSH, I mounted my home folder from the .20 server on the .10 machine. I was wondering if it's possible to automate copying my htdocs folder from .10 to a local backup folder, and from there to the .20 machine. I couldn't get Samba or NFS to work, but the SSH volume mounts and I can drag files over. Any help would be greatly appreciated.
Hi
When you have SSH access, you can use scp to copy from computer to computer. It works like the cp command, but you prefix the remote argument with "username@computer:". E.g.

On the source computer:
scp -r /var/www/htdocs username@192.168.1.20:~/backups/

Or on the destination:
scp -r username@192.168.1.10:/var/www/htdocs .

It will ask for a password; to prevent that, you need to set up key-based authentication for SSH so you can log in without a password. Maybe look at http://ssh.com/support/documentation...ication-2.html (I googled "ssh public key"). When you have your command working, you can put it in cron.
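A sketch of the key setup and cron step described above, using the hosts and paths from the question (the key filename and schedule are just examples; adjust to your setup):

```shell
# Generate a key pair with an empty passphrase, so cron can use it
# without prompting (keep the private key readable only by you).
ssh-keygen -t ed25519 -N "" -f ~/.ssh/backup_key

# Install the public key on the .20 machine (asks for the password once).
ssh-copy-id -i ~/.ssh/backup_key.pub username@192.168.1.20

# Verify a passwordless copy works.
scp -i ~/.ssh/backup_key -r /var/www/htdocs username@192.168.1.20:~/backups/

# Then add an entry with "crontab -e", e.g. every night at 02:00:
# 0 2 * * * scp -i ~/.ssh/backup_key -r /var/www/htdocs username@192.168.1.20:~/backups/
```

The empty passphrase is what lets the cron job run unattended; the trade-off is that anyone who can read the private key file can log in as you, so protect its permissions.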
Look into the scp command. For instance:
scp some_source_file username@machine:/path_to_destination

or for a directory:

scp -r some_directory username@machine:/path_to_destination

This also works in the other direction:

scp username@machine:/path_to_source_file /destination_path

and if you set up a public key with SSH, you won't have to enter a password.
Ah...you beat me...heh
You might want to check out using rsync as well. It works over SSH quite nicely and may have some advantages over scp depending on what you want to do.
An example of rsync with ssh
rsync -av -e ssh some_files user@host:/path_to_dest

-a archives the directory, including subdirectories
-v verbose (optional)
-e ssh uses SSH for a secure transfer

rsync compares the files against what is already on the destination, so when you run this command again in the future, only new or changed files will be transferred.
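You can see the incremental behaviour described above with a purely local run (rsync works between two local folders too; the directory names here are made up for the demo). The second invocation transfers only the newly added file:

```shell
# Set up a source folder with one file.
mkdir -p /tmp/rsync_demo/src
echo "one" > /tmp/rsync_demo/src/a.txt

# First run: copies everything.
rsync -av /tmp/rsync_demo/src/ /tmp/rsync_demo/dest/

# Add a second file; this run transfers only the new one.
echo "two" > /tmp/rsync_demo/src/b.txt
rsync -av /tmp/rsync_demo/src/ /tmp/rsync_demo/dest/

# Over the network the same idea applies, e.g.:
# rsync -av -e ssh /var/www/htdocs user@192.168.1.20:~/backups/
```

Note the trailing slash on src/: it tells rsync to copy the directory's contents rather than the directory itself.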