Linux - Newbie. This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-to's, this is the place!
09-17-2008, 03:26 PM | #1
rev1976 | LQ Newbie | Registered: Nov 2007 | Posts: 3
SCP files from one server to another
Hi all,
I need to scp files (DB2 backup files) from one server (A) to another server (B), and I would like to automate this. What I need to do is find the files older than 11 days and scp them to the other server. Then I need to delete the files on server (A) that were sent over. Any help would be greatly appreciated. So far I have this command:
find /data/db2inst1/backup -name "*.0.db2inst1.NODE0000.CATN0000.*" -mtime +11 -type f -exec scp {} <server(B)>:<server(B) directory> \; -print
So this will find the files 11 days or older and scp them over to server (B). Please let me know if this is correct or if there's another way to do it. Thank you.
09-17-2008, 03:36 PM | #2
LQ Guru | Registered: Jul 2003 | Location: Birmingham, Alabama | Distribution: SuSE, RedHat, Slack, CentOS | Posts: 27,716
Quote:
Originally Posted by rev1976
find /data/db2inst1/backup -name "*.0.db2inst1.NODE0000.CATN0000.*" -mtime +11 -type f -exec scp {} <server(B)>:<server(B) directory> \; -print
I can't test it right now, but it looks good to me.
However, anytime file deletions are automated, I get nervous. Since I've been snakebit by this before ("What? You deleted them? Yeah, I know I said you could, but I really need them NOW!"), I'd probably do a move of the files to another drive/partition, where they could sit unnoticed. If someone complained or needed one, I'd have it quickly available. If not, I'd go out there every now and then and nuke anything older than a month or so.
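A rough sketch of that move-instead-of-delete idea, wrapped in a small shell function so it can go straight into a script. The backup path and filename pattern are from the original post; the function name, the remote destination, and the holding directory are all placeholders:

```shell
#!/bin/sh
# archive_old_backups SRC REMOTE HOLD
# Copies backups older than 11 days from SRC to REMOTE, then moves
# (rather than deletes) each successfully copied file into HOLD.
archive_old_backups() {
    src=$1 remote=$2 hold=$3
    mkdir -p "$hold"
    find "$src" -name "*.0.db2inst1.NODE0000.CATN0000.*" -mtime +11 -type f |
    while read -r f; do
        # mv runs only if scp reports success, so a failed copy
        # leaves the original file where it was
        scp "$f" "$remote" && mv "$f" "$hold/"
    done
}

# Example call (server name and directories are placeholders):
# archive_old_backups /data/db2inst1/backup 'serverB:/data/db2_backups' /archive/db2_old
```

If a transfer fails, the file simply stays in place and gets picked up on the next run; anything in the holding directory can be nuked later once it's clearly no longer needed.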
09-17-2008, 07:45 PM | #3
chrism01 | LQ Guru | Registered: Aug 2004 | Location: Sydney | Distribution: Rocky 9.x | Posts: 18,441
Actually, if he's got a copy on another server, I wouldn't worry too much. I would gzip or bzip2 the files unless they are being actively used; it saves on disk space.
I'd also tend to do it in a loop rather than as a one-liner, since you can then easily add other commands, e.g. email a confirmation list of which files got copied, or email an error message on failure.
This sounds like it's going to end up as a cron job.
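One way that loop might look, with every name below (function name, remote host, mail address) illustrative rather than from the thread: compress each old backup, copy it over, and print one OK/FAIL line per file that a cron job can pipe to mail(1):

```shell
#!/bin/sh
# copy_and_report SRC REMOTE
# Compresses backups older than 11 days under SRC, copies each .gz to
# REMOTE, and prints an OK or FAIL line per file for the email report.
copy_and_report() {
    src=$1 remote=$2
    find "$src" -name "*.0.db2inst1.NODE0000.CATN0000.*" -mtime +11 -type f |
    while read -r f; do
        gzip "$f"    # compress before sending to save disk space
        if scp "$f.gz" "$remote"; then
            echo "OK $f.gz"
        else
            echo "FAIL $f.gz"
        fi
    done
}

# Typical cron usage (host, path, and address are placeholders):
# copy_and_report /data/db2inst1/backup 'serverB:/data/db2_backups' \
#     | mail -s "DB2 backup copy report" dba@example.com
```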
Last edited by chrism01; 09-18-2008 at 08:20 PM.
09-18-2008, 08:13 AM | #4
LQ Guru | Registered: Jul 2003 | Location: Birmingham, Alabama | Distribution: SuSE, RedHat, Slack, CentOS | Posts: 27,716
Quote:
Originally Posted by chrism01
Actually, if he's got a copy on another server, I wouldn't worry too much. I would gzip or bzip2 the files unless they are being actively used; it saves on disk space.
|
Agreed on the compression, but I *ALWAYS* worry about deleting a file. I've had them get corrupted, or had some goober delete them from the backup location. If I have the space, I make more copies.
09-18-2008, 08:21 PM | #5
chrism01 | LQ Guru | Registered: Aug 2004 | Location: Sydney | Distribution: Rocky 9.x | Posts: 18,441
In the long run, offline backups are de rigueur, preferably off-site.