[SOLVED] Bash backup script with scp, multiple processes
Hi, I wrote a script that backs up my data over the network with scp, run from crontab, but I end up with over 50 processes while the script is still on the first file, and that makes my server load very high. How can I lock the process? Or would it be better to use rsync or NFS plus a copy?
Code:
REMOTE=1.2.3.4
SOURCE=/data/some/directory
TARGET=/mnt/some/directory
LOG=/root/bck.log
DATE=`date +%y\.%m\.%d\.`
USER=root
ssh $USER@$REMOTE mkdir $TARGET/$DATE
if [ -d "$SOURCE" ]; then
    for i in `ls $SOURCE | grep 'data'`;do
        echo "Begining copy of" $i >> $LOG
        scp $SOURCE/$i $USER@$REMOTE:$TARGET/$DATE
        echo $i "completed" >> $LOG
        if [ -n `ssh $USER@$REMOTE ls $TARGET/$DATE/$i 2>/dev/null` ];then
            rm $SOURCE/$i
            echo $i "removed" >> $LOG
            echo "####################" >> $LOG
        else
            echo "Copy not complete" >> $LOG
            exit 0
        fi
    done
else
    echo "Directory is not present" >> $LOG
    exit 0
fi
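On the locking question: one common approach is to wrap the cron job in flock(1) so a fresh invocation exits immediately while the previous run is still going, instead of piling up scp processes. A sketch, assuming util-linux flock is available; the lock path here is a throwaway placeholder (a real script might use something like /var/lock/bck.lock):

```shell
#!/bin/bash
# Placeholder lock file for the demonstration.
LOCKFILE=$(mktemp)

(
    # Take an exclusive, non-blocking lock on file descriptor 9.
    # If a previous run still holds the lock, log and bail out
    # instead of starting a second copy.
    flock -n 9 || { echo "previous backup still running, exiting" >&2; exit 1; }

    # ... the scp/rsync work would go here ...
    echo "backup running under lock" >/dev/null
) 9>"$LOCKFILE"
```

Alternatively, the crontab entry itself can be written as `flock -n /path/to/lock /root/backup.sh`, which keeps the script unchanged.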
Fifty of what processes? That scp command, running concurrently? How often is your cron job scheduled?
Also, just a tip, if you're trying to copy over all files from $SOURCE with "data" in the filename, try:
Code:
scp $SOURCE/*data* $USER@$REMOTE:$TARGET/$DATE
You could do even better with rsync. Don't create the directory on the other side until you know $SOURCE contains files. You can take care of all of this with rsync's prune-empty-dirs option and relative-paths option.
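That rsync suggestion could look something like the sketch below, demonstrated between two local directories so it is self-contained (a remote target would be root@host:/path instead of $DEST). The filter rules assume the *data* naming from the original script:

```shell
# Throwaway source/destination directories for the demonstration.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
mkdir -p "$SRC/sub" "$SRC/empty"
touch "$SRC/data1" "$SRC/sub/data2" "$SRC/other"

# -a archive mode; -m (--prune-empty-dirs) skips directories the filters
# leave empty, so nothing is created on the receiver when there is no
# matching data. Filters: descend into every directory, keep *data*
# files, exclude everything else.
rsync -am \
    --include='*/' --include='*data*' --exclude='*' \
    "$SRC/" "$DEST/"
```

The --relative (-R) option additionally preserves a leading path on the receiver, with /./ in the source argument marking where the preserved part starts.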
for i in `ls $SOURCE | grep 'data'`;do
echo "Begining copy of" $i >> $LOG
scp $SOURCE/$i $USER@$REMOTE:$TARGET/$DATE
echo $i "completed" >> $LOG
if [ -n `ssh $USER@$REMOTE ls $TARGET/$DATE/$i 2>/dev/null` ];
Let's analyse this portion of your code. The for loop can be written,
Code:
for i in *data*
using shell expansion. There is no need to use ls and grep. These cost you extra processes.
Next, for every data file found (say, 500 of them), you call scp; that means scp is invoked 500 times.
Next, you also run ssh inside the loop, so ssh is likewise called 500 times.
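The glob-based loop described above can be sketched like this (bash assumed; nullglob makes the loop body simply not run when nothing matches, rather than iterating once over the literal pattern):

```shell
# Throwaway directory standing in for /data/some/directory.
SOURCE=$(mktemp -d)
touch "$SOURCE/data1" "$SOURCE/mydata2" "$SOURCE/other"

shopt -s nullglob             # empty match -> loop body never runs
for i in "$SOURCE"/*data*; do
    # $i is already the full path; no ls or grep child processes
    # are spawned to build the file list.
    echo "would copy: ${i##*/}"
done
```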
Quote:
Let's analyse this portion of your code. The for loop can be written
...
Thanks for the example. I thought the script ran sequentially, so the first scp begins and the script only moves on once scp has finished copying; my fault. But would using "&&" be a solution?
Code:
scp $SOURCE/$i $USER@$REMOTE:$TARGET/$DATE &&
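For what it's worth, && only chains the next command to a successful exit status; it does not change how scp itself runs (a plain scp in a loop already blocks until the copy finishes). A minimal demonstration of the operator:

```shell
# The right-hand command runs only when the left-hand one exits 0.
# (|| true just keeps the snippet's overall exit status clean.)
false && echo "skipped: left side failed" || true
true && RESULT="ran: left side succeeded"
echo "$RESULT"
```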
Quote:
I agree. Rsync would be the efficient way to accomplish this task.
I'll try a new script with rsync, but the original version of the script did use rsync, and my server cached all the backup data, driving CPU and I/O to 100%.
That's why I'm looking for a sequential solution.
There are two keys here. One, with scp or rsync you can do the file globbing in the command itself, meaning you don't need to loop through all of the files; you can transfer them with a single instance. Also, for convenience, rsync has an option to remove each source file after a successful transfer, so you know it will only be removed if the transfer completed. Read the rsync man page a few times over; it's very powerful.
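The remove-on-success behaviour mentioned here is rsync's --remove-source-files option; a local sketch (a real run would point at user@host:/path as the destination):

```shell
# Throwaway source/destination directories for the demonstration.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
touch "$SRC/data1" "$SRC/data2" "$SRC/keep.txt"

# One rsync call transfers every *data* file in a single instance;
# each source file is deleted only after its transfer succeeded,
# replacing the manual ssh + ls check from the original script.
rsync -a --remove-source-files "$SRC"/*data* "$DEST"/
```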