sftp using batch and public key doing 'put' launching from crontab
Hello! I'm kinda new to *nix automation and have a task that I'm not sure about. The OS is CentOS 7 on 'A' and Ubuntu on 'B'. SFTP is the only method allowed on 'B'.
The task: I would like to sftp a file from server 'A' to server 'B' using a batch file, launched from a cronjob. It also needs to email me if the transfer fails for some reason.
This is what I have so far. It is untested, as I am not confident I have it right.
Code:
# Begin in proper directory
cd /opt/directory
# Connect to server 'B'
sftp -b /home/user/batchfile_commands -oPort=port# -o IdentityFile=~/.ssh/id_rsa_xfer username@serverB.com
The batchfile_commands file:
Code:
progress
cd uploads
put filename
bye
I'm not sure how I would email the results, or have a log sent to an email address.
Any help is appreciated. I looked at a few forum posts here and there, and this one in particular, but I am not sure how to translate the examples to my scenario.
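Since the script is meant to fire from cron, a minimal crontab entry for it might look something like this (the script path and the schedule below are made-up placeholders, not taken from the post above):
Code:
# Hypothetical crontab entry (crontab -e): run the upload script nightly at 02:00
# The path below is made up - point it at wherever the script actually lives
0 2 * * * /bin/bash /opt/directory/sftp_upload.sh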
Mmm, maybe check if sftp returns a non-zero exit status on failure? If that works, then you could do something like:
Code:
sftp -b /home/user/batchfile_commands -oPort=port# -o IdentityFile=~/.ssh/id_rsa_xfer username@serverB.com >> /some/log/file
if [[ $? != 0 ]]; then
    # Send the log as an attachment if sftp exited non-zero (-a is mailx attachment syntax)
    mail -s "SFTP Error report" -r "this.server@hostname.tld" -a "/some/log/file" "to@some.account"
fi
You'd need to test and play with it yourself; this is just a rough outline, and some of the settings here are just placeholders.
Should have said this earlier, but I am assuming "port#" gets changed to an actual number, as the "#" symbol may act as a comment character, in which case the shell would ignore anything after it.
Last edited by r3sistance; 05-15-2017 at 09:44 AM.
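A slightly tighter way to write the same check (still just a sketch, with placeholder port, paths, and addresses) is to test the sftp command directly instead of inspecting $? afterwards:
Code:
# Same logic as above, just testing the command directly; 2222 is a placeholder port
if ! sftp -b /home/user/batchfile_commands -oPort=2222 \
        -o IdentityFile=~/.ssh/id_rsa_xfer username@serverB.com >> /some/log/file; then
    # </dev/null keeps mail from waiting on stdin if the script is ever run by hand
    mail -s "SFTP Error report" -r "this.server@hostname.tld" \
        -a "/some/log/file" "to@some.account" < /dev/null
fi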
I think I am making progress. The job fires off, does its login, uploads a file, then exits. As expected, if everything goes well, I get no notice.
The problem I have now: apparently the third party kills the ability to connect via key after a certain amount of time? 24 hours after uploading the public key to the usual authorized_keys file in .ssh/, it will no longer let me connect, and I have to upload the same pub key again. I'm using 24 hours as a guide since right after I upload the key everything works; I come in the next day, with no changes I am aware of, and it won't.
As a result, I now also need to pass potential login failures to an email from this cronjob.
I noticed there is output in /var/spool/mail/useraccount, but I don't need all the info from that file, only the part showing the login failure, so something like tail -n 25 /var/spool/mail/useraccount. I feel like there is something I can add to the conditional to get this? Just not sure what.
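One way to fold that into the existing conditional (just a sketch, reusing r3sistance's placeholder log path and the spool path mentioned above) is to group the log and the tail of the spool file and pipe the combined output in as the mail body:
Code:
if [[ $? != 0 ]]; then
    # Mail the sftp log plus the last 25 lines of the local mail spool as the body
    { cat /some/log/file; echo; tail -n 25 /var/spool/mail/useraccount; } \
        | mail -s "SFTP Error report" -r "this.server@hostname.tld" "to@some.account"
fi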
Just found out a minor detail. The third party is apparently scrubbing the authorized_keys file. Discovered this after I decided to get the remote file and vim it:
# Generated by Chef for remotesite.com
# Local modifications will be overwritten.
sad face.
I have hosts on my grid that do that. Not from Chef or anything; strictly closed-source stuff.
We utilize authorized_keys2.
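Worth noting: whether authorized_keys2 is honoured at all depends on the AuthorizedKeysFile setting in the remote sshd_config. The stock OpenSSH default on many distros looks like this (shown only as a reference, not pulled from the servers in this thread):
Code:
# /etc/ssh/sshd_config on the remote server (the default in recent OpenSSH)
AuthorizedKeysFile .ssh/authorized_keys .ssh/authorized_keys2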
Heh! Turns out, after a little arm-twisting, I managed to get the public key appended to the remote authorized_keys file by the third party. But I am definitely keeping your tidbit in my back pocket, so thanks for that! So far so good.
Next is to work in something that will email me upon connectivity failure (since I already have the conditional for the sftp actions failing; thanks to r3sistance for that bit to work with).
Here is what I have so far, but it only shows output from the transactions in the batch file, not the actual failure of connecting via sftp:
Code:
# Begin in export directory
cd /opt/dir
# Connect to remote upload
sftp -b /home/user/commands -oPort=someport# -o IdentityFile=~/.ssh/id_rsa_xfer user@domain.com >> /tmp/log.txt
if [[ $? != 0 ]] ; then
    # Mail the log on failure (-q starts the message body with the contents of the file)
    mail -s "Error Report" -r "account@domain" -q "/tmp/log.txt" "address@domain.com"
fi
If the sftp connection fails, I just get a blank email - I have to view /var/spool/mail/useraccount to see why. This is what I need to have sent in the email as well. Just not sure how to put that in the code.
Maybe this?:
Code:
if [[ $? != 0 ]]; then
echo "cat /tmp/log.txt; tail -n 25 /var/spool/mail/useraccount" | mail -s "Error Report" -r "account@domain" "recipientaddr@domain.com"
fi
EDIT: nevermind about the above code. That just puts the cat commands in an email. DOH!
EDIT2: more progress. Found out that I had to remove extra returns in the output file to get mail to send it in a viewable format.
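To round this out: the blank email most likely happens because sftp writes connection and authentication errors to stderr, which a plain >> /tmp/log.txt does not capture. A rough sketch that pulls the pieces together (port, paths, and addresses are the placeholders used earlier in the thread):
Code:
#!/bin/bash
# Rough sketch based on the snippets in this thread - test before relying on it
cd /opt/dir || exit 1

# 2>&1 sends connection/authentication errors into the same log as the transfer output
# NOTE: 2222 is a placeholder port
sftp -b /home/user/commands -oPort=2222 -o IdentityFile=~/.ssh/id_rsa_xfer \
    user@domain.com >> /tmp/log.txt 2>&1

if [[ $? != 0 ]]; then
    # Mail the sftp log plus the last 25 lines of the local mail spool as the body
    { cat /tmp/log.txt; echo; tail -n 25 /var/spool/mail/useraccount; } \
        | mail -s "Error Report" -r "account@domain" "address@domain.com"
fi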