Linux - Newbie
05-12-2017, 09:09 AM | #1
skagnola | Member | Registered: May 2017 | Distribution: CentOS | Posts: 41
SFTP 'put' using a batch file and public key, launched from crontab
Hello! I'm kinda new to *nix automation and have a task that I'm not sure about. The OS is CentOS 7 on 'A' and Ubuntu on 'B'. SFTP is the only method allowed on 'B'.
The task: I would like to sftp a file from server 'A' to server 'B' using a batch file, started from a cron job. It will also need to email me if it fails for some reason.
This is what I have so far. It is not tested as I am not confident I have it right.
Code:
# originating user's crontab: minute hour day-of-month month day-of-week command
# (0 is not valid for day-of-month or month; * means "every", so this runs daily at 23:00)
0 23 * * * /home/user/run_upload
run_upload (the script called by cron):
Code:
#!/bin/bash
# Begin in the proper directory
cd /opt/directory
# Connect to server 'B'; port# is a placeholder for the real port number,
# and the script needs to be executable (chmod +x /home/user/run_upload)
sftp -b /home/user/batchfile_commands -oPort=port# -o IdentityFile=~/.ssh/id_rsa_xfer username@serverB.com
batchfile_commands (the sftp batch file):
Code:
progress
cd uploads
put filename
bye
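Before handing this to cron, it may help to dry-run the wrapper by hand with tracing turned on, so any path or key problems show up immediately; a minimal check, assuming the script lives at /home/user/run_upload as above:
Code:
# Run the wrapper interactively with command tracing to confirm the
# sftp line behaves before the job is scheduled in cron
bash -x /home/user/run_upload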
What I'm not sure about is how to email the results, or have a log sent to an email address, when something fails.
Any help is appreciated. I looked at a few forum posts here and there, and this one in particular, but I am not sure how to translate the examples to my scenario.
Thanks!
05-12-2017, 10:13 AM | #2
r3sistance | Senior Member | Registered: Mar 2004 | Location: UK | Distribution: CentOS 6/7 | Posts: 1,375
CentOS already has a log file for cron, /var/log/cron, but it isn't very verbose; it will just give you the end result.
What you may be after is called 'I/O redirection':
Code:
# uptime >> /root/somefile.log
# cat /root/somefile.log
16:12:15 up 14 days, 22:10, 1 user, load average: 0.00, 0.01, 0.05
# uptime >> /root/somefile.log
# cat /root/somefile.log
16:12:15 up 14 days, 22:10, 1 user, load average: 0.00, 0.01, 0.05
16:12:29 up 14 days, 22:11, 1 user, load average: 0.00, 0.01, 0.05
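One detail that may matter for the sftp case: sftp writes its error messages to stderr, and >> on its own only captures stdout, so appending 2>&1 sends the errors to the same log. A minimal sketch, reusing the sftp line from post #1 and the log file from the example above (port# and the paths are still placeholders):
Code:
# Append stdout and stderr to the same log so connection errors are captured too
sftp -b /home/user/batchfile_commands -oPort=port# -o IdentityFile=~/.ssh/id_rsa_xfer username@serverB.com >> /root/somefile.log 2>&1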
Last edited by r3sistance; 05-12-2017 at 10:27 AM.
05-15-2017, 08:29 AM | #3
skagnola (Original Poster) | Member | Registered: May 2017 | Distribution: CentOS | Posts: 41
Thanks for the response, r3sistance. The redirection might be helpful.
So if I want to email the results to an address only when there is a failure, how would I apply this redirection in the script?
05-15-2017, 08:41 AM | #4
r3sistance | Senior Member | Registered: Mar 2004 | Location: UK | Distribution: CentOS 6/7 | Posts: 1,375
Hmm, check whether sftp returns a non-zero exit status when it fails, maybe? If that works, then you could do something like:
Code:
sftp -b /home/user/batchfile_commands -oPort=port# -o IdentityFile=~/.ssh/id_rsa_xfer username@serverB.com >> /some/log/file
if [[ $? != 0 ]]; then
    mail -s "SFTP Error report" -r "this.server@hostname.tld" -a "/some/log/file" "to@some@account"
fi
You'd need to test and play with it yourself; this is just a rough sketch, and some of the settings here are only placeholders.
I should have said this earlier, but I am assuming "port#" is changed to an actual number, since the "#" symbol can act as a comment character, in which case the shell would ignore everything after it.
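A slightly fuller sketch of the same idea, capturing the exit status in a variable and feeding the log in as the message body instead of as an attachment. This is only a rough outline under the same assumptions as above: the port number, paths, and addresses are placeholders, and mail is the mailx shipped with CentOS 7.
Code:
#!/bin/bash
# Placeholder values throughout: adjust the log path, port, key, host and
# addresses to the real environment before using.
LOG=/some/log/file

cd /opt/directory
sftp -b /home/user/batchfile_commands -oPort=2222 -o IdentityFile=~/.ssh/id_rsa_xfer username@serverB.com >> "$LOG" 2>&1
rc=$?

if [[ $rc -ne 0 ]]; then
    # Mail the log as the message body; the subject records the exit status
    mail -s "SFTP error report (exit $rc)" -r "this.server@hostname.tld" "recipient@example.com" < "$LOG"
fi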
Last edited by r3sistance; 05-15-2017 at 08:44 AM.
05-15-2017, 08:45 AM | #5
skagnola (Original Poster) | Member | Registered: May 2017 | Distribution: CentOS | Posts: 41
Thanks! I will mess around with this.
Much appreciated!
Yes, you are correct that the bit 'port#' is just a stand-in for whatever the actual port number is. ;-)
Last edited by skagnola; 05-15-2017 at 08:47 AM.
Reason: mention the port
05-16-2017, 02:22 PM | #6
skagnola (Original Poster) | Member | Registered: May 2017 | Distribution: CentOS | Posts: 41
I think I am making progress. The job fires off, does its login, uploads the file, then exits. As expected, if everything goes well, I get no notice.
The problem I have now: apparently the third party kills the ability to connect via key after a certain amount of time. About 24 hours after I upload the public key to the usual 'authorized_keys' file in .ssh/, it stops letting me connect and I have to upload the same public key again. I'm using 24 hours as a guide because right after I upload the key everything works; I come in the next day, with no changes I am aware of, and it won't.
As a result, I now also need to pass potential login failures to an email from this cron job.
I noticed there is output in /var/spool/mail/useraccount, but I don't need all the info from that file, only the part showing the login failure, so something like tail -n 25 /var/spool/mail/useraccount. I feel like there is something I can add to the conditional to get this, but I'm not sure what.
05-16-2017, 02:56 PM | #7
skagnola (Original Poster) | Member | Registered: May 2017 | Distribution: CentOS | Posts: 41
Just found out a minor detail: the third party is apparently scrubbing the authorized_keys file. I discovered this after I decided to get the remote file and open it in vim:
# Generated by Chef for remotesite.com
# Local modifications will be overwritten.
sad face.
05-16-2017, 04:27 PM | #8
Habitual | LQ Veteran | Registered: Jan 2011 | Location: Abingdon, VA | Distribution: Catalina | Posts: 9,374
Quote:
Originally Posted by skagnola
Just found out a minor detail. The third-party is apparently scrubbing the authorized_keys file. Discovered after I decided to get the remote file and vim it:
# Generated by Chef for remotesite.com
# Local modifications will be overwritten.
sad face.
|
I have hosts on my grid that do that. Not from Chef or anything; strictly closed-source stuff.
We utilize authorized_keys2.
Might be worth a try?
05-17-2017, 08:39 AM | #9
skagnola (Original Poster) | Member | Registered: May 2017 | Distribution: CentOS | Posts: 41
Quote:
Originally Posted by Habitual
I have hosts on my grid that do that. Not from Chef or anything. Strictly Closed source stuff.
We utilize authorized_keys2
Might be worth a try?
|
Thanks, Habitual! I actually generated a pair of those and gave it a shot. It lets me log in; hopefully the actual .ssh/ dir doesn't get scrubbed.
Now... we wait...
05-17-2017, 09:44 AM | #10
Habitual | LQ Veteran | Registered: Jan 2011 | Location: Abingdon, VA | Distribution: Catalina | Posts: 9,374
Thank me if/when it works
05-18-2017, 10:29 AM | #11
skagnola (Original Poster) | Member | Registered: May 2017 | Distribution: CentOS | Posts: 41
Quote:
Originally Posted by Habitual
Thank me if/when it works
|
Heh! Turns out, after a little arm-twisting, I managed to get the public key appended to the remote authorized_keys file by the third party. But I am definitely keeping your tidbit in my back pocket, so thanks for that! So far, so good.
Next is to work in something that will email me on a connectivity failure (I already have the conditional for the sftp actions failing; thanks to r3sistance for that bit to work with).
Here is what I have so far, but it only shows output from the transactions in the batch file, not the actual failure of connecting via sftp:
Code:
# Begin in export directory
cd /opt/dir
# Connect to remote upload
sftp -b /home/user/commands -oPort=someport# -o IdentityFile=~/.ssh/id_rsa_xfer user@domain.com >> /tmp/log.txt
if [[ $? != 0 ]]; then
    mail -s "Error Report" -r "account@domain" -q "/tmp/log.txt" "address@domain.com"
fi
If the sftp connection fails, I just get a blank email; I have to view /var/spool/mail/useraccount to see why. That is what I need sent in the email as well, I'm just not sure how to put it in the code.
Maybe this?:
Code:
if [[ $? != 0 ]]; then
    echo "cat /tmp/log.txt; tail -n 25 /var/spool/mail/useraccount" | mail -s "Error Report" -r "account@domain" "recipientaddr@domain.com"
fi
EDIT: never mind the above code; that just puts the cat commands themselves into the email. DOH!
EDIT2: more progress. Found out that I had to move extra returns in the output file to get mail to send it in a viewable format.
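One likely reason the emailed log is blank when the connection itself fails: sftp prints connection and authentication errors to stderr, and the line above only redirects stdout. Appending 2>&1, as in the redirection example earlier in the thread, should put the failure message into /tmp/log.txt as well; a possible tweak of the line from this post (someport# is still a placeholder):
Code:
# Capture stderr too, so connection and authentication failures land in the log
sftp -b /home/user/commands -oPort=someport# -o IdentityFile=~/.ssh/id_rsa_xfer user@domain.com >> /tmp/log.txt 2>&1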
Last edited by skagnola; 05-18-2017 at 01:03 PM.
05-18-2017, 01:12 PM | #12
r3sistance | Senior Member | Registered: Mar 2004 | Location: UK | Distribution: CentOS 6/7 | Posts: 1,375
Gonna guess you are after something like...
Code:
echo $(cat /tmp/log.txt; tail -n 25 /var/spool/mail/useraccount) | mail -s "Error Report" -r "account@domain" "recipientaddr@domain.com"
Where $( ) is 'command substitution'.
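One caveat with echo $(...): the unquoted substitution collapses the newlines into spaces, which can mangle the formatting of the log in the mail body. Piping the two commands as a group keeps the line breaks intact; a small variation on the same idea, with the addresses still placeholders:
Code:
# Group the two commands and pipe their combined output, newlines preserved,
# into the mail body
{ cat /tmp/log.txt; tail -n 25 /var/spool/mail/useraccount; } | mail -s "Error Report" -r "account@domain" "recipientaddr@domain.com"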
Last edited by r3sistance; 05-18-2017 at 01:14 PM.
05-18-2017, 01:39 PM | #13
schneidz | LQ Guru | Registered: May 2005 | Location: boston, usa | Distribution: fedora-35 | Posts: 5,326
methinks scp would be funner in this instance.
05-18-2017, 06:02 PM | #14
r3sistance | Senior Member | Registered: Mar 2004 | Location: UK | Distribution: CentOS 6/7 | Posts: 1,375
Quote:
Originally Posted by schneidz
methinks scp would be funner in this instance.
|
Why is using something that's effectively deprecated, and that hasn't had a decent update in about a decade, 'funner'?