Linux - Security: This forum is for all security related questions.
Questions, tips, system compromises, firewalls, etc. are all included here.
I have a server, call it Host, which communicates to a device under development. That device runs linux and has its own ip address; when it boots, Host uses "ssh-add" to add a provided id_rsa, then it can scp a directory to the device without being prompted for a password. This is great for setting up automated scripts on Host to boot Device, scp executables to Device, and then run tests which synchronize via ssh.
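For reference, the Host-side setup described above looks roughly like this (key path, hostnames, and directories are placeholders, not the actual ones used here):

```shell
# Start an agent for this shell and load the provided device key
# (the key path is illustrative).
eval "$(ssh-agent -s)"
ssh-add /path/to/provided/id_rsa

# With the key loaded, scp to the device no longer prompts for a password.
scp -r build-output/ root@device:/opt/tests/

# Tests can then be driven over ssh from the same session.
ssh root@device /opt/tests/run-all.sh
```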
CI <--> Host <--> Device
I have a server running a Continuous Integration service, call it CI, and I'd like to have the CI tool run those automated tests. I've set up rsa keys such that, when I'm logged into CI, I can ssh and scp to Host without being prompted for a password. However, when I'm logged into CI and ssh into Host, and then try to invoke those same scripts which work fine when I've logged directly onto Host, I'm prompted for the password to the device.
If, from CI, I try to invoke Host's script with "ssh Host script.sh", I get "Permission denied, please try again." (twice) and then "Permission denied (publickey,password)."
I'd like to understand why CI ssh'ing to Host cannot then scp to Device, while Host can scp to Device without problems. Even better, I want to know how to fix it so the continuous integration server can remotely run those scripts which scp to Device.
I figured CI needs Device's rsa key. I tried to scp that provided id_rsa to server CI and add it to CI, but "ssh-add id_rsa" gets "Could not open a connection to your authentication agent."
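For context, "Could not open a connection to your authentication agent" usually just means the current shell has no SSH_AUTH_SOCK variable pointing at a running agent; ssh-add talks to the agent through that socket. A minimal sketch:

```shell
# ssh-add fails when SSH_AUTH_SOCK is unset or stale.
# Starting an agent and eval'ing its output fixes that for this shell:
eval "$(ssh-agent -s)"   # sets SSH_AUTH_SOCK and SSH_AGENT_PID
ssh-add id_rsa           # now reaches the agent via the socket
```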
I don't usually use ssh-agent/ssh-add; you could try 'ssh-copy-id' - this will deploy the key to the target host. You would run it once from [CI] -> [Host], then from [Host] -> [Device].
- The "device under development" contains two authorized keys - one from the host and a second from the CI? Then it might be necessary to add "-oForwardAgent=yes" to use your local ssh-agent on the CI.
- If you want to use an already running ssh-agent on the Host instead, the login session must be told to use it by setting SSH_AUTH_SOCK to the socket of that already running ssh-agent.
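Both approaches can be sketched concisely (hostnames, paths, and the socket value are illustrative, not taken from this setup):

```shell
# (a) Agent forwarding from CI: the ssh session on Host reuses CI's agent
#     for the hop to the device.
ssh -o ForwardAgent=yes host 'scp -r tests/ root@device:/opt/'

# (b) Reusing an agent already running on Host: point this session at its
#     socket (the actual path under /tmp differs for every agent).
export SSH_AUTH_SOCK=/tmp/ssh-XXXXXX/agent.1234
scp -r tests/ root@device:/opt/
```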
The Device contains one key: its own private key. Host has Device's public key, and the test can do everything it needs to do from a script on Host with that setup.
Quote:
Originally Posted by Reuti
- If you want to use an already running ssh-agent on the host instead, it must be told to the login session to use it by setting the appropriate SSH_AUTH_SOCK of the already running ssh-agent on the host.
Yes, this is what I want to do. The ssh session between Host and Device is left up and running.
I may not be precisely correct, but I believe "eval" says to run through the command line evaluation process for ssh-agent, thus re-evaluating the rsa keys held in the agent on Host, and at the same time adds the key to that agent. The ssh command from CI evaluates that file before executing my test-kickoff script. If you have any clarification to add, please do. Thank you!
eval will interpret the output of ssh-agent as if it had been typed directly on the command line. In essence it will set two environment variables, which point to the socket the agent uses in /tmp and to the ssh-agent's PID (just run ssh-agent without eval and you can copy & paste its output).
Then the key is added by ssh-add to this running agent.
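Running ssh-agent without eval shows exactly what gets evaluated; the socket path and PID here are the ones from the env file quoted later in this thread, and will differ on every run:

```shell
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-lOyBVF7182/agent.7182; export SSH_AUTH_SOCK;
SSH_AGENT_PID=7184; export SSH_AGENT_PID;
echo Agent pid 7184;
```

The `head -2` in the script below keeps just the two export lines and drops the trailing `echo`.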
When I look into the script, it might be the case that you will have many ssh-agents running at the same time as they are never stopped again, one for each login. You can check with:
Code:
$ ps -e f | grep agent
If this is the case, we can look into limiting it to one, which is always running.
You are quite right. How do I limit it to one agent? That sounds cleaner than figuring out which agent to kill at the end.
Code:
SSH_ENV=$HOME/.ssh/env-$HOSTNAME

function start_agent()
{
    ssh-agent | head -2 > ${SSH_ENV}
    chmod 600 ${SSH_ENV}
    . ${SSH_ENV}
    ssh-add
}

if [ -z "$SSH_AUTH_SOCK" ]; then
    if [ -f "$SSH_ENV" ]; then
        . ${SSH_ENV}
        FOUND_UID=`ps --no-heading -p${SSH_AGENT_PID} -o uid`
        if [ ! -S "${SSH_AUTH_SOCK}" -o ${FOUND_UID:-0} -ne $UID ]; then
            start_agent
        fi
    else
        start_agent
    fi
fi
When you look it up with Google by some of the variable names used there, there are plenty of variations of such a script. This one originated from one of them, but I extended it to allow ssh-agents of more than one user on a machine (like on a server); therefore I check whether there is already an ssh-agent for this particular user.
This script could be saved in ~/.ssh/ssh-login and needs to be sourced during login by adding one line to ~/.bash_profile, ~/.bash_login or ~/.profile (I don't know which one is used in your distribution):
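The one line to add would be something like (path as suggested above):

```shell
# in ~/.bash_profile (or ~/.bash_login / ~/.profile):
. ~/.ssh/ssh-login
```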
After implementing what you suggest, I executed my scripts from the continuous integration server on CI and got one ssh-agent; good. I repeated the process and got a second ssh-agent on Host, which isn't an improvement. Then I remembered that the continuous integration server runs a non-interactive non-login shell, so I added "source /home/user/.ssh/ssh-login" to the continuous integration job. It's Hudson, BTW, running on RHEL, and the host is FC 12.
The script failed ("hostname" replaces real host name):
I tried `ps --no-heading -p7184 -o uid` (using a PID of an active child process) directly on the command line while VNC'd to Host and got nothing. I can't find "--no-heading" in the man page for ps. Then, I tried `ps -pPID -o uid --no-header` and that seemed to work. I then modified ssh-login accordingly, but it still fails:
--no-headers print no header line at all. --no-heading is an alias for this option.
Maybe it's a Debian extension, but on openSUSE 11.3 it's also working, despite not being in the man page there either. Anyhow, you solved it.
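A portable alternative, in case --no-heading/--no-headers is unavailable: POSIX ps suppresses the header line whenever the -o format specifier is given an empty header, i.e. a trailing =:

```shell
# "uid=" means: print the uid column with an empty header,
# which suppresses the header line entirely.
FOUND_UID=$(ps -p "$SSH_AGENT_PID" -o uid=)
```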
So we have the issue that the ssh-agent from the last login is still running and the file with its settings is also there, but it's not recognized, because the ps command in the script returns nothing, just as it does on the command line. This would imply that the ssh-agent is no longer running, and so a new one is started.
Can you check what PIDs the still-running ssh-agents have in ps, as you mention they are still running? Is the one recorded in /home/hudson/.ssh/env-hostname not among them?
env-$HOSTNAME has this:
SSH_AUTH_SOCK=/tmp/ssh-lOyBVF7182/agent.7182; export SSH_AUTH_SOCK;
SSH_AGENT_PID=7184; export SSH_AGENT_PID;
That lines up with what your script reports.
If I wanted to stop now, I could invoke the Host scripts from the CI tool *without* sourcing myfile or .bash_profile, and it would work without leaving a stray process - until the system reboots, right? So that's not a good option. Another option could be to put "killall -9 ssh-agent" at the end of my Hudson job.
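Instead of killall -9, which would also take down agents belonging to other sessions, ssh-agent itself can kill exactly the agent recorded in the environment:

```shell
# -k uses SSH_AGENT_PID to kill only this session's agent, and prints
# the corresponding unset commands for eval to execute.
eval "$(ssh-agent -k)"
```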
What is not a good option? That you have to enter the passphrase once after a reboot? Or that it's running all the time?
The first; I would prefer to be able to run these tests non-interactively at any time, even after a reboot.
When I use "ps -pPID -o uid --no-header" on the ssh-agent's PID, at the command line, it gives me my correct UID. But doesn't the result I posted about at 4:10 imply when that same line is used inside your script, nothing is returned? ("FOUND_UID= ")
When you just want to connect to the device, and don't want to use the ssh key for other purposes, you could also remove the passphrase by using
Code:
$ ssh-keygen -p -f ~/.ssh/id_rsa
As the passphrase protects only the private part of the key, nothing needs to be changed on the device under test.
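For a fully non-interactive run, that command can also be given the old and new passphrases on the command line (with the obvious caveat that the old passphrase then ends up in the shell history):

```shell
# -P old passphrase, -N new passphrase (empty = none); path is illustrative.
ssh-keygen -p -P 'old-passphrase' -N '' -f ~/.ssh/id_rsa
```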
You are right that it should also output something useful inside the script. Maybe the script you use is absorbing the output for some reason (you mentioned that you put it inside a script). An option could be to put it in ~/.bashrc, which is read by non-interactive bash shells.