Distributing SSH host keys for password-less login
Hello all,
I've generated SSH keys for one of my servers (server A) and distributed the public key to the nodes in my network.
For password-less login to work, each node's host key must be saved on server A. As of now I have to log into each node from server A so that server A saves the node's host key.
Is there a way to get around this? I'm thinking maybe I could generate a common host key for all of my nodes and just add that host key to server A, but I'm not sure whether this would work.
Generate keys without passphrases on the clients you want to be able to access the server from. Then add all of their id_rsa.pub data to a single text file named authorized_keys, and put it in the server's ~/.ssh directory. Note that permissions do matter here: sshd's StrictModes (on by default) will ignore the file if it, or the ~/.ssh directory, is group- or world-writable, so use 600 for the file and 700 for the directory.
When putting the keys together, make sure each public key sits on exactly one line, with no blank lines between entries, and that the file ends with a newline. e.g.:
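A minimal sketch of that concatenation. The node*.pub filenames and key material are made up for illustration, and a local demo directory stands in for the server's real ~/.ssh:

```shell
# Sketch only: build authorized_keys from collected public keys.
# In practice you would copy each node's real ~/.ssh/id_rsa.pub over first.
mkdir -p demo/.ssh
echo "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABEXAMPLEKEY1 user@node1" > demo/node1.pub
echo "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABEXAMPLEKEY2 user@node2" > demo/node2.pub

# Concatenate: each public key occupies exactly one line.
cat demo/node1.pub demo/node2.pub > demo/.ssh/authorized_keys

# sshd's StrictModes (on by default) rejects overly permissive files,
# so lock the permissions down.
chmod 700 demo/.ssh
chmod 600 demo/.ssh/authorized_keys

wc -l < demo/.ssh/authorized_keys   # one line per key
```

On the real server the file would be ~/.ssh/authorized_keys in the account the nodes log in as.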
Please excuse my ignorance, but wouldn't this just install the nodes' public keys on server A, letting them log into server A? How would that solve the problem I'm having on server A, where I have to manually accept each node's host key before password-less login will work?
I thought your goal was to have the "nodes" able to log in to the server. Is this incorrect? If that IS the goal, all you need are the public keys in each other's authorized_keys file.
If the keys are shared in both directions, the server would of course be able to log in to the "nodes", clients, or what have you. I have never seen a need for anything beyond this. Of course you need to create the keys without passphrases; otherwise you'd have to use an agent or type the passphrase every time.
Forgive me if I misunderstand your goal. Please elaborate, describe what you want to be able to do...
I believe he wants the server to log into the nodes. Setting StrictHostKeyChecking no in /etc/ssh/ssh_config would result in keys automatically being accepted. This will, however, leave you slightly vulnerable to a MITM attack as SSH will still connect even if the host key is wrong, so I would only set it long enough to connect to each host once (so the keys get saved).
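One way to narrow that exposure (my suggestion, not something described above) is to scope the relaxed checking to the node hostnames in ~/.ssh/config instead of changing /etc/ssh/ssh_config globally. On OpenSSH 7.6 and newer, accept-new accepts keys from hosts you have never seen but still refuses changed keys, which closes the MITM window for hosts already in known_hosts:

```
# ~/.ssh/config on server A; "node*" is a placeholder host pattern
Host node*
    StrictHostKeyChecking accept-new
```

Older OpenSSH versions only understand yes/no/ask here, in which case "no" with the caveats above is the fallback.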
This is correct - I want the server to log into the nodes.
catworld: I'm sorry that I didn't make this more clear in my previous posts.
Anyway, setting StrictHostKeyChecking to "no" just long enough for the server to log into all the nodes would solve my problem, but since it leaves me open to a MITM attack it may not be the best solution (I'd rather manually log into the nodes first to save their host keys).
Are there other alternatives maybe? I read somewhere that it is possible to install multiple host keys on a Linux box, so maybe it would be possible to generate one single host key and distribute it to all the nodes, and then install the key on the server?
You probably could generate one set of SSH host keys, distribute it to all the nodes, and then build a large known_hosts file by repeating the same key entry for each hostname/IP (known_hosts stores the full public key, not just the fingerprint).
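Rather than sharing one host key across every node, the usual way to pre-populate known_hosts without logging in anywhere is ssh-keyscan. A sketch, where node1..node3 are placeholder hostnames and a local demo_ssh directory stands in for server A's real ~/.ssh:

```shell
# Collect each node's host key over the network -- no login required.
# Verify the resulting fingerprints out of band before trusting them,
# since ssh-keyscan itself is subject to the same MITM risk.
mkdir -p demo_ssh
for h in node1 node2 node3; do
    # "|| true" so one unreachable host doesn't abort the sweep
    ssh-keyscan -t ed25519 "$h" >> demo_ssh/known_hosts || true
done
# In real use, append to ~/.ssh/known_hosts instead of demo_ssh/known_hosts.
```

After this, server A can ssh to each node without ever being prompted to accept a host key, and StrictHostKeyChecking can stay on.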