SSH RSA key fingerprints with a NATed network
Hi,
I have a network with several clients running sshd, and a different external port on the firewall forwarded to port 22 on each client. I was just wondering if there is any way to cope with the hassle of RSA key fingerprints in this situation. Whenever I ssh from the outside to a different machine within the network, I have to manually remove ~/.ssh/known_hosts first. Any ideas? |
I'm not sure I understand the dilemma, unless you're using
different users with the same IP address? Just can't seem to get my head around this problem. Perhaps you can give an example? If it's a user issue, you can use "ssh -p 22 username@IP" where username is the user on the target machine that you ssh into. |
It's a NATed network which I am trying to reach from outside. I have several clients running sshd, so I configured the firewall to redirect different ports to different clients (including the firewall box itself). Since I am sshing to the same IP no matter which computer I am trying to reach, I get different fingerprints. I hoped that there would be a way around this, like making the fingerprint IP/port-specific instead of just IP-specific.
|
Why not push different ports for each sshd through your firewall?
|
Yes, that's what I've done. I have several ports open from outside on the firewall, pointing to port 22 on different clients. But here's the problem.
Client A can be reached from outside the network on port 1234. Client B can be reached from outside the network on port 4321.

I run "ssh xx.xx.xx.xx -p 1234" and it creates a fingerprint. Next time I run "ssh xx.xx.xx.xx -p 4321" and now the fingerprint is wrong. So I have to rm ~/.ssh/known_hosts and run "ssh xx.xx.xx.xx -p 4321" again to create a new fingerprint for client B. You see the hassle? The fingerprint does not care about me using different ports; it is the same IP address, and therefore the fingerprint is wrong.

I would like to have something like this: xx.xx.xx.xx:1234 uses fingerprint A, and xx.xx.xx.xx:4321 uses fingerprint B. Then whenever I ssh to either client A or B, it uses the correct fingerprint for that client, and I don't have to manually remove known_hosts and recreate the fingerprint each time I ssh to a different client. |
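One small mitigation while the per-port problem stands: known_hosts is a plain line-based text file, so you can delete just the stale entry instead of removing the whole file (newer OpenSSH releases also provide "ssh-keygen -R hostname" for exactly this). A hypothetical sketch using fake placeholder keys, not real ones:

```shell
# known_hosts holds one entry per line: hostname, key type, base64 key.
# The key data below is fake, for illustration only.
cat > /tmp/known_hosts.demo <<'EOF'
xx.xx.xx.xx ssh-rsa AAAAB3...fingerprintA
other.example.org ssh-rsa AAAAB3...fingerprintB
EOF

# Remove only the stale entry for xx.xx.xx.xx, keeping everything else:
grep -v '^xx\.xx\.xx\.xx ' /tmp/known_hosts.demo > /tmp/known_hosts.cleaned
cat /tmp/known_hosts.cleaned
```

This way the other hosts you have already verified stay verified; only the conflicting entry is re-learned on the next connection.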
I'm currently seeking this solution also, as I have many SSH daemons in a NAT'd LAN (behind one public IP address) which I would like to access independently, without going through a central daemon (SSH'ing to one, then hopping to another, to dodge the known_hosts issue).
|
Either:
(1) only ssh into one machine, then ssh from that machine to your desired target elsewhere in the LAN, or
(2) set up different user IDs on your computer and su to the appropriate user ID in order to ssh into the appropriate box. For instance, if the desired target computer is named BigBox, then set up a user ID BigBoxID on your machine, and from a shell window su to BigBoxID in order to ssh into BigBox. This is how I do it. |
You don't even have to su to BigBox if you do like I posted above ^^^ and add "username@IP". In that example you ssh into whatever port as username@IP. I do this when working on client networks remotely and I need something off my server. It allows me to ssh into a non-standard port (the only one open in my router) on my server. If you have the username on the machine to which you login, it should not require a new key each time. Since the ssh key is created per username on host, this seems to be a solution. |
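A lighter-weight alternative to maintaining a separate local user per target, assuming your OpenSSH client supports the UserKnownHostsFile option: point each connection at its own known_hosts file, so the fingerprints can never collide. A sketch (file names and ports are illustrative):

```
ssh -p 1234 -o UserKnownHostsFile=~/.ssh/known_hosts.clientA user@xx.xx.xx.xx
ssh -p 4321 -o UserKnownHostsFile=~/.ssh/known_hosts.clientB user@xx.xx.xx.xx
```

Each file then holds only that one client's key, so the entries never overwrite each other, without creating any extra user accounts.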
What if I have the same nickname on all different machines within the LAN?
So (commands issued from 81.3.2.1, which is outside the NATed network where 217.1.1.1 acts as the gateway):

ephracis $ ssh ephracis@217.1.1.1 (normal)
ephracis $ ssh ephracis@217.1.1.1 -p 1002 -> ephracis@192.168.0.2:22
ephracis $ ssh ephracis@217.1.1.1 -p 1003 -> ephracis@192.168.0.3:22
ephracis $ ssh ephracis@217.1.1.1 -p 1004 -> ephracis@192.168.0.4:22
ephracis $ ssh ephracis@217.1.1.1 -p 1005 -> ephracis@192.168.0.5:22

Would they "share" the same entry in known_hosts and therefore require me to remove .ssh/known_hosts every time I need to access a different NATed computer on the LAN? I understand that using a different local user to ssh into each machine would create a different ~/.ssh/known_hosts and work, but it seems like a waste to create a user just for that. Can't I somehow separate the fingerprints for each port on 217.1.1.1? Or doesn't OpenSSH support that? If so, is there a good reason for that? |
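If your OpenSSH client supports the HostKeyAlias option, you can make ssh record each NATed box under its own name rather than under the shared IP. A hypothetical ~/.ssh/config sketch using the addresses from the example above (the Host names box2/box3 are made up):

```
Host box2
    HostName 217.1.1.1
    Port 1002
    User ephracis
    HostKeyAlias box2

Host box3
    HostName 217.1.1.1
    Port 1003
    User ephracis
    HostKeyAlias box3
```

Then "ssh box2" and "ssh box3" store separate known_hosts entries under the aliases box2 and box3, even though both connect to 217.1.1.1, so the fingerprints no longer fight over one entry.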
It is my understanding that the fingerprint is created for the user. If that's so, you can login to any port with that username and use the same key. Why don't you just try it?
NB: You must have ephracis on each separate machine. |
Quote:
But since I have several different machines that I ssh into (which are located inside the NATed network), I have several different fingerprints. How do these fingerprints work, and how are they stored in known_hosts? Because as it is now, these different fingerprints overwrite each other in known_hosts. They can't live side by side (not if they share the same IP, and that's exactly what machines behind a NATed gateway do). So since all my machines share the same IP, they share the same entry in known_hosts, giving me a headache. Is there a way to distinguish them, so that each fingerprint (for each machine within the NATed network) can exist in known_hosts at the same time, with ssh using the right fingerprint for the right machine (they are still all accessed via the same public IP from my machine, outside the network)? Is this supported, or at least possible to implement? I am no expert in the security that known_hosts provides, but it sure gives me problems when I have several fingerprints for one IP. |
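For reference, a known_hosts entry is one line per host key: a hostname (or comma-separated list of names), the key type, and the base64-encoded public key. If I recall the release history correctly, newer OpenSSH clients (roughly the 4.x series onward) record connections on non-default ports under a [host]:port name, which keeps per-port entries separate and solves exactly this problem. An illustration with fake key data:

```
217.1.1.1 ssh-rsa AAAAB3NzaC1yc2EA...           key learned on the default port
[217.1.1.1]:1002 ssh-rsa AAAAB3NzaC1yc2EA...    per-port entry (newer clients)
[217.1.1.1]:1003 ssh-rsa AAAAB3NzaC1yc2EA...    per-port entry (newer clients)
```

With that format, the entries for ports 1002 and 1003 coexist on separate lines and never overwrite each other.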
I think what coolb posted way back might be the answer. For my LAN I have port forwarding setup to specific boxen -- I think port forwarding is what you want.
From the following output you supplied: Code:
(cmds issued from 81.3.2.1 which is outside the NATed network where 217.1.1.1 acts as the gateway)
forward ssh on port 1002 to 192.168.0.2
forward ssh on port 1003 to 192.168.0.3
forward ssh on port 1004 to 192.168.0.4
forward ssh on port 1005 to 192.168.0.5

Then you would login as I wrote before, i.e. "ssh -p 1002 ephracis@217.1.1.1", which means the user ephracis logs in to 217.1.1.1 on port 1002, which forwards to the computer on your LAN at 192.168.0.2. And have you read this from "man ssh"? Code:
-L [bind_address:]port:host:hostport |
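For completeness, -L sets up local port forwarding through a host you can already reach. A sketch of how it could be used here, with the addresses from the example above:

```
# Forward local port 2222 through the gateway to client 192.168.0.2's sshd:
ssh -L 2222:192.168.0.2:22 ephracis@217.1.1.1

# Then, in another shell, connect through the tunnel:
ssh -p 2222 ephracis@localhost
```

Note that this only moves the problem unless your client records the tunnel endpoint with its port: the localhost connections can still collide in known_hosts on older clients.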
|
a solution
Quote:
To make this attack harder, ssh stores the fingerprint of the server's public key on the first connection attempt. You will see a prompt like: Code:
The authenticity of host 'eisen (137.43.366.64)' can't be established. Code:
Warning: Permanently added 'eisen,137.43.366.64' (RSA) to the list of known hosts.
However, I found a solution: if there are several different fingerprints in known_hosts for the same host (IP), ssh will connect as long as at least one of them matches. So what you should do is Code:
# 1.) move your known_hosts file to a different filename
mv ~/.ssh/known_hosts ~/.ssh/known_hosts.old
# 2.) ssh to the first client; confirm the prompt so a fresh known_hosts is
#     created holding just that client's key, then set the file aside
ssh -p 1002 ephracis@217.1.1.1
mv ~/.ssh/known_hosts ~/.ssh/known_hosts.1002
# 3.) repeat step 2 for each of the other ports/clients
# 4.) concatenate all the collected files into a single known_hosts
cat ~/.ssh/known_hosts.old ~/.ssh/known_hosts.1* > ~/.ssh/known_hosts
The above approach worked with my ssh (version OpenSSH_3.8.1p1 Debian-8.sarge.4, OpenSSL 0.9.7e 25 Oct 2004). I hope it works for you also. Regards, Lotharster |