
Automatically populating .ssh/known_hosts - cperciva
http://www.daemonology.net/blog/2012-01-16-automatically-populating-ssh-known-hosts.html
======
mike-cardwell
Alternatively, stick "VerifyHostKeyDNS yes" in your OpenSSH client
configuration, then add SSHFP DNS records to your domain, and configure
DNSSEC.

That way, OpenSSH will automate fingerprint verification by looking it up in
the DNS.

If you have DNSSEC set up and you run the following command:

    ssh -o "VerifyHostKeyDNS yes" grepular.com

You'll notice that it doesn't ask you to verify the fingerprint.

    mike@alfa:~$ dig +short sshfp grepular.com
    2 1 D027033D70738EEF8F2D30ED4B0D507C99D35BB1
    1 1 4BDBAB48F0CE98D51FCA81AC7C915B00B20993BF
    mike@alfa:~$
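
The SSHFP records themselves can be produced straight from a host's public
keys with `ssh-keygen -r`. A minimal sketch (the hostname is grepular.com as
above, but the throwaway key stands in for the server's real
/etc/ssh/ssh_host_*_key.pub):

```shell
# Generate a throwaway host key purely for illustration; on a real server
# you would point -f at the existing host key instead.
ssh-keygen -t ed25519 -N '' -f ./demo_host_key -q

# Emit SSHFP resource records for that key, ready to paste into the zone.
ssh-keygen -r grepular.com -f ./demo_host_key.pub
```

The output lines are zone-file format ("grepular.com IN SSHFP ..."), so they
can go into the signed zone as-is.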

~~~
cperciva
That works well for long-lived hosts, but it doesn't solve the VM problem,
since you need to get the fingerprints into DNS in the first place.
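
One way to close that loop is to have the VM publish its own records at first
boot via an RFC 2136 dynamic update. A sketch, where the hostname, zone,
nameserver, and TSIG key path are all placeholders (and the throwaway key
stands in for /etc/ssh/ssh_host_ed25519_key.pub):

```shell
# Stand-in for the host key the VM generated at first boot.
ssh-keygen -t ed25519 -N '' -f ./vm_host_key -q

# Turn the key into "update add" statements with a 1-hour TTL.
ssh-keygen -r vm42.example.com -f ./vm_host_key.pub \
    | sed 's/ IN SSHFP / 3600 IN SSHFP /; s/^/update add /' > sshfp.update
cat sshfp.update

# Then send them as a TSIG-signed dynamic update (requires a nameserver
# configured to accept updates for the zone):
#   { echo "server ns1.example.com"; echo "zone example.com";
#     cat sshfp.update; echo "send"; } | nsupdate -k /etc/dnssec/update.key
```

Of course this just moves the trust question to the update key, but that key
can be baked into the VM image once rather than per-host.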

------
xxjaba
If you use a system integration framework like chef or puppet there are some
really neat ways of configuring your known-hosts files during convergence. For
example, the ssh_known_hosts cookbook from opscode will automatically populate
the ssh_known_hosts file with the information for all nodes in a particular
environment (<https://github.com/cookbooks/ssh_known_hosts>). Of course this
is only relevant if you are using one of those tools ...
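
Without chef or puppet, the low-tech equivalent is `ssh-keyscan`, with the
caveat that it does no verification itself, so it only makes sense when run
over a path you already trust. Hostnames here are placeholders:

```shell
# Collect ed25519 host keys for a batch of hosts and append them to
# known_hosts. ssh-keyscan trusts whatever answers, so only do this
# over a network path you already trust.
ssh-keyscan -t ed25519 host1.example.com host2.example.com 2>/dev/null \
    >> ~/.ssh/known_hosts

# Optionally hash the hostnames in place (original kept as known_hosts.old).
ssh-keygen -H -f ~/.ssh/known_hosts
```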

------
ef4
What I really want is decent PKI for ssh host keys so I can verify a key once,
sign it, and have it respected by all other users/machines without further
intervention.

I may try out MonkeySphere; is anybody already using it?

Edit to add: MonkeySphere doesn't support options in authorized_keys, which is
a bummer.
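
That verify-once-sign-once workflow is roughly what OpenSSH's own host
certificates (available since OpenSSH 5.4) provide. A sketch with placeholder
names and throwaway keys:

```shell
# 1. Create a CA keypair (done once; the private half stays offline).
ssh-keygen -t ed25519 -N '' -f ./host_ca -q

# 2. After verifying a host's key out of band, sign it as a host cert.
ssh-keygen -t ed25519 -N '' -f ./demo_host_key -q   # stand-in host key
ssh-keygen -s ./host_ca -I demo-host -h -n demo.example.com ./demo_host_key.pub
# This writes ./demo_host_key-cert.pub, which the host serves via
# HostCertificate in sshd_config.

# 3. Each client trusts the CA once with a @cert-authority known_hosts
# line; every certified host is then accepted without a prompt.
printf '@cert-authority *.example.com %s\n' "$(cut -d' ' -f1-2 < ./host_ca.pub)"
```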

------
jaylevitt
I've been wrestling with this very problem this week. I like the
console-output idea, but doesn't console output take a good 5-10 minutes to
show up?

~~~
cperciva
3-4 minutes, but yes, it's rather annoying.

