Ansible: The Known Hosts Module


When a user accesses a remote node using SSH for the first time, a prompt appears asking the following:

  • This key is not known by any other names. Are you sure you want to continue connecting (yes/no/[fingerprint])?

The above warning appears because the SSH public key of the remote node is not present on the local host, or because the stored key does not match the one the remote node presented. The warning becomes a problem when the local host has no way to verify the remote node's public key and therefore cannot safely answer the prompt. It is also a problem when SSH is initiated from a script: the script stalls at the warning, since a response is needed before the connection can proceed.
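
This is easy to reproduce: running ssh with BatchMode enabled (which suppresses all interactive prompts, much as a script does) against a host whose key is not yet known fails immediately. A hedged sketch using the same hosts as this example:

==========
[root@centosMYOBvm ~]# ssh -o BatchMode=yes user01@192.168.48.131
Host key verification failed.
==========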

The best way to avoid the warning is to place a copy of the remote node's public key in the /etc/ssh/ssh_known_hosts file in advance. Ansible helps to do this with the "known_hosts" module.
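
If a copy of the remote node's public key is not already at hand, one common way to retrieve it is ssh-keyscan, which prints the key in exactly the host-prefixed format that ssh_known_hosts expects. A sketch, using the hosts from this example (run from any machine that can reach the remote node):

==========
[root@centos9vm ~]# ssh-keyscan -t ed25519 192.168.48.131
192.168.48.131 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIfJHPAw4d9YwMDGBxO2J93mYfakFIZo+BDlWDooOxyq
==========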

The example below demonstrates usage of the module, where the node "centosMYOBvm" tries to access another node, "CentOS9test", with IP 192.168.48.131.

Let us first try accessing the remote node while the required information is not stored in "ssh_known_hosts". When the warning appears, we exit without continuing.

[root@centosMYOBvm ~]# ssh user01@192.168.48.131

==========
The authenticity of host '192.168.48.131 (192.168.48.131)' can't be established.
ED25519 key fingerprint is SHA256:7SHwc8h+HmpeCxRD77QxBZJ6qHisFoSKdT2pfLlwVcc.
This host key is known by the following other names/addresses:
~/.ssh/known_hosts:1: 192.168.132.129
Are you sure you want to continue connecting (yes/no/[fingerprint])? no
Host key verification failed.
==========
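
Before answering "yes" to such a prompt, the fingerprint should be verified out of band, for example by running ssh-keygen on the remote node itself (over its console) and comparing the SHA256 value with the one in the warning. A sketch; the prompt hostname and comment field are illustrative:

==========
[root@centos9test ~]# ssh-keygen -lf /etc/ssh/ssh_host_ed25519_key.pub
256 SHA256:7SHwc8h+HmpeCxRD77QxBZJ6qHisFoSKdT2pfLlwVcc root@centos9test (ED25519)
==========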

Now let us create a playbook on the Ansible server "centos9vm" with IP 192.168.132.128 and execute it. Here the SSH key of CentOS9test will be sent to all managed nodes.

[root@centos9vm ~]# cat known_hosts.yml

==========
---
- name: Playbook to add SSH key to managed nodes
  hosts: 192.168.48.129
  tasks:
    - name: Task to copy SSH key
      ansible.builtin.known_hosts:
        name: 192.168.48.131
        path: /etc/ssh/ssh_known_hosts
        key: CentOS9test,192.168.48.131 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIfJHPAw4d9YwMDGBxO2J93mYfakFIZo+BDlWDooOxyq
==========
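
The module is idempotent, so re-running the playbook reports "ok" instead of "changed" once the entry exists. If a host key ever needs to be removed (for example, after the remote node is reinstalled and generates a new key), the same module handles that with "state: absent". A minimal task sketch:

==========
- name: Task to remove a stale SSH key
  ansible.builtin.known_hosts:
    name: 192.168.48.131
    path: /etc/ssh/ssh_known_hosts
    state: absent
==========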

[root@centos9vm ~]# ansible-navigator run -m stdout known_hosts.yml

==========
PLAY [Playbook to add SSH key to managed nodes] ********************************

TASK [Gathering Facts] *********************************************************
ok: [192.168.48.129]

TASK [Task to copy SSH key] ****************************************************
changed: [192.168.48.129]

PLAY RECAP *********************************************************************
192.168.48.129 : ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
==========

Since the playbook ran successfully, let us log in to the managed node and see if we can SSH to 192.168.48.131 without getting any warning.

[root@centosMYOBvm ~]# ssh user01@192.168.48.131

==========
user01@192.168.48.131's password:
==========
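
No warning appears; the connection goes straight to password authentication. The entry written by the playbook can also be inspected directly on the managed node using ssh-keygen -F, which searches a known_hosts file for a given host. A sketch of the expected result:

==========
[root@centosMYOBvm ~]# ssh-keygen -F 192.168.48.131 -f /etc/ssh/ssh_known_hosts
# Host 192.168.48.131 found: line 1
CentOS9test,192.168.48.131 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIfJHPAw4d9YwMDGBxO2J93mYfakFIZo+BDlWDooOxyq
==========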

An alternative way to copy the SSH key of a remote node to all managed nodes is to save the remote node's public key in a file on the Ansible server and replace the "key" parameter under the "ansible.builtin.known_hosts" module in the above playbook with the following line:

key: "{{ lookup('ansible.builtin.file', 'pubkeys/Centos9test1') }}"

[root@centos9vm ~]# cat pubkeys/Centos9test1
==========
192.168.48.132 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDFr8h1n5cj2Nb2uHSw8J2KZbg00nGZmvhee26eYH+6T
==========

The above key was copied from the SSH host public key file on 192.168.48.132; note that the IP address has been prepended, since the known_hosts format requires a host field in front of the key type:

[root@centos9test1 ~]# cat /etc/ssh/ssh_host_ed25519_key.pub
==========
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDFr8h1n5cj2Nb2uHSw8J2KZbg00nGZmvhee26eYH+6T
==========
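
Finally, if maintaining key files by hand is inconvenient, the key can be fetched at playbook run time instead. A hedged sketch combining ssh-keyscan with the "ansible.builtin.pipe" lookup (this assumes ssh-keyscan is available in the execution environment, and note that it trusts whatever key the network returns at scan time, so it is less secure than distributing a pre-verified key):

==========
- name: Task to copy SSH key fetched with ssh-keyscan
  ansible.builtin.known_hosts:
    name: 192.168.48.132
    path: /etc/ssh/ssh_known_hosts
    key: "{{ lookup('ansible.builtin.pipe', 'ssh-keyscan -t ed25519 192.168.48.132') }}"
==========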