Ansible: The Known Hosts Module


When a user accesses a remote node using SSH for the first time, a prompt appears asking the following:

  • This key is not known by any other names. Are you sure you want to continue connecting (yes/no/[fingerprint])?

The above warning appears because the SSH public key of the remote node is not present on the local host, or because the stored key information does not match. It becomes a problem when the local host cannot verify the remote node’s public key and therefore does not know how to respond to the warning. It is equally a problem when the SSH connection is initiated by a script, since the script stops at the warning and waits for a response before it can proceed.

The best way to avoid the warning is to place a copy of the remote node’s public key in the “/etc/ssh/ssh_known_hosts” file in advance. Ansible can do this with the “known_hosts” module.
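For reference, each entry the module manages in “/etc/ssh/ssh_known_hosts” is a single line: the host name (optionally followed by a comma and its IP), the key type, and the base64-encoded key. The IP below is a documentation placeholder and the key is truncated:

```
CentOS9test,192.0.2.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAI...
```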

The example below demonstrates the usage of the module, where the node “centosMYOBvm” tries to access another node “centOS9test” with IP

Let us try accessing the remote node when the required information is not stored in “ssh_known_hosts”. As the warning appears, let us exit without continuing.

[root@centosMYOBvm ~]# ssh user01@

===== =====
The authenticity of host ' (' can't be established.
ED25519 key fingerprint is SHA256:7SHwc8h+HmpeCxRD77QxBZJ6qHisFoSKdT2pfLlwVcc.
This host key is known by the following other names/addresses:
Are you sure you want to continue connecting (yes/no/[fingerprint])? no
Host key verification failed.

Now let us create a playbook on the Ansible server “centos9vm” with IP, and execute the playbook. Here the SSH key of CentOS9test will be sent to all managed nodes.

[root@centos9vm ~]# cat known_hosts.yml

==== ===
---
- name: Playbook to add SSH key to managed nodes
  hosts: all
  tasks:
    - name: Task to copy SSH key
      ansible.builtin.known_hosts:
        path: /etc/ssh/ssh_known_hosts
        name: CentOS9test
        key: CentOS9test, ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDFr8h1n5cj2Nb2uHSw8J2KZbg00nGZmvhee26eYH+6T

==== ==== =

[root@centos9vm ~]# ansible-navigator run -m stdout known_hosts.yml

===== ==
PLAY [Playbook to add SSH key to managed nodes] ********************************

TASK [Gathering Facts] *********************************************************
ok: []

TASK [Task to copy SSH key] ****************************************************
changed: []

PLAY RECAP *********************************************************************
: ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

==== ==

Since the playbook ran successfully, let us log in to the managed node and see if we can ssh to without getting any warning.

[root@centosMYOBvm ~]# ssh user01@

===== === ==
user01@'s password:

==== ====

An alternate way to copy the SSH key of a remote node to all managed nodes is to save the remote node’s public key in a file on the Ansible server, and replace the “key” parameter under the “ansible.builtin.known_hosts” module in the above playbook with the following line:

key: "{{ lookup('ansible.builtin.file', 'pubkeys/Centos9test1') }}"
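Put together, the modified playbook would look like the sketch below. Note that the “known_hosts” module expects the key value to begin with the host name, so it is prefixed here, since the file stores only the key type and key material; “hosts: all” and the “pubkeys/” layout are assumptions carried over from this example:

```
---
- name: Playbook to add SSH key to managed nodes
  hosts: all
  tasks:
    - name: Task to copy SSH key
      ansible.builtin.known_hosts:
        path: /etc/ssh/ssh_known_hosts
        name: CentOS9test
        key: "CentOS9test {{ lookup('ansible.builtin.file', 'pubkeys/Centos9test1') }}"
```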

[root@centos9vm ~]# cat pubkeys/Centos9test1
==== ==== ===
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDFr8h1n5cj2Nb2uHSw8J2KZbg00nGZmvhee26eYH+6T
==== ==== ==

The above key was copied from the SSH public key file on the remote node:

[root@centos9test1 ~]# cat /etc/ssh/
===== ===
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDFr8h1n5cj2Nb2uHSw8J2KZbg00nGZmvhee26eYH+6T
===== ===
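Instead of copying the key by hand, it could also be captured with ssh-keyscan; a minimal sketch, assuming the remote node “CentOS9test” is reachable by that name:

```shell
# Create the directory that holds collected public keys.
mkdir -p pubkeys

# Fetch the remote node's ed25519 host key. ssh-keyscan prints lines of the
# form "hostname key-type base64-key"; awk keeps only the key type and key,
# matching the file format shown above. "CentOS9test" is this example's
# hostname and is assumed to resolve.
ssh-keyscan -t ed25519 CentOS9test 2>/dev/null | awk '{print $2, $3}' > pubkeys/Centos9test1
```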