My goal is to execute a shell command on different hosts, with credentials given directly from the playbook. So far I have tried two things.
Attempt 1:
- name: test
  shell:
    cmd: "whoami"
  with_items: "{{ lookup('file', '../files/deviceList.txt').splitlines() }}"
  delegate_to: "{{ item.split(';')[0] }}"
  args:
    ansible_connection: network_cli
    ansible_network_os: ios
    ansible_user: "{{ cred_ios_r_user }}"
    ansible_password: "{{ cred_ios_r_pass }}"
Attempt 2:
- name: set default credentials
  set_fact:
    ansible_connection: network_cli
    ansible_network_os: ios
    ansible_user: "{{ cred_ios_r_user }}"
    ansible_password: "{{ cred_ios_r_pass }}"
  with_items: "{{ lookup('file', '../files/deviceList.txt').splitlines() }}"
  delegate_to: "{{ item.split(';')[0] }}"
- name: test
  shell:
    cmd: "whoami"
  with_items: "{{ lookup('file', '../files/deviceList.txt').splitlines() }}"
  delegate_to: "{{ item.split(';')[0] }}"
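(For context: each line in deviceList.txt starts with the target hostname before the first semicolon, which is why the tasks use item.split(';')[0]. The content shown below is hypothetical:)

router1;ios
router2;ios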
The username in the variable {{ cred_ios_r_user }} is 'User1', but when I look at the output from Ansible, it tells me it used the default SSH user named "SSH_User".
What do I need to change so that Ansible uses the given credentials?
CodePudding user response:
The first thing you should check is the "remote_user" attribute of the module:
- name: DigitalOcean | Disallow root SSH access
  remote_user: root
  lineinfile: dest=/etc/ssh/sshd_config
              regexp="^PermitRootLogin"
              line="PermitRootLogin no"
              state=present
  notify: Restart ssh
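Note that notify: Restart ssh only fires if a matching handler exists; a minimal sketch (the service may be named sshd on some distributions):

handlers:
  - name: Restart ssh
    service:
      name: ssh
      state: restarted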
The next thing is SSH keys. By default, your connections use your own private key, but you can override it with extra args on the ansible-playbook command, with variables in the inventory file, or with job vars:
vars:
  ansible_ssh_user: "root"
  ansible_ssh_private_key_file: "{{ role_path }}/files/ansible.key"
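For instance, the same override can be passed at runtime with --extra-vars (the playbook name, host name, and key path below are hypothetical):

ansible-playbook site.yml \
  -e ansible_ssh_user=root \
  -e ansible_ssh_private_key_file=/path/to/ansible.key

or set per host in an INI inventory file:

myhost ansible_ssh_user=root ansible_ssh_private_key_file=/path/to/ansible.key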
So, if you want, you can set custom keys that you keep in vars or files:
- name: DigitalOcean | Add Pub key
  remote_user: root
  authorized_key:
    user: "{{ do_user }}"
    key: "{{ lookup('file', do_key_public) }}"
    state: present
You can find more information in my auto-droplet-digitalocean-role.
CodePudding user response:
Your vars should be... vars, available to your task, not arguments passed to the shell module. So in a nutshell:
- name: test
  vars:
    ansible_connection: network_cli
    ansible_network_os: ios
    ansible_user: "{{ cred_ios_r_user }}"
    ansible_password: "{{ cred_ios_r_pass }}"
  shell:
    cmd: "whoami"
  with_items: "{{ lookup('file', '../files/deviceList.txt').splitlines() }}"
  delegate_to: "{{ item.split(';')[0] }}"
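If you want to verify which account actually ran the command, register the loop results on the task above (e.g. register: who) and print them afterwards (a minimal sketch; who is an assumed variable name):

- name: show which user ran the command
  debug:
    msg: "{{ who.results | map(attribute='stdout') | list }}"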
Meanwhile, this looks like bad practice. You would normally use add_host to build an in-memory inventory with the correct hosts and credentials, and then run your task using the natural host loop of the play. Something like:
- name: manage hosts
  hosts: localhost
  gather_facts: false

  tasks:
    - name: add hosts to inventory
      add_host:
        name: "{{ item.split(';')[0] }}"
        groups:
          - my_custom_group
        ansible_connection: network_cli
        ansible_network_os: ios
        ansible_user: "{{ cred_ios_r_user }}"
        ansible_password: "{{ cred_ios_r_pass }}"
      with_items: "{{ lookup('file', '../files/deviceList.txt').splitlines() }}"

- name: run my command on all added hosts
  hosts: my_custom_group
  gather_facts: false

  tasks:
    - name: test command
      shell: whoami
Alternatively, you can create a full inventory from scratch from your CSV file and use it directly, or use/develop an inventory plugin that will read your CSV file directly (like in this example).
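For the static-inventory route, a minimal sketch of what such an inventory could look like in YAML (the file name inventory.yml and the host names are hypothetical; the credential variables still have to be defined somewhere, e.g. in group_vars or a vault):

all:
  children:
    my_custom_group:
      hosts:
        router1:
        router2:
      vars:
        ansible_connection: network_cli
        ansible_network_os: ios
        ansible_user: "{{ cred_ios_r_user }}"
        ansible_password: "{{ cred_ios_r_pass }}"

You would then run your play against it with ansible-playbook -i inventory.yml playbook.yml.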