I have been trying to update the values of dictionaries in a destination JSON file with the values of the matching dictionaries in a source JSON file. Below is an example of the source and destination JSON files:
Source file:
[
{
"key": "MYSQL",
"value": "456"
},
{
"key": "RDS",
"value": "123"
}
]
Destination File:
[
{
"key": "MYSQL",
"value": "100"
},
{
"key": "RDS",
"value": "111"
},
{
"key": "DB1",
"value": "TestDB"
},
{
"key": "OS",
"value": "EX1"
}
]
Expectation in destination file after running Ansible playbook:
[
{
"key": "MYSQL",
"value": "**456**"
},
{
"key": "RDS",
"value": "**123**"
},
{
"key": "DB1",
"value": "TestDB"
},
{
"key": "OS",
"value": "EX1"
}
]
Below is the playbook I have tried so far, but it only updates the value if it is hard-coded:
- hosts: localhost
  tasks:
    - name: Parse JSON
      shell: cat Source.json
      register: result

    - name: Save JSON data to a variable
      set_fact:
        jsondata: "{{ result.stdout | from_json }}"

    - name: Get key names
      set_fact:
        json_key: "{{ jsondata | map(attribute='key') | flatten }}"

    - name: Get value names
      set_fact:
        json_value: "{{ jsondata | map(attribute='value') | flatten }}"

    # Trying to update the destination file with only the values provided in Source.json
    - name: Replace values in JSON
      replace:
        path: Destination.json
        regexp: '"{{ item }}": "100"'
        replace: '"{{ item }}": "456"'
      loop:
        - value
The main goal is to update the values in Destination.json with the values provided in Source.json.
CodePudding user response:
Without knowing the structure of your destination file, it's difficult to use a regex.
I suggest you load your destination file into a variable, make the changes, and save the variable's content back to the file.
This solution does the job:
- hosts: localhost
  tasks:
    - name: Parse JSON
      set_fact:
        result: [ { "key": "MYSQL", "value": "456" }, { "key": "RDS", "value": "123" } ]

    - name: Parse JSON
      set_fact:
        json_old: [ { "key": "MYSQL", "value": "100" }, { "key": "RDS", "value": "111" }, { "key": "DB1", "value": "TestDB" }, { "key": "OS", "value": "EX1" } ]

    - name: Create new json
      set_fact:
        json_new: "{{ json_new | d([]) + ([item] if _rec == [] else [_rec]) | flatten }}"
      loop: "{{ json_old }}"
      vars:
        _rec: "{{ result | selectattr('key', 'equalto', item.key) }}"

    - name: Display json_new
      debug:
        msg: "{{ json_new }}"
Result:
ok: [localhost] => {
"msg": [
{
"key": "MYSQL",
"value": "456"
},
{
"key": "RDS",
"value": "123"
},
{
"key": "DB1",
"value": "TestDB"
},
{
"key": "OS",
"value": "EX1"
}
]
}
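The merge-by-key logic that the loop above performs can be sketched in plain Python (a hypothetical standalone helper, not part of the playbook): entries from the source list override destination entries with the same key, and all other destination entries pass through unchanged.

```python
def merge_by_key(source, destination):
    # Build a lookup of overriding values from the source list,
    # mirroring what selectattr('key', 'equalto', item.key) finds per item.
    overrides = {d["key"]: d["value"] for d in source}
    # Keep destination order; swap in the source value when the key matches.
    return [
        {"key": d["key"], "value": overrides.get(d["key"], d["value"])}
        for d in destination
    ]

source = [{"key": "MYSQL", "value": "456"}, {"key": "RDS", "value": "123"}]
destination = [
    {"key": "MYSQL", "value": "100"},
    {"key": "RDS", "value": "111"},
    {"key": "DB1", "value": "TestDB"},
    {"key": "OS", "value": "EX1"},
]
print(merge_by_key(source, destination))
```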
You could use your_json_feed: "{{ lookup('file', 'path') | from_json }}" to load a JSON file into a variable. And to save it back:
- copy: content="{{ your_json_feed }}" dest=/path/to/destination/file
CodePudding user response:
In Ansible, key/value pairs tend to be handled with the dict2items and items2dict filters, and your use case can be covered by those filters.
Here is the logic:
- Read both files
- Convert both lists into dictionaries, with items2dict
- Combine the two dictionaries, with the combine filter
- Convert the dictionary back into a list, with dict2items
- Dump the result as JSON back into the file
Given the playbook:
- hosts: localhost
  gather_facts: no
  tasks:
    - shell: cat Source.json
      register: source

    - shell: cat Destination.json
      register: destination

    - copy:
        content: >-
          {{
            destination.stdout | from_json | items2dict |
            combine(
              source.stdout | from_json | items2dict
            ) | dict2items | to_nice_json
          }}
        dest: Destination.json
We end up with Destination.json containing:
[{"key": "MYSQL", "value": "456"}, {"key": "RDS", "value": "123"}, {"key": "DB1", "value": "TestDB"}, {"key": "OS", "value": "EX1"}]
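The filter pipeline above can be mirrored step by step in plain Python, which may help if you are debugging the Jinja2 expression (this is an illustrative sketch, not part of the playbook):

```python
import json

source = [{"key": "MYSQL", "value": "456"}, {"key": "RDS", "value": "123"}]
destination = [
    {"key": "MYSQL", "value": "100"},
    {"key": "RDS", "value": "111"},
    {"key": "DB1", "value": "TestDB"},
    {"key": "OS", "value": "EX1"},
]

# items2dict: turn a list of {key, value} mappings into a plain dict
dst = {d["key"]: d["value"] for d in destination}
src = {d["key"]: d["value"] for d in source}

# combine: keys from the source dict override the destination dict
dst.update(src)

# dict2items: turn the dict back into a list of {key, value} mappings
result = [{"key": k, "value": v} for k, v in dst.items()]
print(json.dumps(result))
```

Because combine overrides by key, any keys present only in the destination (DB1, OS here) survive untouched.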