I'm trying to import multiple JSON files into my Postgres database with the following code.
import json
import psycopg2
con = psycopg2.connect(database="kw_du", user="postgres", password="f67dd8hzFE5",host="localhost")
cur = con.cursor()
with open(r"C:\Users\diego\OneDrive\Desktop\Workplace\220906\nu_2022_09_01.json", 'r') as data_file:
    data = json.load(data_file)

with con.cursor() as cur:
    cur.executemany('INSERT INTO test_nu(vp_ip, system_activ, vpm_activ) VALUES(%s)', [(json.dumps(d),) for d in data])

con.commit()
con.close()
The JSON file has the following structure:
[{"vp_id":1,"system_activ":1,"vpm_activ":1},{"vp_id":2,"system_activ":2,"vpm_activ":2}]
When executing the code I get the following error message:
line 13, in cur.executemany('INSERT INTO test_nu (vp_id, system_aktiv) VALUES(%s)', [(json.dumps(d),) for d in data]) psycopg2.errors.SyntaxError: ERROR: INSERT has more target columns than expressions LINE 1: INSERT INTO test_nu (vp_id, system_aktiv) VALUES('{"vp_id": ...
I've seen that the ERROR: INSERT has more target columns than expressions can have various causes, but I was not able to solve this problem. Could the ' or {} be a problem? It seems like the JSON object doesn't get parsed: VALUES('{"vp_id": .... I'd appreciate any hint.
CodePudding user response:
You can use json_populate_recordset. Refer to the sample query below -
insert into test_nu (vp_id, system_activ, vpm_activ)
select *
from json_populate_recordset(NULL::test_nu, '[{"vp_id":1,"system_activ":"eewrw","vpm_activ":"qeqerqr"}]')
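From Python, this can be driven with a single parameterized execute, passing the whole JSON array as one string. A minimal sketch (the INSERT_SQL constant and insert_records helper are names I've made up for illustration; the table and columns are from the question):

```python
import json

# Single-round-trip insert: Postgres expands the JSON array into rows.
# The one %s placeholder receives the whole array as a JSON string.
INSERT_SQL = (
    "insert into test_nu (vp_id, system_activ, vpm_activ) "
    "select vp_id, system_activ, vpm_activ "
    "from json_populate_recordset(null::test_nu, %s)"
)

def insert_records(con, records):
    """Insert a list of dicts (as loaded by json.load) in one statement.

    con is an open psycopg2 connection.
    """
    with con.cursor() as cur:
        cur.execute(INSERT_SQL, (json.dumps(records),))
    con.commit()
```

With the question's setup this would be called as insert_records(con, data) right after data = json.load(data_file).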
CodePudding user response:
You can save a number of round-trips to the database with the following:
with expand as (
  select (j->>'vp_id')::int        as vp_id,
         (j->>'system_activ')::int as system_activ,
         (j->>'vpm_activ')::int    as vpm_activ
    from jsonb_array_elements((%s)::jsonb) as el(j)
)
insert into test_nu (vp_id, system_activ, vpm_activ)
select vp_id, system_activ, vpm_activ
  from expand;
Pass json.dumps(data) to this query as the single %s parameter.
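Wired up with psycopg2, that looks roughly like the following sketch (EXPAND_SQL and bulk_insert are illustrative names, not from the answer; the query body is the answer's CTE):

```python
import json

# The answer's CTE as a parameterized statement: the single %s
# receives the entire JSON array, which jsonb_array_elements unnests.
EXPAND_SQL = """
with expand as (
  select (j->>'vp_id')::int        as vp_id,
         (j->>'system_activ')::int as system_activ,
         (j->>'vpm_activ')::int    as vpm_activ
    from jsonb_array_elements((%s)::jsonb) as el(j)
)
insert into test_nu (vp_id, system_activ, vpm_activ)
select vp_id, system_activ, vpm_activ
  from expand
"""

def bulk_insert(con, data):
    """data is the list loaded with json.load(); one execute, one round-trip."""
    with con.cursor() as cur:
        cur.execute(EXPAND_SQL, (json.dumps(data),))
    con.commit()
```

Unlike executemany, which issues one statement per row, this sends the whole file in a single statement.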