Indices issue loading lists into Postgres with Python


My code takes input from a JSON file and parses the JSON object into lists. That part works, but when I try to insert those lists into SQL with the executemany command it throws this error:

 string indices must be integers

Checking with print statements shows the lists are, in fact, lists and are loaded correctly. I'm not sure how to proceed.
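(As background, not from the original post: this TypeError generally means something is being indexed with a string key while it is itself a plain string. With %(name)s placeholders, the same thing happens if an element of the sequence passed to executemany is a string, for example unparsed JSON, instead of a dict. A minimal illustration, independent of the database; the names below are made up:)

    import json

    row = '{"item_name": "widget"}'   # a JSON string: prints like a dict but is a str

    try:
        row["item_name"]              # indexing a str with a string key
    except TypeError as exc:
        print(exc)                    # "string indices must be integers"

    parsed = json.loads(row)          # the parsed dict indexes fine
    print(parsed["item_name"])        # widget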

Here is my code:

 def insert_item_manifest(location_name_name, item_json):  

     item_dictionary = json.loads(item_json)
     item_name_list = list(item_dictionary.keys())

     vals = []

     for i in range(1, len(item_name_list) + 1):
         vals.append({'location_name_name': location_name_name,
                      'item_name': item_name_list[i-1],
                      'item_unique_id': item_dictionary[item_name_list[i-1]][0],
                      'item_address': item_dictionary[item_name_list[i-1]][1]
                      })

     sql = ''' INSERT INTO test.public.item_manifest
   (location_name, item_name, item_unique_id, item_address)
     VALUES
        (%(location_name_list)s, %(item_name_list)s, %(item_unique_id_list)s, %(item_address_list)s)'''  

     conn = None

     try:
         # read database configuration
         params = setparams()
         # connect to the PostgreSQL database
         conn = psycopg2.connect(**params)
         # create a new cursor
         cur = conn.cursor()
         # execute the INSERT statement
         cur.executemany(sql, vals)
         # commit the changes to the database
         conn.commit()
         # close communication with the database
         cur.close()
     except (Exception, psycopg2.DatabaseError) as error:
         print(error)
     finally:
         if conn is not None:
              conn.close()

The generated SQL looks like this:

  INSERT INTO test.public.item_manifest
(location_name, item_name, item_unique_id, item_address)
VALUES
(%(location_name)s, %(item_name)s, %(item_unique_id)s, %(item_address)s)


Here is the JSON:

      {'itemUniqueName1': ['item_ID_Number1', '<itemAddress1>', 'BooleanActive'], 'itemUniqueName2': ['item_ID_Number2', '<itemAddress2>', 'BooleanActive'], 

CodePudding user response:

A minimal example.

Create table:

create table items_test(id integer, fld_1 varchar);

Python code:

import json
import psycopg2

# Connect to the database (connection parameters are placeholders; use your own)
con = psycopg2.connect("dbname=test")
cur = con.cursor()

vals = json.loads('[{"id": 1, "fld_1": 2}, {"id": 3, "fld_1": 4}, {"id": 5, "fld_1": 6}]')

# vals
# [{'id': 1, 'fld_1': 2}, {'id': 3, 'fld_1': 4}, {'id': 5, 'fld_1': 6}]

# Each dict's keys match the %(name)s placeholders, so executemany binds one row per dict
sql = "insert into items_test(id, fld_1) values(%(id)s, %(fld_1)s)"

cur.executemany(sql, vals)
con.commit()

In psql:

select * from items_test ;
 id | fld_1 
----+-------
  1 | 2
  3 | 4
  5 | 6
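
Applied to the manifest data in the question, the same pattern could look roughly like this. This is only a sketch: the table and column names are taken from the question, the connection string is a placeholder, and the example JSON is assumed to use double quotes so json.loads accepts it. The key point is that every %(name)s placeholder must correspond to a key present in each dict passed to executemany:

import json
import psycopg2

def insert_item_manifest(location_name, item_json):
    item_dictionary = json.loads(item_json)

    # One dict per row; the keys must match the %(name)s placeholders below
    vals = [{'location_name': location_name,
             'item_name': item_name,
             'item_unique_id': item_data[0],
             'item_address': item_data[1]}
            for item_name, item_data in item_dictionary.items()]

    sql = '''INSERT INTO test.public.item_manifest
                 (location_name, item_name, item_unique_id, item_address)
             VALUES
                 (%(location_name)s, %(item_name)s, %(item_unique_id)s, %(item_address)s)'''

    con = psycopg2.connect("dbname=test")   # placeholder connection string
    cur = con.cursor()
    cur.executemany(sql, vals)
    con.commit()
    cur.close()
    con.close()

# Example call; note the double quotes inside the JSON string
insert_item_manifest(
    'location1',
    '{"itemUniqueName1": ["item_ID_Number1", "<itemAddress1>", "BooleanActive"],'
    ' "itemUniqueName2": ["item_ID_Number2", "<itemAddress2>", "BooleanActive"]}')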

