I have a dictionary with keys and values like:
my_dict = {'a':33, 'b': 'something', 'c': GETDATE(), 'd': 55}
Assume column names in the SQL table are also named like the keys of the dict, i.e. "a,b,c,d".
The actual dictionary is 20 key:value pairs.
Code
I have used pyodbc.connect to create a cursor, which I could use to execute an SQL INSERT statement:
for k in my_dict.keys():
    cursor.execute(
        '''
        INSERT INTO TABLEabc (%s)
        VALUES (%s)
        '''
        % (k, my_dict[k])
    )
This seems inefficient, though, because it runs a separate SQL operation for each key.
- What is the easiest way to insert the values using a loop?
- How could I write it so that it just makes one insert with all the values?
CodePudding user response:
If you're using pyodbc then this might work:
columns = {row.column_name for row in cursor.columns(table='TABLEabc')}
safe_dict = {key: val for key, val in my_dict.items() if key in columns}

# generate a parameterised query for the keys in our dict
query = "INSERT INTO TABLEabc ({columns}) VALUES ({value_placeholders})".format(
    columns=", ".join(safe_dict.keys()),
    value_placeholders=", ".join(["?"] * len(safe_dict)),
)
cursor.execute(query, list(safe_dict.values()))
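To see what that snippet actually produces, here is a minimal sketch with no database connection: the result of cursor.columns() is stood in for by a plain set, and the sample dict and column names are assumptions for illustration only.

```python
# Pretend result of cursor.columns() for an assumed table with columns a, b, c, d.
columns = {"a", "b", "c", "d"}
my_dict = {"a": 33, "b": "something", "d": 55, "x": "ignored"}

# Keep only keys that are real column names, then build the parameterised query.
safe_dict = {key: val for key, val in my_dict.items() if key in columns}
query = "INSERT INTO TABLEabc ({columns}) VALUES ({value_placeholders})".format(
    columns=", ".join(safe_dict.keys()),
    value_placeholders=", ".join(["?"] * len(safe_dict)),
)
print(query)  # INSERT INTO TABLEabc (a, b, d) VALUES (?, ?, ?)
```

Note that the unknown key "x" is silently dropped, and the values themselves never appear in the SQL text; they travel separately as parameters.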
It is intended to be safe from SQL injection because:
- we filter for only keys which are actual column names in the db
- we use pyodbc cursor execute params, so the values will be escaped properly
Where it possibly won't work:
- if any of the column names need to be quoted and escaped, this won't happen automatically so it will fail
Quoting/escaping is db-specific, so we would have to check the rules for our actual db and apply them to the dict keys that we format into the query (or find some way to get pyodbc to do that for us; I'm not sure that's possible).
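For SQL Server specifically, identifiers are quoted with square brackets and a literal "]" inside a name is escaped by doubling it (the same rule T-SQL's QUOTENAME applies). A minimal sketch of such a helper (the function name is my own, not part of pyodbc):

```python
def quote_ident_mssql(name: str) -> str:
    """Quote a SQL Server identifier with [brackets], doubling any ']' inside it."""
    return "[" + name.replace("]", "]]") + "]"

print(quote_ident_mssql("order"))  # [order]
print(quote_ident_mssql("a]b"))    # [a]]b]
```

Column names passed through a helper like this can then be formatted into the INSERT statement even if they are reserved words or contain spaces.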
CodePudding user response:
Using SQLAlchemy, this dictionary matches my table construct in SQL Server
mydict = {
    "Name": "Laura", "Form": "4B", "Year": "7", "DOB": "", "Col_5": "",
    "Col_6": "", "Col_7": "", "Col_8": ""
}
print(mydict)
and I use this to pass the keys and values (with some manipulation) as a string into my table, which is called pypupil. I also wanted to write data for all records, whether present in the dictionary or not.
fields = str(list(mydict.keys()))[1:-1]
values = str(list(mydict.values()))[1:-1]

columns = ', '.join("`" + str(x).replace('/', '_') + "`" for x in mydict.keys())
values = ', '.join("'" + str(x).replace('/', '_') + "'" for x in mydict.values())

sql = "INSERT INTO %s ( %s ) VALUES ( %s );" % ('pypupil', columns, values)
sql = sql.replace("`", "")
print(sql)
with engine.connect() as con:
    rs = con.execute(sql)
A bit more info and lots of helpers for this approach here: https://discuss.dizzycoding.com/using-a-python-dict-for-a-sql-insert-statement/
CodePudding user response:
Supplemental to JonTout's answer, if you use SQLAlchemy then your dict is ready-made to be consumed by a SQLAlchemy Core Table object:
import datetime

import sqlalchemy as sa

connection_url = sa.engine.URL.create("mssql+pyodbc", …)
engine = sa.create_engine(connection_url)
table_abc = sa.Table("TABLEabc", sa.MetaData(), autoload_with=engine)

my_dict = {'a': 33, 'b': 'something', 'c': datetime.date.today(), 'd': 55}

with engine.begin() as conn:
    conn.execute(table_abc.insert(), my_dict)
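For completeness, the original question ("one insert with all the values") can also be sketched end to end with only the standard library, using an in-memory sqlite3 database as a stand-in for the real server (the table name and columns here are assumptions):

```python
import sqlite3

my_dict = {"a": 33, "b": "something", "d": 55}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TABLEabc (a INTEGER, b TEXT, d INTEGER)")

# One parameterised INSERT covering every key/value pair in the dict.
query = "INSERT INTO TABLEabc ({}) VALUES ({})".format(
    ", ".join(my_dict.keys()),
    ", ".join("?" * len(my_dict)),
)
conn.execute(query, list(my_dict.values()))

row = conn.execute("SELECT a, b, d FROM TABLEabc").fetchone()
print(row)  # (33, 'something', 55)
```

The placeholder syntax differs by driver ("?" for pyodbc and sqlite3, "%s" for some others), but the shape of the statement is the same.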