I am trying to insert a CSV file into an Oracle database with Python. The CSV file is named PatientStatus.csv and the table name is AIW1. Below is my code:

```
import cx_Oracle as cn
import pandas as pd  # pip install pandas

df = pd.read_csv('PatientStatus.csv')
columns = ['UNITNAME', 'PATIENTSTATUS', 'PATIENTCOUNTS', 'COUNTRY', 'STATENAME', 'CITYNAME']
df_data = df[columns]
records = df_data.values.tolist()
conn = cn.connect(user="", password="",dsn="")
sql_insert = '''
INSERT INTO AIW1
VALUES (?, ?, ?, ?, ?, ?, ?,?,?)
'''
cursor = conn.cursor()
cursor.executemany(sql_insert, records)
cursor.commit();
print('Task is complete.')
cursor.close()
conn.close()
```
I am getting the following error:

```
DatabaseError                             Traceback (most recent call last)
C:\Users\SOFTWA~1.SUP\AppData\Local\Temp/ipykernel_956/3684779769.py in
     21 '''
     22 cursor = conn.cursor()
---> 23 cursor.executemany(sql_insert, records)
     24 cursor.commit();
     25 print('Task is complete.')

DatabaseError: ORA-01036: illegal variable name/number
```
Thanks in advance
CodePudding user response:
In Oracle, you need to use a different bind placeholder syntax; '?' is not a valid placeholder, which is what triggers the ORA-01036 error.
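For example, with positional binds the corrected statement could look like the sketch below. The column list is taken from your DataFrame and is only an assumption about the real AIW1 definition; note also that your original statement had nine '?' placeholders for six values per row, and the counts must match.

```
# Oracle-style positional binds (:1 ... :6) instead of "?".
# Listing the columns explicitly keeps the placeholder count in step with the data.
sql_insert = """INSERT INTO AIW1
                (UNITNAME, PATIENTSTATUS, PATIENTCOUNTS, COUNTRY, STATENAME, CITYNAME)
                VALUES (:1, :2, :3, :4, :5, :6)"""
```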
Look at the cx_Oracle documentation for an example of loading data efficiently: see Batch Statement Execution and Bulk Loading. You can adapt your data to a similar format if you still want the overhead of using pandas; a pandas-based sketch is shown after the csv example below.
```
import cx_Oracle
import csv

# Connect first (fill in your own credentials and DSN)
con = cx_Oracle.connect(user="", password="", dsn="")
cursor = con.cursor()

# Predefine the memory areas to match the table definition.
# This can improve performance by avoiding memory reallocations.
# Here, one parameter is passed for each of the columns.
# "None" is used for the ID column, since the size of NUMBER isn't
# variable. The "25" matches the maximum expected data size for the
# NAME column.
cursor.setinputsizes(None, 25)

# Adjust the number of rows to be inserted in each iteration
# to meet your memory and performance requirements
batch_size = 10000

with open('testsp.csv', 'r') as csv_file:
    csv_reader = csv.reader(csv_file, delimiter=',')
    sql = "insert into test (id, name) values (:1, :2)"
    data = []
    for line in csv_reader:
        data.append((line[0], line[1]))
        if len(data) % batch_size == 0:
            cursor.executemany(sql, data)
            data = []
    if data:
        cursor.executemany(sql, data)
    con.commit()
```
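If you prefer to stay with pandas as in your original code, a minimal sketch along the same lines might look like this. The connection details are placeholders, and the column names and their order are assumed from your DataFrame rather than from the actual AIW1 table definition:

```
import cx_Oracle
import pandas as pd

# Read the CSV and keep only the columns that exist in AIW1
df = pd.read_csv('PatientStatus.csv')
columns = ['UNITNAME', 'PATIENTSTATUS', 'PATIENTCOUNTS', 'COUNTRY', 'STATENAME', 'CITYNAME']
records = df[columns].values.tolist()

conn = cx_Oracle.connect(user="", password="", dsn="")
cursor = conn.cursor()

# One :n bind placeholder per column, matching the six values in each row
sql_insert = """INSERT INTO AIW1
                (UNITNAME, PATIENTSTATUS, PATIENTCOUNTS, COUNTRY, STATENAME, CITYNAME)
                VALUES (:1, :2, :3, :4, :5, :6)"""

cursor.executemany(sql_insert, records)
conn.commit()  # commit on the connection, not the cursor
cursor.close()
conn.close()
```

The csv-module version above streams the file in batches instead of loading it all into memory first, which is why the documentation recommends it for bulk loads; the pandas version is simpler but reads the whole file up front.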
CodePudding user response:
Please edit your question and show the full traceback error message.