Table gets locked when calling a SQL Server stored procedure from pyodbc (Python)
I have a table called Test that I made for testing; I just want to see whether my code works. My Python code is very simple and calls only one stored procedure.
Here is my SQL Server SP script:
ALTER PROCEDURE [dbo].[TestService]
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @C1 int;
    BEGIN TRANSACTION;
    INSERT INTO Test (a1) VALUES ('Service');
    COMMIT TRANSACTION;
    SELECT @C1 = COUNT(*) FROM Test (NOLOCK);
    SELECT GETDATE(), @C1 AS t;
END
and my Python code is this:
import pyodbc
import pandas as pd

ConnectionString = "DRIVER={SQL Server};Server=Serv;Database=DB;User Id=user;Password=*****;"
conn = pyodbc.connect(ConnectionString)
df_results = pd.read_sql("EXEC TestService", conn)
print(df_results)
Before running the Python code I ran this SELECT:
SELECT * FROM Test (NoLock)
and the output was 4 records.
Then I ran the Python script and got this output:
0 2021-12-19 00:09:36.887 5
which means the record was inserted and the total number of records is 5.
But when I run
SELECT * FROM Test (NoLock)
I still get only 4 records,
and when I try
SELECT * FROM Test
I get a timeout.
How can I fix that?
CodePudding user response:
read_sql
won't commit the transaction. pyodbc opens connections with autocommit off, so your EXEC runs inside an implicit outer transaction; the COMMIT inside the procedure only decrements @@TRANCOUNT, and the locks taken by the INSERT are held until that outer transaction ends — which is why a plain SELECT on Test blocks, while the NOLOCK reads (and the count inside the procedure's own session) still see the new row. You need to commit it explicitly, or open the connection with pyodbc.connect(ConnectionString, autocommit=True).
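You can watch the nested-commit behavior directly in SSMS — a sketch, assuming a throwaway Test table like yours:

```sql
BEGIN TRANSACTION;                        -- outer, like pyodbc's implicit transaction
SELECT @@TRANCOUNT;                       -- 1
BEGIN TRANSACTION;                        -- inner, like the one inside the procedure
INSERT INTO Test (a1) VALUES ('Service');
SELECT @@TRANCOUNT;                       -- 2
COMMIT TRANSACTION;                       -- inner COMMIT only decrements the counter
SELECT @@TRANCOUNT;                       -- 1: locks still held, insert not durable
COMMIT TRANSACTION;                       -- outer COMMIT actually releases the locks
```

Only the outermost COMMIT makes the insert durable; an inner COMMIT with @@TRANCOUNT above 1 just decrements the counter.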
conn = pyodbc.connect(ConnectionString)
df_results = pd.read_sql("EXEC TestService", conn)
print(df_results)
conn.commit()  # ends the implicit outer transaction and releases the locks
conn.close()
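The effect is easy to reproduce outside SQL Server. Here is a minimal sketch using Python's built-in sqlite3 purely as a stand-in (the table name mirrors yours; everything else is illustrative): a second connection cannot see a row the first connection inserted until the first connection commits.

```python
import os
import sqlite3
import tempfile

# File-backed database so two independent connections share the same data.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE test (a1 TEXT)")
writer.commit()

# Insert on the first connection, but do NOT commit yet.
writer.execute("INSERT INTO test (a1) VALUES ('Service')")

reader = sqlite3.connect(path)
before = reader.execute("SELECT COUNT(*) FROM test").fetchall()[0][0]

writer.commit()  # now the insert becomes visible to other connections

after = reader.execute("SELECT COUNT(*) FROM test").fetchall()[0][0]
print(before, after)  # 0 1
```

The reader counts 0 rows while the writer's transaction is open and 1 row after the commit — the same reason your SELECT saw 4 records until the Python connection committed.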