I have some SQL that runs quickly on its own, but it is 60 times slower when the same SQL is inside a stored procedure. The SQL uses temp tables in many of its queries. Why is it slower in a stored procedure?
CodePudding user response:
Sybase recommends creating the temp tables outside of the stored procedure, so they already exist when the procedure is compiled. There are also workarounds that involve creating indexes on the temp tables in particular ways. See the Sybase docs for the specifics.
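A rough sketch of the first suggestion, assuming Sybase-style T-SQL (the table, column, and procedure names here are hypothetical): create the temp table in the calling batch so it exists before the procedure is created and executed, and have the procedure only populate and read it.

-- Caller creates the temp table up front
create table #order_work
(
    order_id  int,
    order_qty int
)
go

-- Procedure uses the caller's #order_work instead of creating its own
create procedure usp_process_orders
as
begin
    insert into #order_work (order_id, order_qty)
    select order_id, order_qty
    from orders_detail

    select order_id, sum(order_qty) as total_qty
    from #order_work
    group by order_id
end
go

exec usp_process_orders
go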
CodePudding user response:
Does your stored procedure have parameters?
If yes, SQL Server uses a process called parameter sniffing when it executes stored procedures that have parameters. When the procedure is compiled or recompiled, the value passed into the parameter is evaluated and used to create an execution plan. That value is then stored with the execution plan in the plan cache. On subsequent executions, that same value – and same plan – is used.
What happens when the values in a table you’re querying aren’t evenly distributed? What if one value would return 10 rows and another value would return 10,000 rows, or 10 million rows? What will happen is that the first time the procedure is run and the plan is compiled, whatever value is passed in is stored with the plan. Every time it’s executed, until it’s recompiled, the same value and plan will be used – regardless of whether it is the fastest or best plan for that value.
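As an illustration (a sketch only; the procedure name and ProductID values are made up, and the data skew is assumed), the plan compiled on the first call is the one that gets reused:

-- Hypothetical procedure with the same shape as the example further down
CREATE PROCEDURE dbo.GetOrdersByProduct
    @ProductID INT
AS
SELECT OrderID, OrderQty
FROM SalesOrderDetail
WHERE ProductID = @ProductID;
GO

-- First execution: the plan is compiled and cached for this value (returns few rows)
EXEC dbo.GetOrdersByProduct @ProductID = 897;

-- Subsequent executions reuse that cached plan,
-- even if this value returns millions of rows
EXEC dbo.GetOrdersByProduct @ProductID = 870;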
You can force SQL Server to recompile the stored procedure each time it is run. The benefit here is that the best query plan will be created each time it is run. However, recompiling is a CPU-intensive operation. This may not be an ideal solution for stored procedures that are run frequently, or on a server that is constrained by CPU resources already.
ALTER PROCEDURE SP_Test
@ProductID INT
WITH RECOMPILE
AS
SELECT OrderID, OrderQty
FROM SalesOrderDetail
WHERE ProductID = @ProductID
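If you then run the procedure (the ProductID values here are arbitrary examples), each call compiles a fresh plan for the value it is given instead of reusing the first cached plan:

EXEC SP_Test @ProductID = 897;
EXEC SP_Test @ProductID = 870;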