We have a table with more than 20 million records and more than 50 columns. I recently added a new column to it of type bit. After my change was done, some of the stored procedures that use this table were performing poorly. The DBA asked me to run the SP_Recompile 'tableName' command to update the table statistics. After I did that, the procedures were performing well again. Could someone please explain what happens when a table is altered and a new column is added? How does it affect performance?
CodePudding user response:
This is actually explained in the documentation.
Firstly, sys.sp_recompile N'{Table Name}'; doesn't update the statistics of the table. From the documentation:
If object is the name of a table or view, all the stored procedures, triggers, or user-defined functions that reference the table or view will be recompiled the next time that they are run.
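For example, a minimal sketch of what you would actually run (dbo.BigTable is a placeholder for the real table name; refreshing statistics is shown separately precisely because sp_recompile does not do it):

-- Mark every stored procedure, trigger, and user-defined function that
-- references the table so each one gets a fresh plan on its next execution.
EXEC sys.sp_recompile N'dbo.BigTable';

-- Statistics are a separate concern; if they are stale, refresh them explicitly.
UPDATE STATISTICS dbo.BigTable;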
Recompiling means that the query plan is regenerated the next time the query runs; the old cached plan is not used. Why you would want to do this is also discussed in the documentation:
The queries used by stored procedures, or triggers, and user-defined functions are optimized only when they are compiled. As indexes or other changes that affect statistics are made to the database, compiled stored procedures, triggers, and user-defined functions may lose efficiency. By recompiling stored procedures and triggers that act on a table, you can reoptimize the queries.
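One way to see the recompilation happen, as a sketch assuming a hypothetical procedure dbo.usp_GetOrders that reads the table: query sys.dm_exec_procedure_stats before and after. Once sp_recompile has been run and the procedure executed again, cached_time moves forward because a new plan was compiled.

SELECT OBJECT_NAME(ps.object_id) AS procedure_name,
       ps.cached_time,            -- when the current plan was compiled and cached
       ps.last_execution_time,
       ps.execution_count
FROM sys.dm_exec_procedure_stats AS ps
WHERE ps.object_id = OBJECT_ID(N'dbo.usp_GetOrders');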
When you altered the table, that can have affected the statistics. So does day-to-day usage, though, as rows are inserted, updated, and deleted. It appears that is what happened here, and the plans the procedures were using were no longer the most efficient. Forcing them to recompile means they can use the new statistics and a new (hopefully more efficient) plan.
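If you want to check whether stale statistics were the likely cause, a rough sketch like the following can help (again, dbo.BigTable is a placeholder, and it assumes a SQL Server version that exposes sys.dm_db_stats_properties). A large modification_counter relative to rows suggests the optimizer has been working from outdated information:

SELECT s.name AS stats_name,
       sp.last_updated,           -- when this statistics object was last refreshed
       sp.rows,
       sp.modification_counter    -- modifications to the leading column since the last refresh
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.BigTable');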