Stored procedure optimization is an art mixed with technique: it needs lots of patience and an eye for detail at every step. In this article, we will provide some of the measures that help ensure a better-performing stored procedure. We have already started our journey in the previous article, where we saw some of the ways of writing better stored procedures. In this article we will look at some more tips. Along with each tip, we will also provide references that explain in detail why the measure is considered a better technique.
The Bulleted Points
Instead of creating several non-clustered indexes, try using a covering index, because it can satisfy all the columns requested by a query without performing a further lookup into the clustered index. If all the columns requested in the select list of a query are available in the index, the query engine does not have to look up the table again, which can significantly improve the performance of the query. Since all the requested columns are available within the index, the index is said to cover the query; it enhances performance by avoiding unnecessary lookups. A good read on the same is available here
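As an illustration, a covering index can be created with the INCLUDE clause. This is a minimal sketch; the Orders table and its columns are hypothetical:

```sql
-- The query below requests OrderDate and TotalDue, filtered by CustomerID.
-- The INCLUDE clause adds the non-key columns to the leaf level of the
-- index, so the query is satisfied entirely from the index (no key lookup).
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_Covering
ON dbo.Orders (CustomerID)
INCLUDE (OrderDate, TotalDue);

-- This query is now "covered": every column it touches lives in the index.
SELECT OrderDate, TotalDue
FROM dbo.Orders
WHERE CustomerID = 42;
```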
- Avoid the DISTINCT clause for limiting rows; better to use ranking functions like ROW_NUMBER. For a more detailed explanation, please read our article on the same. The article provides good insight into the various usages of the ROW_NUMBER function.
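For example, ROW_NUMBER can remove duplicates without DISTINCT. A sketch, with an illustrative Customers table:

```sql
-- Each (FirstName, LastName) group is numbered; keeping rn = 1 retains
-- exactly one row per group instead of relying on DISTINCT.
;WITH Numbered AS
(
    SELECT FirstName,
           LastName,
           ROW_NUMBER() OVER (PARTITION BY FirstName, LastName
                              ORDER BY (SELECT NULL)) AS rn
    FROM dbo.Customers
)
SELECT FirstName, LastName
FROM Numbered
WHERE rn = 1;
```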
Try to keep transactions small, since a transaction locks the tables it processes and may sometimes also cause deadlocks.
For pagination, we can use the OFFSET/FETCH clause introduced in SQL Server 2012 (code name: Denali). For a more detailed explanation, please read our article on the same. The article not only talks about the usage of OFFSET/FETCH but also shows why it is better than the other available approaches.
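A minimal pagination sketch (the Products table is hypothetical): page 3 with a page size of 20 skips 40 rows and returns the next 20.

```sql
DECLARE @PageNumber INT = 3,
        @PageSize   INT = 20;

SELECT ProductID, ProductName
FROM dbo.Products
ORDER BY ProductID                        -- OFFSET/FETCH requires ORDER BY
OFFSET (@PageNumber - 1) * @PageSize ROWS
FETCH NEXT @PageSize ROWS ONLY;
```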
Use index-forcing query hints judiciously.
Instead of writing

SELECT Col1, Col2, Col3, COUNT(Col4), SUM(Col5)
FROM <YourTable>
GROUP BY Col1, Col2, Col3

we can write

SELECT Col1, Col2, Col3,
       COUNT(Col4) OVER (PARTITION BY <Your partition by Columns>),
       SUM(Col5) OVER (PARTITION BY <Your partition by Columns>)
FROM <YourTable>

Here we are treating COUNT/SUM as analytic (window) functions instead of aggregate functions. The disadvantage of the first approach is that we need to group by all the other non-aggregate columns, which is sometimes not desirable and can yield unwanted results. The second approach is better because we can choose the partitioning columns on a case-by-case basis.
Avoid using UNPIVOT

SELECT ID, MAX(COL) AS Max_Value
FROM
(
    SELECT ID, COL
    FROM <YourTable>
    UNPIVOT (COL FOR COLS IN (COL1, COL2, COL3)) AS UNPIVOTT
) AS P
GROUP BY ID

and use the VALUES clause instead

SELECT ID,
(
    SELECT MAX(N)
    FROM (VALUES (COL1), (COL2), (COL3)) AS T(N)
) AS Max_Value
FROM <YourTable>

because the former yields a correlated LEFT JOIN in the query plan. A more detailed explanation of the same can be obtained from here.
If possible, avoid using triggers for writing business logic.
Instead of creating very complicated logic that may not be easy to optimize at the T-SQL level, we can always take advantage of CLR functions. For a more detailed explanation of how we can create our own CLR functions, please read our article on the same. An ideal situation for a custom CLR function might be parsing JSON objects; a nice article detailing the same is available here.
For running dynamic queries, avoid EXEC and adopt sp_executesql. sp_executesql is better because:
- It mitigates the risk of SQL injection
- For dynamic SQL too, the execution plan gets reused when using sp_executesql
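A sketch of a parameterized dynamic query (table and column names are illustrative). The parameter is passed separately, so the plan can be reused across different values and user input is never concatenated into the SQL string:

```sql
DECLARE @sql NVARCHAR(MAX) =
    N'SELECT ProductID, ProductName
      FROM dbo.Products
      WHERE ListPrice >= @MinPrice;';

EXEC sys.sp_executesql
     @sql,
     N'@MinPrice MONEY',       -- parameter definition
     @MinPrice = 100;          -- parameter value
```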
For slow-running queries, we may check the fragmentation statistics of the indexes involved. Over a period of time, indexes may become fragmented. We can identify fragmented indexes by querying the sys.dm_db_index_physical_stats DMV; once identified, we need to rebuild them. Look into this article to get a good idea about the same.
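A sketch for spotting fragmented indexes via sys.dm_db_index_physical_stats. The thresholds are the commonly cited rules of thumb (reorganize between roughly 5% and 30% fragmentation, rebuild above 30%):

```sql
SELECT OBJECT_NAME(ips.object_id)            AS TableName,
       i.name                                AS IndexName,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 5
  AND i.name IS NOT NULL
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Rebuild a heavily fragmented index (the index name is illustrative):
-- ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REBUILD;
```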
Instead of the LIKE operator, we can use PATINDEX. Here is an article that compares the performance of the two.
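PATINDEX returns the 1-based position of the first match (0 if none), so it can play the role of LIKE in a filter. A quick sketch against a hypothetical Products table:

```sql
-- With LIKE:
SELECT ProductName
FROM dbo.Products
WHERE ProductName LIKE '%chain%';

-- With PATINDEX:
SELECT ProductName
FROM dbo.Products
WHERE PATINDEX('%chain%', ProductName) > 0;
```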
Having said all of the above, it is now time to look at query behaviour and find out the expensive queries.
To start measuring the queries, run the following statements in SQL Server Management Studio:
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
STATISTICS IO: Turning on STATISTICS IO will print information about how SQL Server interacts with storage when we run queries. The most useful numbers we get are the scan count, the logical reads (8-KB pages read from the buffer cache), and the physical reads (pages read from disk).
STATISTICS TIME: Turning on STATISTICS TIME will print messages explaining how much CPU time and elapsed time were used when executing the query. A more detailed explanation can be found here
Also observe that the query execution plan will reveal the query behaviour and may also suggest hints (such as missing indexes) upon which we can work.
Role of Dynamic Management Views (DMVs)
At times we need to find out which queries are taking the most time to execute. For that, we can take the help of DMVs; using them, we can find the most expensive queries.
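As a sketch, the following query uses the sys.dm_exec_query_stats and sys.dm_exec_sql_text DMVs to list the top queries in the plan cache by total CPU time:

```sql
SELECT TOP (10)
       qs.total_worker_time / 1000          AS total_cpu_ms,
       qs.execution_count,
       qs.total_logical_reads,
       SUBSTRING(st.text,
                 (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```

Ordering by total_logical_reads or total_elapsed_time instead surfaces the most IO-heavy or slowest queries, respectively.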
Some tools worth mentioning at this juncture are
SQL Server Profiler: a rich interface to create and manage traces and to analyze and replay trace results. The events are saved in a trace file that can later be analyzed or used to replay a specific series of steps when trying to diagnose a problem. An article worth going through at this point is available here.
Database Engine Tuning Advisor: a tool that helps you figure out whether additional indexes, as well as partitioning, would be helpful.
SQLQueryStress: a free tool useful for comparing before-and-after scenarios of stored procedures. With SQLQueryStress we can execute each stored procedure as many times as we like and get the total and average statistics for all executions.
- Top 10 CPU, Disk IO, Memory consuming queries
- Performance Tuning with SQL Server Dynamic Management Views
N.B.: Having said all this, nothing beats real-world experience, which comes over a period of time with passion, practice, an eye for detail, and patience.
Though we are concluding our series "The Art of Stored Procedure Optimization" here, stored procedure optimization is a never-ending process. To conclude, we must quote:
" I know not what I appear to the world, but to myself I seem to have been only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell, whilest the great ocean of truth lay all undiscovered before me. " - Isaac Newton