The Art of Stored Procedure Optimization - Last but not the least

Rajnilari2015
Posted in the Sql Server category, for Beginner level

Stored procedure optimization is an art mixed with technique; it needs a great deal of patience and an eye for detail at every step. In this series of articles, we provide some of the measures that help ensure a better-performing stored procedure.

Introduction

Stored procedure optimization is an art mixed with technique; it needs a great deal of patience and an eye for detail at every step. In this article, we provide some of the measures that help ensure a better-performing stored procedure. We began our journey in the previous article, where we looked at some of the ways of writing better stored procedures. In this article we will look into some more tips along the same lines. Along with each tip, we also provide references that explain in detail why the measure is considered a better technique.

The Bulleted Points

  1. Instead of creating several non-clustered indexes, try using a covering index, because it can satisfy all the columns requested by a query without performing a further lookup into the clustered index. If every column in the query's select list is available in the index, the query engine does not have to look up the table again, which can significantly improve the performance of the query. Since all the requested columns are available within the index, the index is said to cover the query; this enhances performance by avoiding unnecessary lookups. A good read on the subject is available here.
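    As a minimal sketch (the table and column names here are hypothetical), a covering index seeks on the filter column and carries the other selected columns in its INCLUDE list:

```sql
-- Hypothetical example: the query below filters on CustomerID and
-- selects OrderDate and TotalDue. The covering index stores both
-- extra columns at its leaf level, so no key lookup into the
-- clustered index is needed.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_Covering
ON dbo.Orders (CustomerID)
INCLUDE (OrderDate, TotalDue);

SELECT CustomerID, OrderDate, TotalDue
FROM dbo.Orders
WHERE CustomerID = 42;  -- satisfied entirely from the index
```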

  2. Avoid the DISTINCT clause for limiting rows; prefer ranking functions such as ROW_NUMBER. For a more detailed explanation, please read our article on the same, which provides a good insight into the various uses of the ROW_NUMBER function.
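    A small sketch of the idea (table and column names are hypothetical): keep exactly one row per key instead of relying on DISTINCT.

```sql
-- Keep the latest order per customer using ROW_NUMBER
-- rather than DISTINCT.
;WITH Ranked AS
(
    SELECT CustomerID, OrderDate,
           ROW_NUMBER() OVER (PARTITION BY CustomerID
                              ORDER BY OrderDate DESC) AS rn
    FROM dbo.Orders
)
SELECT CustomerID, OrderDate
FROM Ranked
WHERE rn = 1;
```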
  3. Try to keep transactions small, since a long-running transaction holds locks on the tables it touches and may sometimes also cause deadlocks.

  4. For pagination, we can use the OFFSET/FETCH clause introduced in SQL Server 2012 (code name: Denali). For a more detailed explanation, please read our article on the same. The article not only shows the usage of OFFSET/FETCH but also proves why it is better than the other available approaches.
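    A minimal paging sketch (table and column names are hypothetical):

```sql
-- Page 3 with a page size of 20: skip 40 rows, take the next 20.
-- Note that ORDER BY is mandatory when using OFFSET/FETCH.
SELECT OrderID, CustomerID, OrderDate
FROM dbo.Orders
ORDER BY OrderID
OFFSET 40 ROWS
FETCH NEXT 20 ROWS ONLY;
```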

  5. Use force-index query hints judiciously.

  6. Instead of writing

    Select 
    	Col1,Col2,Col3,
    	Count(Col4),
    	Sum(Col5)
    From Table
    Group By Col1,Col2,Col3	
    		

    Write

    Select 
    	Col1,Col2,Col3,
    	COUNT(Col4) OVER (PARTITION BY <Your partition by Columns>) AS Col4Count,
    	SUM(Col5) OVER (PARTITION BY <Your partition by Columns>) AS Col5Sum
    From Table
    		

    Here we are treating COUNT/SUM as analytic (window) functions instead of aggregate functions.

    The disadvantage of the first approach is that we need to group by all the other non-aggregated columns, which is sometimes not desirable and can yield wrong or unwanted results. The second approach is better because we can choose the partitioning columns ourselves, on a case-by-case basis.

  7. Avoid using UNPIVOT

     
    SELECT 
    ID, MAX(COL)  AS Max_Value 
    FROM
    ( 
    	SELECT ID, COL 
    	FROM @t 
    	UNPIVOT (COL FOR COLS IN (COL1,COL2,COL3)) AS UNPIVOTT
    ) AS P 
    GROUP BY ID
    			

    and use the VALUES clause instead

    SELECT ID,
    (
    	SELECT MAX(n) FROM 
    	(
    	   VALUES(COL1),(COL2),(COL3)
    	) AS T(N)
    )  AS Max_Value 
    FROM @t
    			

    because the former compiles to a correlated LEFT JOIN under the hood. A more detailed explanation of the same can be obtained here.

  8. If possible, avoid using triggers for business logic.

  9. Instead of writing very complicated logic that may not optimize well at the T-SQL level, we can take advantage of CLR functions. For a more detailed explanation of how to create our own CLR functions, please read our article on the same. An ideal situation for a custom CLR function might be parsing JSON objects. A nice article detailing the same is available here.

  10. For running dynamic queries, avoid EXEC and adopt sp_executesql. sp_executesql is better because:

    1. It mitigates the risk of SQL injection through parameterization
    2. Its execution plans get re-used, even for dynamic SQL
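    A small sketch of parameterized dynamic SQL (the table and parameter names are hypothetical):

```sql
-- The parameter value is passed separately, never concatenated
-- into the statement, so user input cannot inject SQL, and the
-- same plan can be re-used across different @CustID values.
DECLARE @sql nvarchar(max) =
    N'SELECT OrderID, OrderDate
      FROM dbo.Orders
      WHERE CustomerID = @CustID;';

EXEC sys.sp_executesql
     @sql,
     N'@CustID int',
     @CustID = 42;
```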
  11. For slow-running queries, we can check the fragmentation statistics of the underlying indexes. Over time, indexes become fragmented; we can find them by querying sys.dm_db_index_physical_stats. Once identified, we need to rebuild them. Look into this article to get a nice idea about the same.
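    As a sketch, the following query lists indexes in the current database with more than 30% fragmentation (the 30% threshold and the rebuild target at the end are illustrative choices, not rules from this article):

```sql
SELECT  OBJECT_NAME(ips.object_id)       AS TableName,
        i.name                           AS IndexName,
        ips.avg_fragmentation_in_percent
FROM    sys.dm_db_index_physical_stats
            (DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN    sys.indexes AS i
        ON  i.object_id = ips.object_id
        AND i.index_id  = ips.index_id
WHERE   ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Rebuild a heavily fragmented index (hypothetical name):
-- ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REBUILD;
```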

  12. Instead of the LIKE operator, we can use PATINDEX. Here is an article that compares the performance of the two.
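    A quick sketch of the two equivalent predicates (table and column names are hypothetical):

```sql
-- Both return rows whose name contains 'ball'; PATINDEX returns
-- the 1-based position of the first match, or 0 if there is none.
SELECT ProductName FROM dbo.Products
WHERE ProductName LIKE '%ball%';

SELECT ProductName FROM dbo.Products
WHERE PATINDEX('%ball%', ProductName) > 0;
```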

Having said all of the above, it is now time to look at query behaviour and find out the expensive ones.

To start measuring the queries, run the following statements in SQL Server Management Studio:

	SET STATISTICS IO ON;
	SET STATISTICS TIME ON;
	GO

STATISTICS IO : Turning on STATISTICS IO prints information about how SQL Server interacts with storage when we run queries. The most useful numbers we get are:

	Logical Reads: the number of 8 KB pages read from the buffer cache
	Physical Reads: the number of pages read from disk
	Read-Ahead Reads: the number of pages pre-fetched from disk into the cache

STATISTICS TIME : Turning on STATISTICS TIME prints messages explaining how much CPU time was used when executing the query. A more detailed explanation can be found here.

Also observe that the query execution plan reveals the query behaviour and suggests hints that we can act upon.

Role of Dynamic Management Views(DMV)

At times we need to find out which queries are taking the longest to execute. For that, we can take the help of DMVs.

We can find the most expensive queries by using the DMVs

sys.dm_exec_query_stats and sys.dm_exec_query_plan
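For example, the following sketch returns the top 10 queries by total CPU time, joining the query stats DMV to the statement text and plan (the TOP 10 cut-off is an illustrative choice):

```sql
SELECT TOP (10)
       qs.total_worker_time / 1000 AS TotalCpuMs,
       qs.execution_count,
       st.text                     AS QueryText,
       qp.query_plan
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)    AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.total_worker_time DESC;
```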

Some tools worth mentioning at this juncture are

SQL Server Profiler : a rich interface to create and manage traces and to analyze and replay trace results. The events are saved in a trace file that can later be analyzed, or used to replay a specific series of steps when diagnosing a problem. An article worth going through at this point is available here.

Database Engine Tuning Advisor : a tool that helps you figure out whether additional indexes, or partitioning, would be helpful.

SQLQueryStress : a free tool for comparing before-and-after scenarios of stored procedures. With SQLQueryStress we can execute each stored procedure as many times as we like and get the average statistics over all executions.

References

  1. Top 10 CPU, Disk IO, Memory consuming queries
  2. Performance Tuning with SQL Server Dynamic Management Views

N.B.~ Having said all that, nothing beats real-world experience, which comes over time with passion, practice, patience and an eye for detail.

Conclusion

Though we are concluding our series on "The Art of Stored Procedure Optimization" here, stored procedure optimization is a never-ending process. To conclude, we must quote:

" I know not what I appear to the world, but to myself I seem to have been only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell, whilest the great ocean of truth lay all undiscovered before me. " - Isaac Newton

About the Author

Rajnilari2015
Full Name: Niladri Biswas (RNA Team)
Member Level: Platinum
Member Status: Member,MVP
Member Since: 3/17/2015 2:41:06 AM
Country: India
-- Thanks & Regards, RNA Team



Comments or Responses

Posted by: Sheonarayan on: 3/16/2016 | Points: 25
Great article Rajnilari2015, thanks for sharing.
Posted by: Rajnilari2015 on: 3/16/2016 | Points: 25
@Sheonarayan Sir, thanks a lot
