Monday, March 30, 2015

My Report takes a long time to load, where should I check initially?

If you are a DBA or DBE and reports have been deployed in the production environment, questions like the one above are very common. Typical complaints are something like "The report has been working fine, but suddenly it takes a long time for the same data load", or "My report usually takes 2-3 minutes, now it takes more than 5 minutes". Being a DBA or DBE, what should we check? CPU usage? Memory consumption? Or the query behind the report?

True, we have to check all the listed areas, but most of the time, if it is an unexpected and unusual slowness, I think we should check for blocking too, perhaps as the first thing right after CPU and memory usage. Why should we check it?

Generally, when something is being updated, SQL Server places exclusive locks on the objects involved. A lock can be held on a row, a page, an extent, or the entire table. If the dataset used by the report is locked by an update process, the report query has to wait, unless a lock timeout has been set (in which case it throws an error and stops waiting) or a different Isolation Level is in use. Therefore it is always better to check whether something has blocked our code before starting to analyze the code itself.
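
For example, a session can control how long it waits for locks before giving up. Here is a small sketch; the 5-second value is just an illustration:

SET LOCK_TIMEOUT 5000;  -- wait a maximum of 5 seconds for locks;
                        -- if exceeded, error 1222 (lock request time out)
                        -- is raised instead of waiting indefinitely

SELECT * FROM Production.Product;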

How do we check it? The simplest way is Activity Monitor. It indicates blocking, and the query that is doing the blocking can be viewed too. If you need more detail, a set of TSQL statements can be used instead.

Let's see how we can quickly check something like that. Open a connection and execute the following code.

USE AdventureWorks2014;
GO

BEGIN TRAN
 UPDATE Production.Product
  SET Color = ''
 WHERE ProductID = 1;

Now open a different connection and execute the code below. It tries to access the same table that is being modified by the first connection, hence you should not see a result from it until the transaction in the first connection is either committed or rolled back.

USE AdventureWorks2014;
GO

SELECT * FROM Production.Product;
GO

Now let's open Activity Monitor and see. It can be opened by right-clicking the server instance and selecting Activity Monitor. Here is the screen.


As you see, the second query is running under session ID 57, and it is being blocked by session 56. If you see something like this, it means your report is blocked by some other process. Activity Monitor allows us to see the code related to the other process, in this case the code related to session 56. All we have to do is right-click on 56 and click Details.


Activity Monitor also allows us to kill the blocking process, but we should not do so unless we are 100% sure that it will not disturb anyone or damage the dataset. The good thing is, this shows the exact cause of the slowness and we can act on it in an appropriate manner. If you need more detail than Activity Monitor offers, here is a set of TSQL statements for seeing the blocked code.

SELECT r.session_id, r.status, r.blocking_session_id,
 r.command, r.wait_type, r.wait_time, t.text BlockedCode
FROM sys.dm_exec_requests AS r
 CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id > 0;
GO


And here is the way of seeing the blocking code.

SELECT t.text BlockingCode
FROM sys.dm_exec_connections AS c
 CROSS APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS t
WHERE c.session_id = (SELECT DISTINCT blocking_session_id
   FROM sys.dm_exec_requests AS r
   WHERE NOT EXISTS (SELECT 1
    FROM sys.dm_exec_requests r2
    WHERE r.blocking_session_id = r2.session_id
     AND r2.blocking_session_id > 0)
    AND r.blocking_session_id > 0);


If you need to kill a process using TSQL, the KILL command along with the session ID can be used.
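
For example (assuming, as in the screens above, that session 56 is the blocker):

-- Terminate the blocking session; use with care
KILL 56;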



Sunday, March 29, 2015

SQL Server Brain Basher of the Week #005

Can I use sp_recompile with tables?

What would be your answer? Yes or no?

The answer should be Yes, but it does not mean that we can recompile tables using this procedure.

Generally we use sp_recompile for invalidating existing plans in the cache for stored procedures, triggers, or user-defined functions; plans will be regenerated with the next run. It requires one parameter: the name of the procedure, user-defined function, or trigger. Remember, this does not recompile procedures, triggers, and user-defined functions; it just marks existing plans as invalid. A new plan will be created when we use the object again.

This also allows us to pass a table name or view name as the parameter value. If the parameter value is a table or view name, it invalidates all existing plans of procedures, user-defined functions, and triggers that reference the table or view. Similarly, new plans will be generated at the next run of the referencing objects.

Let's test this. Execute the first statement a few times and then execute the next statement. The first statement executes a stored procedure in the AdventureWorks2014 database and the second statement checks for Compiled Plans in the cache.
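
The definition of dbo.GetFrieghts is not shown in this post; a hypothetical version, consistent with how it is used here (one year parameter and a reference to Sales.SalesOrderHeader, as mentioned below), could look like this:

-- Hypothetical definition, for illustration only
CREATE PROCEDURE dbo.GetFrieghts @Year int
AS
BEGIN
 -- Returns freight values for the given order year
 SELECT SalesOrderID, OrderDate, Freight
 FROM Sales.SalesOrderHeader
 WHERE YEAR(OrderDate) = @Year;
END;
GO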

-- Cleaning buffer
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
GO

USE AdventureWorks2014;
GO

-- First statement
-- Execute this three times
EXEC dbo.GetFrieghts 2011

-- Second statement
-- We should see 3 counts on compiled plan
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE
 TEXT LIKE '%GetFrieghts%';


Now, let's call sp_recompile for the table Sales.SalesOrderHeader, which is referenced by the dbo.GetFrieghts procedure.

EXEC sp_recompile N'Sales.SalesOrderHeader';

Once done, execute the stored procedure once and see whether the engine uses the same plan or a new plan.

-- Execute again - separately
EXEC dbo.GetFrieghts 2011

-- Check compiled plans
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE
 TEXT LIKE '%GetFrieghts%';


Surprised? Has the plan been invalidated and a new plan been created? It looks like no, the old plan has been used again. Though the result looks like that, the reality is different. It has recompiled the stored procedure, but not fully; only the statements that reference the table. Let's test it.

Execute the first code set again. Once executed, we have a compiled plan in the cache and it has been hit 3 times. Next, start the Profiler and add the SP:CacheHit and SP:Recompile events.


Run the Profiler. Execute the second code set that calls sp_recompile with the table, and execute the stored procedure again. Once executed, check the Profiler. This is what you should see.


As you see, the two statements that reference the table have been recompiled. Since the entire stored procedure has not been recompiled, the use count of the old compiled plan keeps increasing; but remember, parts of the procedure have been recompiled.

Thursday, March 26, 2015

My Application needs a new sequence number for the table without inserting the record

This is something I discussed with one of my colleagues. It was all about obtaining the next sequence number of an Identity-type column without inserting the record, using it for other entities inside the application, and later submitting everything to the database for insertion. However, if this is the requirement, it is not possible with Identity, because the next value cannot be obtained from an Identity column without inserting a record. This issue can often be seen with Entity Framework enabled applications.

What would be the best way of handling this? The traditional way is maintaining another table that holds key values for tables and using it for issuing the next available values. It needs additional coding and extra effort, and it introduces unexpected locking issues too. The modern and best way of handling this is using the SEQUENCE object, which was introduced with SQL Server 2012.

The SEQUENCE is a user-defined schema-bound object that generates sequence numbers based on the settings specified. Multiple properties can be set, such as the starting number, the increment, and the order. Unlike Identity, a defined SEQUENCE object can be used with any table because there is no link between a SEQUENCE object and a table. Although one SEQUENCE object can be used with all tables, it is advisable to create multiple SEQUENCE objects, one per table, because it makes maintenance easy and minimizes gaps between the numbers added to tables.

Here is the way of creating a SEQUENCE object.

CREATE SEQUENCE Sales.OrderSequence
    START WITH 1
    INCREMENT BY 1;
GO

Once created, the application can request the next number, which can be used as the key of the table, by calling the command below;

SELECT NEXT VALUE FOR Sales.OrderSequence;
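
The sequence can also be attached to a table so that values are generated automatically at insert time. Here is a sketch; the Sales.Orders table and constraint names are assumed for illustration:

CREATE TABLE Sales.Orders
(
 OrderId int NOT NULL
  CONSTRAINT PK_Orders PRIMARY KEY
  CONSTRAINT DF_Orders_OrderId DEFAULT (NEXT VALUE FOR Sales.OrderSequence),
 OrderDate date NOT NULL
);

-- OrderId is generated from the sequence automatically
INSERT INTO Sales.Orders (OrderDate) VALUES ('2015-03-26');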

For more info on SEQUENCE object, refer: https://msdn.microsoft.com/en-us/library/ff878091.aspx



Tuesday, March 24, 2015

I declared an nvarchar(max) variable but it holds only 4000 characters

Although getting questions from SQL Server enthusiasts keeps me busy, it brushes up my knowledge significantly. This is one of the questions I got today.

If a TSQL statement has to be built dynamically, the most common approach for forming the statement is declaring a variable with either the varchar(max) or nvarchar(max) data type and appending the necessary string values. If you are a person who always maintains strict discipline on coding format, you will form the statement properly, but if you are a bit lazy and form the statement without considering the format much, you may face the issue mentioned in the title.

Let me explain how this can happen. Have a look at the code below.

DECLARE @Sql nvarchar(max)

SET @Sql = N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters'

SELECT LEN(@Sql);
-- Result - 5111

If you execute the code, you get the result 5111. It shows that all characters were added to the @Sql variable. Let's inject an additional string between these lines (that is how we form dynamic SQL statements).

DECLARE @Sql nvarchar(max)

SET @Sql = N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, '

   + N' ' +

   N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters'

SELECT LEN(@Sql);
-- Result - 4000

If you execute the code now, the result is not what we expected; it shows the count as 4000 instead of 5112. Obviously, the reason is truncation, but how it happened is the question.

The reason is entirely based on the logic used for string concatenation. In our second code, we concatenate three Unicode literals. When literals or expressions are concatenated and each piece is shorter than 4000 characters (8000 for non-Unicode), each is treated as nvarchar(n), where n is the number of characters. Only if a value contains more than 4000 characters is it treated as nvarchar(max). Considering this logic, all three of our literals are treated as nvarchar(n), hence the concatenation is something like below;

@SQL = nvarchar(n) + nvarchar(n) + nvarchar(n)

Since no literal is treated as nvarchar(max), the result is limited to 4000 characters, and the rest is truncated.
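
Here is a minimal way of seeing the same rule, using REPLICATE (a sketch; the lengths are arbitrary):

-- Both operands are plain nvarchar, so the result is capped at 4000 characters
SELECT LEN(REPLICATE(N'x', 3000) + REPLICATE(N'x', 3000));
-- Returns 4000

-- Converting one operand to nvarchar(max) makes the whole expression nvarchar(max)
SELECT LEN(CONVERT(nvarchar(max), REPLICATE(N'x', 3000)) + REPLICATE(N'x', 3000));
-- Returns 6000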

How do we overcome this situation? There are two easy ways of writing this properly. One is concatenating the strings using separate SET statements (SET @Sql += ...). This makes sure that each literal is concatenated with the nvarchar(max) variable, hence the result is nvarchar(max). The second way is converting each literal explicitly to nvarchar(max) before concatenating. Here is the sample code.

DECLARE @Sql nvarchar(max)

SET @Sql = N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, '

SET @Sql += N' '

SET @Sql += N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters'

SELECT LEN(@Sql);
-- Result - 5107
GO

DECLARE @Sql nvarchar(max)

SET @Sql = CONVERT(nvarchar(max), N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, ')

   + CONVERT(nvarchar(max), N' ') +

   CONVERT(nvarchar(max), N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters')

SELECT LEN(@Sql);
-- Result - 5107

Now you know the reason for this, so do not make this mistake when forming statements dynamically.

Sunday, March 22, 2015

SQL Server Brain Basher of the Week #004

Here is another tricky question. It is on indexing.

Can we create an index without mentioning which column(s) should be used?

Obviously, we would expect the answer to be No, because an index requires one or more columns to organize it: the index key. But this is possible with SQL Server 2014 :).

How is this possible? Earlier we had two types of indexes: Clustered and Non-Clustered. With SQL Server 2012, a new classification was introduced: Rowstore indexes and Columnstore indexes. A rowstore index represents the indexes we have always created on rows (or records), and a columnstore index represents a new way of creating indexes on columns. Only non-clustered columnstore indexes were supported in SQL Server 2012, and they require column(s) to be specified.

Though 2012 does not support creating clustered columnstore indexes, SQL Server 2014 allows creating clustered columnstore indexes on tables. Once a table has a clustered columnstore index, no other indexes are possible on the same table. A clustered columnstore index does NOT require a column (or columns) to be specified when creating it, hence we can create an index without specifying columns. So, the answer to the above question is Yes.

Here is an example for Clustered Columnstore Index.

CREATE CLUSTERED COLUMNSTORE INDEX IX_Sales
 ON Sales;



Saturday, March 21, 2015

How to get system/error messages with severity level lower than 19 logged to Application Log

Have you noticed that SQL Server does not send all messages to the Application Log? With the default setting, generally messages with severity levels between 19 and 25 are automatically written, but not all messages. The best way to check which messages get logged is the sys.messages catalog view. It holds all messages, and the column is_event_logged indicates whether a message will be logged or not.

Now, what if you need to get a message logged that is not logged automatically? A good example is the message related to deadlocks. If you check this message, you will see that its is_event_logged is not set to 1.

SELECT * FROM sys.messages
WHERE language_id = 1033 AND severity = 13;


If required, logging can be enabled for a message using the sp_altermessage stored procedure. Here is the way of doing it.

EXECUTE sp_altermessage 1205, 'WITH_LOG', 'True';

Now it is enabled. If you run the first code again, the result will look like this:


Let's test this. The code below generates a deadlock.

USE master;
GO

-- Create a sample database
IF EXISTS (SELECT * FROM sys.databases WHERE name = 'DeadlockTestDB')
 DROP DATABASE DeadlockTestDB;
GO

CREATE DATABASE DeadlockTestDB;
GO


-- Add two tables and insert two records
USE DeadlockTestDB;
GO

CREATE TABLE Table01 (Id int PRIMARY KEY, Name varchar(200));
GO
INSERT INTO Table01 VALUES (1, 'Table01 value');
GO

CREATE TABLE Table02 (Id int PRIMARY KEY, Name varchar(200));
GO
INSERT INTO Table02 VALUES (100, 'Table02 value');
GO


-- Create two stored procedures.
-- This one updates a record in Table01, waits,
-- and updates Table02.
CREATE PROC UpdateTable01andTable02
AS
BEGIN

 BEGIN TRAN

  UPDATE Table01
   SET Name = 'Updated'
  WHERE Id = 1;

  WAITFOR DELAY '00:00:10';

  UPDATE Table02
   SET Name = 'Updated'
  WHERE Id = 100;

 COMMIT TRAN
END;
GO

-- This one updates a record in Table02 and Table01
CREATE PROC UpdateTable02andTable01
AS
BEGIN

 BEGIN TRAN

  UPDATE Table02
   SET Name = 'Updated - 2'
  WHERE Id = 100;


  UPDATE Table01
   SET Name = 'Updated - 2'
  WHERE Id = 1;

 COMMIT TRAN
END;
GO

/*

-- Open a new connection and run this code
USE DeadlockTestDB;
GO

EXEC UpdateTable01andTable02;
GO

-- Open another connection and run this.
-- Make sure this runs just after you start the above execution
USE DeadlockTestDB;
GO

EXEC UpdateTable02andTable01;
GO
*/

Once you have executed the code and received the deadlock error, open the Event Viewer and see. You should see something like this;


Thursday, March 19, 2015

I changed the Recovery Model before a BULK operation, is it a good practice?

I have already written a post on bulk operations, indicating the benefits we get by changing the recovery model before performing bulk operations. I got a couple of offline questions after my SQL Server Stored Procedure session at SS SLUG, and one question was this. I was talking about how the plan cache gets flushed, and I mentioned that changing the recovery model causes the plan cache to be flushed. Now the question is, should we change the recovery model for our bulk operation?

First of all, let's see whether changing the Recovery Model causes cached plans to be removed. Here is the code I wrote for testing.

I am executing two SELECT statements, one against AdventureWorks2014 and another against AdventureWorksDW2014. The last statement checks the plan cache.

-- EXECUTE STATEMENTS SEPARATELY

-- Cleaning buffer
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
GO

USE AdventureWorks2014;
GO

-- Query Sales.SalesOrderDetail table
SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1;


USE AdventureWorksDW2014;
GO

-- Query dbo.DimProduct
SELECT * FROM dbo.DimProduct
WHERE ProductKey = 1;

-- Checking cached plans
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE
 TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
 OR TEXT LIKE '%SELECT * FROM dbo.DimProduct%';

As we expected, we see two cached plans in the plan cache.



This code changes the Recovery Model of AdventureWorks2014. The second statement shows the plans in the cache after the change.

-- Change the Recovery Model
ALTER DATABASE AdventureWorks2014 SET RECOVERY Bulk_Logged;
GO

-- And see plans
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE
 TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
 OR TEXT LIKE '%SELECT * FROM dbo.DimProduct%';



As you see, the plan related to AdventureWorks2014 has been removed. This proves that changing the Recovery Model flushes that database's plans from the cache.

Now the question is, how advisable is it to do this in a production environment for bulk operations? My recommendation is: if it is a large bulk operation and the database is heavily accessed by users, the bulk operation should be done during off-peak hours. We still lose the cached plans, but this at least minimizes the damage. However, if the database is continuously being used by users, then it is advisable to perform bulk operations without changing the Recovery Model.


Tuesday, March 17, 2015

How to fill the plan cache efficiently - Optimize for ad hoc workloads option

It is good to keep all plans used for compiled queries in memory, as it helps to optimize query processing, allowing the engine to reuse a cached plan without recompiling the query for subsequent executions. But what if users frequently issue one-time adhoc queries that are rarely issued again? In that case, we should not waste the memory allocated to plans on them, and should hold a plan only if its query is executed again. How do we do that?

We can instruct SQL Server not to cache the full plan for an adhoc query at its first execution, but at the second execution, using a server-level setting. It is not recommended to change the default behavior, but if the database is open to a workload like the one described, it may offer some performance benefit for other queries, specifically for stored procedures.

This setting is called optimize for ad hoc workloads. By default it is disabled, and it can be enabled by passing 1 for this setting using sp_configure.

Here is the code for enabling it.

sp_configure 'optimize for ad hoc workloads', 1;
GO
RECONFIGURE;
GO

Now it is enabled. Let's run a query and see how it is cached.

-- Cleaning buffer
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
GO

-- Simple adhoc query with a filter
SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1;

-- Checking cached plans
-- We should see one record for above query
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE p.usecounts > 0 AND
TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
ORDER BY p.usecounts DESC;


As you see, there is a record indicating that the plan is cached, but the type is different. It means the plan has not been fully cached, and SQL Server will not be able to reuse it for the next execution. This small compiled plan stub helps to identify the query at the next execution; then the full plan is created and stored.
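
If you want to see the stub explicitly, the cacheobjtype column can be filtered (a small sketch):

-- Stubs are the placeholders created at the first execution of an adhoc query
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE p.cacheobjtype = 'Compiled Plan Stub';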

Execute the SELECT query again and see the plan cache.



It is cached. Execute again and see.



It is being reused now. This saves resources considerably, but enable it only if we are quite sure that the workload mostly produces dissimilar, one-off adhoc queries.

For more info on this setting, refer: https://msdn.microsoft.com/en-us/library/cc645587.aspx

Monday, March 16, 2015

Does SQL Server use the same cached plan for the same statement on subsequent executions?

When you send a request to SQL Server, your statement goes through four main steps, and generating a plan for it is part of them. The steps are: Parsing, Normalization, Compilation, and Optimization. Parsing makes sure the query syntax is correct and reforms the query for compilation. Normalization checks all the objects referenced in the query for existence; this is the step that throws errors such as "Invalid object name". Compilation generates multiple candidate plans for the query, and Optimization picks the best and places it into the cache.

Now the question is, once the plan is generated for the query, will it be reused on subsequent executions? Most of the time we expect the plan to be reused, but that is not always the case. Here is an example;

This adhoc statement queries the Sales.SalesOrderDetail table in AdventureWorks2014, filtering SalesOrderDetailID to 1. Do not execute the queries together; execute them one by one, without selecting the comments.

-- Cleaning buffer
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
GO

-- Simple adhoc query with a filter
SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1;

-- Checking cached plans
-- We should see one record for above query
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE p.usecounts > 0 AND
TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
ORDER BY p.usecounts DESC;


As you see, we have one cached plan for the executed query. Let's run the query again with the same filter and with some different filter values. Again, run these queries separately.

-- Executing the same
-- and with two different values
-- Run these SELECTs separately
SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1;

SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1000;

SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 100000;

-- Check cached plans again
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE p.usecounts > 0 AND
TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
ORDER BY p.usecounts DESC;


Can you see how the plan has been reused? SQL Server has reused the plan only for the first statement; for the other statements it has created new plans, even though the query is the same. How can this happen?

This is not an issue or a bug; this is how queries are compiled. When SQL Server receives a query, it checks the cache for plans by comparing the entire statement text. As you see, even though the statement shape is the same, it creates a new plan because the text does not match the previous one (because of the filter value). So keep this in mind: if the query is lengthy and complex, the time it takes to execute will be roughly the same whenever the filter values differ, even though you expect it to be optimized by reusing the plan. This is one of the reasons for recommending Stored Procedures for queries that are executed frequently with different filters.
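
If a stored procedure is not an option, parameterizing the statement with sp_executesql gives the same reuse, because the cached text stays identical for every value. Here is a sketch:

-- One plan is compiled for the parameterized text and reused for both executions
EXEC sp_executesql
 N'SELECT * FROM Sales.SalesOrderDetail WHERE SalesOrderDetailID = @Id',
 N'@Id int', @Id = 1;

EXEC sp_executesql
 N'SELECT * FROM Sales.SalesOrderDetail WHERE SalesOrderDetailID = @Id',
 N'@Id int', @Id = 1000;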


Sunday, March 15, 2015

SQL Server Brain Basher of the Week #003

This week's Brain Basher is based on date/time. See the code below and try to guess its output.

DECLARE @datetime datetime
SET @datetime = '01/01/00'
SELECT YEAR (@datetime)

SET @datetime = '01/01/30'
SELECT YEAR (@datetime)

SET @datetime = '01/01/40'
SELECT YEAR (@datetime)

SET @datetime = '01/01/50'
SELECT YEAR (@datetime)

What did you guess? Years 2000, 2030, 2040, and 2050? If so, it is wrong. Here is the output;


As you see, the last one is not 2050; it is 1950. This is based on a setting called two digit year cutoff. The setting represents the cutoff year for interpreting two-digit years as four-digit years, and its default value is 2049. This sets the default time span to 1950-2049, hence a value of 49 represents 2049 and a value of 50 represents 1950. As another example, a value of 99 represents 1999 and a value of 01 represents 2001.

It is not advisable to change this, but the setting can be changed using sp_configure.
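
For completeness, here is how it would be changed (a sketch; 2079 is just an example value):

-- 'two digit year cutoff' is an advanced option
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
GO

-- Interpret two-digit years as 1980-2079 instead of 1950-2049
EXEC sp_configure 'two digit year cutoff', 2079;
RECONFIGURE;
GO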




Saturday, March 14, 2015

Can we get the SQL Server service and Agent service automatically restarted when they stop unexpectedly?

We know that Windows services have an option to start automatically by configuring the Start Mode, but that applies only when Windows starts. What if you need SQL Server or SQL Server Agent to start again automatically when it stops unexpectedly?

This is possible. It can simply be set in the SQL Server Agent properties window, as below.


** Note that the second option, Auto restart SQL Server Agent if it stops unexpectedly, needs the SQL Server Agent service account to be a member of the local admin group of the server. However, this is not considered a good practice, hence an external monitoring tool such as System Center Operations Manager is recommended for monitoring.

Friday, March 13, 2015

How to create a simple database-level audit using SQL Server Audit

How do you implement "Auditing" with your SQL Server database? Do you know that SQL Server Audit is the best and simplest way of implementing it? Unfortunately, we still see the oldest, traditional implementations of auditing; the "Trigger" is the most common, though it cannot capture SELECT queries. In addition to that, custom code implementations are common too.

Although SQL Server Audit was introduced with SQL Server 2008, as I see it, it is still not popular among database administrators (or, more precisely, it is an unknown item to them). Auditing is required for many reasons, especially security, hence actions performed by users, regardless of the type of action, should be captured and logged. This post speaks about SQL Server Audit and how to implement it for capturing actions performed by users using TSQL, though it can be easily implemented using the GUIs given.

SQL Server Audit is the main auditing tool available with SQL Server. In addition to this, SQL Server offers a few more ways of auditing, such as C2 auditing, Common Criteria Compliance, triggers, SQL Server Profiler, and SQL Trace. However, they do not offer as many facilities as SQL Server Audit does.

SQL Server Audit supports both server-level and database-level auditing. All editions support server-level auditing, but only Enterprise, Developer, and Evaluation editions support database-level auditing.

SQL Server Audit is based on Extended Events, a lightweight eventing engine that has very little impact on the database being monitored. The Extended Events feature allows you to define an action for a specific event. When SQL Server executes internal code related to the specified event, it checks whether an action has been set for it; if so, it fires and sends details to the target. For more info on Extended Events, see this: https://msdn.microsoft.com/en-us/library/bb630282.aspx

Let's see how to create a simple audit using TSQL. In order to create an audit, the following steps have to be performed;

  1. Create an Audit with a name and target. Additional options such as ON_FAILURE can be set in this step too.
  2. Add an Audit Specification for the created Audit. Either a server or a database specification can be added.
  3. Add Actions or Action Groups to the created Audit Specification.
  4. Enable all added items.
That is all. Here is the code for implementing auditing on the Production schema of the AdventureWorks database for the SELECT action.

-- Create a SQL Server Audit and define its
-- target as the windows application log

USE master;
GO

CREATE SERVER AUDIT AdventureWorksLog
 TO APPLICATION_LOG
 WITH (QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);
GO


-- Create database audit specification
-- for the Production schema in AdventureWorks

USE AdventureWorks2014;
GO

CREATE DATABASE AUDIT SPECIFICATION ProductionSpecification
 FOR SERVER AUDIT AdventureWorksLog
 ADD (SELECT ON SCHEMA::Production BY PUBLIC);
GO

-- Query the sys.server_audits system view
-- Note the is_state_enabled

SELECT * FROM sys.server_audits;
GO

-- Enable the server audit and
-- AdventureWorks database audit specification

USE master;
GO

ALTER SERVER AUDIT AdventureWorksLog WITH (STATE = ON);
GO

USE AdventureWorks2014;
GO

ALTER DATABASE AUDIT SPECIFICATION ProductionSpecification
 WITH (STATE = ON);
GO

-- Generate an auditable event by querying a table
-- in the Production schema. Also execute a query
-- that should not be audited

SELECT * FROM Production.Product;
GO
SELECT * FROM Sales.Currency;
GO

Once the SELECT statements are executed, let's have a look at the Event Viewer. You should see the event related to the first statement logged.


For more info on SQL Server Audit, refer here: https://technet.microsoft.com/en-us/library/cc280505(v=sql.105).aspx

Thursday, March 12, 2015

Is the number of records in a Non-Clustered Index leaf level the same as the number of records in the table?

This is one of the questions I got last week. To make it clearer, the question is: when we have 1,000,000 records in a clustered-structure table, are we supposed to see 1,000,000 records in the leaf level of all non-clustered indexes, even when the index key holds duplicates?

The answer is yes. Here is simple code showing it. This table has 7,301,921 records, and the leaf level of the clustered index holds 7,301,921 records too.

SELECT COUNT(*) FROM dbo.InternetSales;

SELECT index_id, partition_number, index_type_desc, index_level, record_count
FROM sys.dm_db_index_physical_stats (DB_ID('SalesDB')
 ,OBJECT_ID('dbo.InternetSales'), NULL, NULL, 'DETAILED');


Now, let's create an index on OrderDateKey which contains duplicates.


If I execute the following code, it will show that the number of records in both indexes is the same.

CREATE INDEX IX_Sales_OrderDateKey ON dbo.InternetSales (OrderDateKey);
GO

SELECT index_id, partition_number, index_type_desc, index_level, record_count
FROM sys.dm_db_index_physical_stats (DB_ID('SalesDB')
 ,OBJECT_ID('dbo.InternetSales'), NULL, NULL, 'DETAILED');


Sunday, March 8, 2015

SQL Server Brain Basher of the Week #002

Here is the Brain Basher of this week. As you know, if you have multiple objects to be deleted, for example 3 tables, we write three statements as below.

DROP TABLE dbo.T1;
DROP TABLE dbo.T2;
DROP TABLE dbo.T3;

Can we write one statement or one command to do this?

Yes, it is possible, as long as they are the same type of objects. All we have to do is write the command with the object names separated by commas. Here is an example.

DROP TABLE dbo.T1, dbo.T2, dbo.T3;

Have a good week ahead!


Saturday, March 7, 2015

How to change the data type of a column of a large table

Here is another common question that appears in forums: how to change the data type of a column, specifically in a large table. If you search for it, the most common answer is handling it with a new column (which is the 3rd method in my code). However, from situation to situation things can be different, hence I tested a few ways of changing the type with a table that contains 7 million records.

Here are the 3 ways I tried. The code uses 3 tables called dbo.Sales_1, dbo.Sales_2, and dbo.Sales_3. The structure and the number of records are the same for all 3 tables. Here is the test I did.

-- Method 1
-- ========

-- Check the fragmentation of the table
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_1'), null, null, 'detailed');
-- Result
-- External Fragmentation - 0.3% (Good)
-- Internal Fragmentation - 98% (Good)

-- Changing the column type of ProductKey from int to bigint
ALTER TABLE dbo.Sales_1
 ALTER COLUMN ProductKey bigint not null;
-- Result
-- Time spent: 4:47

-- Checking fragmentation again
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_1'), null, null, 'detailed');
-- Result
-- External Fragmentation - 99% (Bad)
-- Internal Fragmentation - 51% (Bad)

-- Rebuilding the index
ALTER INDEX PK__Sales_1 ON dbo.Sales_1 REBUILD;
-- Result
-- Time spent: 3:43

-- **
-- Total time spent: 8:30
-- **



-- Method 2
-- ========

-- Check the fragmentation of the table
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_2'), null, null, 'detailed');
-- Result
-- External Fragmentation - 0.3% (Good)
-- Internal Fragmentation - 98% (Good)

-- Rebuilding the index with fillfactor 80
ALTER INDEX PK__Sales_2 ON dbo.Sales_2 REBUILD WITH (FILLFACTOR = 80);
GO
-- Result
-- Time spent: 1:38

-- Changing the column type of ProductKey from int to bigint
ALTER TABLE dbo.Sales_2
 ALTER COLUMN ProductKey bigint not null
-- Result
-- Time spent: 1:13

-- Check for fragmentation
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_2'), null, null, 'detailed');
-- Result
-- External Fragmentation - 0.1% (Good)
-- Internal Fragmentation - 85% (Good)

-- **
-- Total time spent: 2:51
-- **



-- Method 3
-- ========

-- Check the fragmentation of the table
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_3'), null, null, 'detailed');
-- Result
-- External Fragmentation - 0.3% (Good)
-- Internal Fragmentation - 98% (Good)

-- Add a new column with the required data type: bigint
-- Setting it as a nullable column
ALTER TABLE dbo.Sales_3
 ADD NewProductKey bigint null

-- Changing the recovery model
ALTER DATABASE TestDatabase SET RECOVERY Bulk_Logged;

-- Updating the new column with the old column value
DECLARE @Count int = 1

WHILE (@Count != 0)
BEGIN

 UPDATE TOP (500000) dbo.Sales_3 WITH (TABLOCK)
  SET NewProductKey = ProductKey
 WHERE NewProductKey IS NULL

 SET @Count = @@ROWCOUNT
END
-- Result
-- Time spent: 1:23

-- Drop the old column
ALTER TABLE dbo.Sales_3
 DROP COLUMN ProductKey

-- Rename the new column as the old column
EXEC sp_rename 'dbo.Sales_3.NewProductKey', 'ProductKey', 'COLUMN';

-- Alter the current column as non-nullable
ALTER TABLE dbo.Sales_3
 ALTER COLUMN ProductKey bigint NOT NULL
-- Result
-- Time spent: 0.03

-- Check the fragmentation of the table
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_3'), null, null, 'detailed');
-- Result
-- External Fragmentation - 99% (Bad)
-- Internal Fragmentation - 51% (Bad)


ALTER INDEX PK__Sales_3 ON dbo.Sales_3 REBUILD;
-- Result
-- Time spent: 2:55

-- Change back the recovery model
ALTER DATABASE TestDatabase SET RECOVERY Full;

-- **
-- Total time spent: 4:21
-- **

Here is the summary of it:

Method                                  Time Spent   Comments
1 - Just alter the column               8:30         Takes a long time
2 - Alter the column with fill-factor   2:51         Fastest, probably the best
3 - With a new column                   4:21         Good, but needs extra work

My preference is for the second method (altering the column after rebuilding the index with a fill factor) as it gives the best performance and I do not see any issue with it.
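If you want to check the fill factor currently set on an index before deciding between these methods, sys.indexes exposes it; here is a minimal sketch against one of the test tables used above.

-- Fill factor of the indexes on dbo.Sales_2 (0 means the default, i.e. 100)
SELECT name, type_desc, fill_factor
FROM sys.indexes
WHERE object_id = OBJECT_ID('dbo.Sales_2');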

Thursday, March 5, 2015

How to load/insert large amount of data efficiently

Importing data from other sources is a very common operation with most database solutions, as not all data can be entered row-by-row. When a large amount of data needs to be imported, we always need to consider the constraints added to the table and minimal logging, expecting improved performance.

SQL Server supports a set of tools for importing data: the Import and Export Wizard, bcp (Bulk Copy Program), BULK INSERT, and OPENROWSET (BULK). Preference always goes to either bcp or BULK INSERT; however, to improve their performance, some of the options have to be considered and set. This post explains how to load a large amount of data using bcp with various options.

For testing the best way, I created a database called TestDatabase and a table called dbo.Sales. I prepared a text file called Sales.txt that has 7,301,921 records matching the dbo.Sales table structure. The Recovery Model of the database is initially set to Full.

Then I loaded the table with the following bcp commands. Note that each and every command was run after truncating and shrinking the log file.

-- Loading without any specific option
bcp TestDatabase.dbo.Sales in C:\Temp\Sales.txt -S DINESH-PC\SQL2014,8527 -T -c

-- Loading with the TABLOCK option, forcing a lock on the entire table
bcp TestDatabase.dbo.Sales in C:\Temp\Sales.txt -S DINESH-PC\SQL2014,8527 -T -c -h "TABLOCK"

-- Loading with our own batch size
bcp TestDatabase.dbo.Sales in C:\Temp\Sales.txt -S DINESH-PC\SQL2014,8527 -T -c -b 3000000

-- Loading with our own batch size and TABLOCK
bcp TestDatabase.dbo.Sales in C:\Temp\Sales.txt -S DINESH-PC\SQL2014,8527 -T -c -b 3000000 -h "TABLOCK"

Once the table had been loaded four times, it was loaded again with the Bulk Logged recovery model. All commands were executed just like before, making sure that the log was truncated and shrunk before each execution.

Finally here is the result.

Command                       Recovery Model   Time (ms)   Log file size after loading
With no specific option       Full             189,968     3,164,032 KB
With TABLOCK                  Full             275,500     1,475,904 KB
With batch size               Full             108,500     2,377,088 KB
With batch size and TABLOCK   Full              99,609     1,964,480 KB
With no specific option       Bulk Logged      140,422     3,164,032 KB
With TABLOCK                  Bulk Logged      239,938        26,816 KB
With batch size               Bulk Logged      121,828     2,377,088 KB
With batch size and TABLOCK   Bulk Logged       86,422     1,475,904 KB

As you can see, the best speed comes from setting the batch size together with TABLOCK under the Bulk Logged recovery model. If log file growth is the main concern, then the TABLOCK option alone with the Bulk Logged recovery model is the best.

I believe that BULK INSERT offers similar results; however, it is yet to be tested in the same way.
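For reference, here is roughly what the equivalent BULK INSERT statement would look like. This is only a sketch assuming the same file and table; the field and row terminators are assumptions matching character-mode (-c) bcp defaults, and it has not been timed like the commands above.

-- Equivalent BULK INSERT with batch size and TABLOCK (untested sketch)
BULK INSERT TestDatabase.dbo.Sales
FROM 'C:\Temp\Sales.txt'
WITH (
	FIELDTERMINATOR = '\t',   -- assumption: tab-delimited, as produced by bcp -c
	ROWTERMINATOR = '\n',     -- assumption: newline row terminator
	BATCHSIZE = 3000000,      -- same batch size used with bcp -b
	TABLOCK                   -- same table-level lock as the -h "TABLOCK" hint
);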

Wednesday, March 4, 2015

SSRS Multi Value Parameters with MDX and checking whether all values are selected

It is a very common question in many forums: how can we check whether all values in the parameter are selected, and how can we add the selected values to a dynamic query that will be used for loading the main dataset? I once wrote something similar for SQL; here is the link if you need it: http://dinesql.blogspot.com/2011/09/reporting-services-parameters-adding.html

What if you need to generate a dynamic MDX statement for your main dataset and need to get values from a parameter, which is formed using a dimension, into your query? I have seen many different implementations of this: some have used another hidden parameter holding the same dataset and compared the selection using the IIf function to get the right set; some have used the CountRows function to compare the number of records in the dataset with the number of selected values in the parameter. Most of these methods work fine, but there is another, easier way of doing it.

If you have built the MDX for the main dataset using the designer, then you know how parameters are handled with it. The designer uses the StrToSet MDX function for getting values from multi-value parameters. This is the easiest way of handling it, and it can be used with a dynamic MDX statement too.

Here is an example:

Assume that parameters are loaded using your own query. In that case, you need to use code like the one below for loading dimension members.

// There are multiple ways of loading values to parameters
// Either one is okay

// 1.
// This query returns two columns; one for the value and one for the label
WITH
	MEMBER [Measures].[ParameterLabel] AS [Dim PoP].[PoP].CurrentMember.Member_Caption
	MEMBER [Measures].[ParameterValue] AS [Dim PoP].[PoP].CurrentMember.UniqueName
SELECT
	{[Measures].[ParameterLabel], [Measures].[ParameterValue]} ON 0
	, [Dim PoP].[PoP].[Pop].Members ON 1
FROM [OLAP_Voice];
GO


// 2.
// This query returns only one column
// You need to add two additional calculated fields in the dataset
// 1. Name - ParameterLabel, Value - =Fields!Product.Value
// 2. Name - ParameterValue, Value - =Fields!Product.UniqueName

SELECT {} ON 0
	, [Dim PoP].[PoP].[Pop].Members
	DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON 1
FROM OLAP_Voice;

Now, whether all values are selected, multiple values are selected, or only one value is selected, the selected value(s) can easily be used with your main query through StrToSet, without any IIf checks. This requires two steps:

1. Add a parameter to the main dataset as shown below.

2. Include the parameter with the StrToSet function wherever you need the values from the parameter.

  1. " WITH " &   
  2. "  MEMBER [Measures].[Measure1] " &   
  3. "   AS ..... " &   
  4. "  MEMBER [Measures].[Measure2] " &   
  5. "   AS ..... " &   
  6.   
  7. "  SET [MainSet] " &  
  8. "   AS NonEmptyCrossjoin({[Dim Geography].[Country].[Country].Members} " &  
  9. "     , StrToSet(@Product) " & // no need to check whether one or many have been selected  
  10. "    , { [Dim Department].[Department].[Department].Members} " &  
  11.   
  12.   
  13. " SELECT " &   
  14. "  {[Measures].[Measure1], [Measures].[Measure2]} ON 0 " &   
  15. "  , {MainSet} ON 1 " &   
  16.   
  17. " FROM   Cube;"  


For more info on StrToSet, read this: https://msdn.microsoft.com/en-us/library/ms144782.aspx




Monday, March 2, 2015

Free Microsoft SQL Server preparation courses at Virtual Academy for exams 464 and 465

Microsoft Virtual Academy has added two new preparation courses for SQL Server exams. These two courses cover Developing Microsoft SQL Server Databases (464) and Designing Database Solutions for SQL Server (465). If you do not like to read articles or books, here is the way of learning the areas related to these two exams; all courses are built with videos :).

  1. Developing Microsoft SQL Server Databases (464) - 6 modules, approximately 7 hours.
  2. Designing Database Solutions for SQL Server (465) - 5 modules, approximately 6 hours.
In addition to these two, a few more preparation courses related to other SQL Server exams are available.

Sunday, March 1, 2015

SQL Server Brain Basher of the Week #001

I have been doing a series of sessions called SQL Server Brain Basher at the SQL Server Sri Lanka User Group (https://www.facebook.com/groups/sqlserveruniverse/) that covers tricky questions we cannot answer immediately unless we have excelled in the related area. I thought to post at least one question I have used before (and am planning to use in the future) as a blog post every week; hence, here is the first one, which is a fairly easy one.

What would be result of following query?

SELECT 100, 200 '300';

If you are familiar with adding aliases to columns, you can answer immediately. This query returns two columns: the first column with a value of 100 and no column name, and the second with a value of 200 and a column name of 300.



A column alias can be used to relabel a column if required. This is really useful if you need a different name for a column, without using the one coming from the table, or if you need to name calculated columns. There are multiple ways of adding column aliases; here are some of them:

-- Column alias using the AS keyword
SELECT 100 AS Column1, 200 AS 'Column2';

-- Column alias with an equal sign
SELECT Column1 = 100, 'Column2' = 200;

-- Column alias following the column name
SELECT 100 Column1, 200 'Column2';