Thursday, March 26, 2015

My application needs a new sequence number for the table without inserting a record

This is something I discussed with one of my colleagues. It was all about obtaining the next sequence number for an Identity-type column without inserting a record, using it for other entities inside the application, and later submitting everything to the database for insertion. If this is the requirement, it is not possible with Identity, because the next value cannot be obtained from an Identity column without inserting a record. This issue is often seen with Entity Framework enabled applications.

What would be the best way of handling this? The traditional way is maintaining another table that holds key values for tables and using it for issuing the next available values. That needs additional coding and extra effort, and it introduces unexpected locking issues too. The modern, and best, way of handling this is the SEQUENCE object, which was introduced with SQL Server 2012.

A SEQUENCE is a user-defined schema-bound object that generates sequence numbers based on the settings specified. Multiple properties can be set, such as the starting number, the increment, and the order. Unlike Identity, a defined SEQUENCE object can be used with any table because there is no link between a SEQUENCE object and a table. Although one SEQUENCE object can be used with all tables, it is advisable to create multiple SEQUENCE objects, one per table, because that makes maintenance easy and minimizes gaps between the numbers added to tables.

Here is the way of creating a SEQUENCE object.

CREATE SEQUENCE Sales.OrderSequence
    START WITH 1
    INCREMENT BY 1;
GO

Once created, the application can request the next number, which can be used as the key of the table, by calling the command below;

SELECT NEXT VALUE FOR Sales.OrderSequence;
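
If the application needs a block of numbers at once, for example for a set of new entities submitted together, a whole range can be reserved with a single call using sp_sequence_get_range. Here is a minimal sketch using the Sales.OrderSequence object created above; the range size of 10 is just an assumption.

DECLARE @FirstValue sql_variant;

EXEC sys.sp_sequence_get_range
    @sequence_name = N'Sales.OrderSequence',
    @range_size = 10,
    @range_first_value = @FirstValue OUTPUT;

-- First number of the reserved block; the application can use this
-- value and the next nine without another round trip.
SELECT CONVERT(bigint, @FirstValue) AS FirstValueOfRange;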

For more info on SEQUENCE object, refer: https://msdn.microsoft.com/en-us/library/ff878091.aspx



Tuesday, March 24, 2015

I declared an nvarchar(max) variable but it holds only 4000 characters

Although answering questions from SQL Server enthusiasts keeps me busy, it brushes up my knowledge significantly. This is one of the questions I got today.

If a TSQL statement has to be built dynamically, the most common approach for forming it is declaring a variable with either the varchar(max) or nvarchar(max) data type and appending the necessary string values to it. If you are a person who always maintains strict discipline in coding format, you will form the statement properly, but if you are a bit lazy and form the statement without considering the format much, you may face the issue mentioned in the title.

Let me explain how this can happen. Have a look at the code below.

DECLARE @Sql nvarchar(max)

SET @Sql = N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,  This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters'
   
SELECT LEN(@Sql);
-- Result - 5111

If you execute the code, you get the result 5111, which shows that all characters were added to the @Sql variable. Let's inject an additional literal between these lines (that is how we form dynamic SQL statements).

DECLARE @Sql nvarchar(max)

SET @Sql = N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, '
   
   + N' ' +

   N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,  This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters'
   
SELECT LEN(@Sql);
-- Result - 4000

If you execute the code now, the result is not what we expected; it shows the count as 4000 instead of 5112. Obviously, the reason is truncation, but how it happened is the question.

The reason is entirely based on the logic used for string concatenation. In our second code, we concatenate three Unicode literals. When literals or expressions are concatenated, each one that is less than 4000 characters (8000 for non-Unicode) is treated as nvarchar(n), where n is the number of characters. Only a value containing more than 4000 characters is treated as nvarchar(max). Considering this logic, all three of our literals are treated as nvarchar(n), hence the concatenation is something like below;

@SQL = nvarchar(n) + nvarchar(n) + nvarchar(n)

Since no literal is treated as nvarchar(max), the result of the expression is limited to 4000 characters, and the value is truncated.
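
The same typing rule can be demonstrated with a much shorter repro; this is just a sketch of the rule, not related to dynamic SQL itself.

-- Both operands are short, so the expression is typed as nvarchar(n)
-- and the concatenation result is capped at 4000 characters.
DECLARE @Short nvarchar(max) = REPLICATE(N'a', 3000) + REPLICATE(N'b', 3000);
SELECT LEN(@Short);  -- 4000, not 6000
GO

-- Converting one operand to nvarchar(max) promotes the whole expression.
DECLARE @Long nvarchar(max) = CONVERT(nvarchar(max), REPLICATE(N'a', 3000)) + REPLICATE(N'b', 3000);
SELECT LEN(@Long);   -- 6000
GO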

How do we overcome this situation? There are two easy ways of writing this properly. One is concatenating the strings using separate SET statements. This makes sure that each literal is always concatenated with an nvarchar(max) variable, hence the result is nvarchar(max). The second way is converting the literals explicitly to nvarchar(max) before concatenating. Here is the sample code.

DECLARE @Sql nvarchar(max)

SET @Sql = N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, '
   
SET @Sql += N' ' 

SET @Sql += N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,  This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters'
   
SELECT LEN(@Sql);
-- Result - 5107
GO

DECLARE @Sql nvarchar(max)

SET @Sql = CONVERT(nvarchar(max), N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, ')
   
   + CONVERT(nvarchar(max), N' ') +

   CONVERT(nvarchar(max), N'This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, 
   This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters,  This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters, This value contains 5111 characters')
   
SELECT LEN(@Sql);
-- Result - 5107

Now you know the reason for this, so do not make this mistake when forming statements dynamically.

Sunday, March 22, 2015

SQL Server Brain Basher of the Week #004

Here is another tricky question. It is on indexing.

Can we create an index without mentioning what column(s) to be used?

Obviously, we would expect the answer to be No, because an index requires one or more columns to organize itself around; the index key. But this is possible with SQL Server 2014 :).

How is this possible? Earlier we had two types of indexes: Clustered and Non-Clustered. With SQL Server 2012, a new classification was introduced: Rowstore indexes and Columnstore indexes. A rowstore index represents the indexes we have always created, organized by rows (or records), and a columnstore index represents a new way of creating indexes on columns. Only non-clustered columnstore indexes were supported in SQL Server 2012, and they require column(s) to be specified.

Though 2012 does not support creating clustered columnstore indexes, SQL Server 2014 allows creating them. Once a table has a clustered columnstore index, no other indexes are possible on the same table. A clustered columnstore index does NOT require a column (or columns) to be specified, because it always covers all columns of the table, hence we can create an index without specifying columns. So, the answer to the above question is Yes.

Here is an example for Clustered Columnstore Index.

CREATE CLUSTERED COLUMNSTORE INDEX IX_Sales
 ON Sales;
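
To see it end to end, here is a sketch with an assumed table; note that no column list appears in the CREATE INDEX statement because the index always covers all columns of the table.

CREATE TABLE dbo.SalesTest (OrderDateKey int, ProductKey int, SalesAmount money);
GO

CREATE CLUSTERED COLUMNSTORE INDEX IX_SalesTest ON dbo.SalesTest;
GO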



Saturday, March 21, 2015

How to get system/error messages with severity level lower than 19 logged to the Application Log

Have you noticed that SQL Server does not send all messages to the Application Log? With the default setting, messages with severity levels between 19 and 25 are generally written automatically, but not all messages are. The best way to check which messages get logged is the sys.messages catalog view. It holds all messages, and its is_event_logged column indicates whether a message will be logged or not.

Now, what if you need a message logged which does not get logged automatically? A good example is the message related to deadlocks. If you check this message, you will see that its is_event_logged is not set to 1.

SELECT * FROM sys.messages 
WHERE language_id = 1033 AND severity = 13;


If required, logging can be enabled for a message using the sp_altermessage stored procedure. Here is the way of doing it.

EXECUTE sp_altermessage 1205, 'WITH_LOG', 'True';

Now it is enabled. If you run the first code again, result will be just like this:


Let's test this. The code below generates a deadlock.

USE master;
GO

-- Create a sample database
IF EXISTS (SELECT * FROM sys.databases WHERE name = 'DeadlockTestDB')
 DROP DATABASE DeadlockTestDB;
GO

CREATE DATABASE DeadlockTestDB;
GO


-- Add two tables and insert two records
USE DeadlockTestDB;
GO

CREATE TABLE Table01 (Id int PRIMARY KEY, Name varchar(200));
GO
INSERT INTO Table01 VALUES (1, 'Table01 value');
GO

CREATE TABLE Table02 (Id int PRIMARY KEY, Name varchar(200));
GO
INSERT INTO Table02 VALUES (100, 'Table02 value');
GO


-- Create two stored procedures.
-- This one updates a record in Table01, waits, 
-- and updates Table02.
CREATE PROC UpdateTable01andTable02
AS
BEGIN

 BEGIN TRAN

  UPDATE Table01
   SET Name = 'Updated'
  WHERE Id = 1;

  WAITFOR DELAY '00:00:10';

  UPDATE Table02
   SET Name = 'Updated'
  WHERE Id = 100;

 COMMIT TRAN
END;
GO

-- This one updates a record in Table02 and Table01
CREATE PROC UpdateTable02andTable01
AS
BEGIN

 BEGIN TRAN

  UPDATE Table02
   SET Name = 'Updated - 2'
  WHERE Id = 100;


  UPDATE Table01
   SET Name = 'Updated -2'
  WHERE Id = 1;

 COMMIT TRAN
END;
GO

/*

-- Open a new connection and run this code
USE DeadlockTestDB;
GO

EXEC UpdateTable01andTable02;
GO

-- Open another connection and run this.
-- Make sure this runs just after you start the above execution
USE DeadlockTestDB;
GO

EXEC UpdateTable02andTable01;
GO
*/

Once you have executed the code and received the deadlock error, open the Event Viewer and see. You should see something like this;
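
If you do not want deadlock messages logged permanently, the change can be reverted the same way once testing is done.

EXECUTE sp_altermessage 1205, 'WITH_LOG', 'False';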


Thursday, March 19, 2015

I changed the Recovery Model before a BULK operation; is it a good practice?

I have already written a post on bulk operations, indicating the benefits we get by changing the recovery model before performing them. I got a couple of offline questions after my SQL Server Stored Procedure session at SS SLUG, and this was one of them. I was talking about how the plan cache gets flushed, and I mentioned that changing the recovery model causes the plan cache to be flushed. Now the question is, should we change the recovery model for our bulk operation?

First of all, let's see whether changing the Recovery Model causes cached plans to be removed. Here is the code I wrote for testing.

I am executing two SELECT statements, one against AdventureWorks2014 and another against AdventureWorksDW2014. The last statement checks the plan cache.

-- EXECUTE STATEMENT SEPARATELY

-- Cleaning buffer
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
GO
  
USE AdventureWorks2014;
GO

-- Query Sales.SalesOrderDetail table
SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1;
  

USE AdventureWorksDW2014;
GO

-- Query dbo.DimProduct
SELECT * FROM dbo.DimProduct
WHERE ProductKey = 1;

-- Checking cached plans
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE 
 TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
 OR TEXT LIKE '%SELECT * FROM dbo.DimProduct%';

As we expected, we see two cached plans in the plan cache.



This code changes the Recovery Model of AdventureWorks2014. The second statement shows the plans in the cache after the change.

-- Change the Recovery Model
ALTER DATABASE AdventureWorks2014 SET RECOVERY Bulk_Logged;
GO

-- And see plans
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE 
 TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
 OR TEXT LIKE '%SELECT * FROM dbo.DimProduct%';



As you see, the plan related to AdventureWorks2014 has been removed while the AdventureWorksDW2014 plan remains. This proves that changing the Recovery Model flushes the cached plans of that database.

Now the question is, how advisable is it to do this in a production environment for bulk operations? My recommendation is: if it is a large bulk operation and the database is heavily accessed by users, the bulk operation should be done during off-peak hours. We still lose cached plans, but this at least minimizes the damage. However, if the database is continuously being used by users, then it is advisable to perform bulk operations without changing the Recovery Model.
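
If you do change it, remember to switch the Recovery Model back once the bulk operation completes. The sketch below assumes AdventureWorks2014 and a local backup path; the log backup taken afterwards restarts the log chain so that point-in-time recovery is possible again.

ALTER DATABASE AdventureWorks2014 SET RECOVERY FULL;
GO

BACKUP LOG AdventureWorks2014 TO DISK = 'C:\Temp\AdventureWorks2014.trn';
GO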


Tuesday, March 17, 2015

How to fill the plan cache efficiently - Optimize for ad hoc workloads option

It is good to keep the plans of all compiled queries in memory, as that optimizes query processing, allowing the engine to reuse a cached plan without recompiling the query on subsequent executions. But what if users frequently issue one-time adhoc queries that are very rarely issued again? In that case, we should not waste plan-cache memory on them, and should hold a plan only if its query is executed again. How do we do that?

We can instruct SQL Server not to cache the full plan for an adhoc query at the first execution, but only at the second execution, using a server-level setting. It is not recommended to change the default behavior, but if the database is exposed to a similar workload, this may offer some performance benefit for other queries, specifically stored procedures.

This setting is called optimize for ad hoc workloads. By default it is disabled, and it can be enabled by passing 1 for the setting using sp_configure.

Here is the code for enabling it.

sp_configure 'optimize for ad hoc workloads', 1;
GO
RECONFIGURE;
GO
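
Note that optimize for ad hoc workloads is an advanced option; if sp_configure rejects it, enable advanced options first and then retry the command above.

EXEC sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO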

Now it is enabled. Let's run a query and see how it is cached.

-- Cleaning buffer
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
GO
 
-- Simple adhoc query with a filter
SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1;
 
-- Checking cached plans
-- We should see one record for above query
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE p.usecounts > 0 AND
TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
ORDER BY p.usecounts DESC;


As you see, there is a record indicating that the plan is cached, but the type is different. It means that the plan has not been fully cached, and SQL Server will not be able to reuse it for the next execution. This small compiled plan stub helps identify the query at the next execution, at which point the plan is recreated and stored.

Execute the SELECT query again and see the plan cache.



It is cached. Execute again and see.



It is being reused now. This saves resources significantly, but enable it only if you are sure that you mostly receive dissimilar adhoc queries.

For more info on this setting, refer: https://msdn.microsoft.com/en-us/library/cc645587.aspx

Monday, March 16, 2015

Does SQL Server use the same cached plan for the same statement on subsequent executions?

When you send a request to SQL Server, your statement goes through four main steps, and generating a plan is part of them. These steps are called Parsing, Normalization, Compilation, and Optimization. Parsing makes sure the query syntax is correct and reforms the query for compilation. Normalization checks all the objects referenced in the query for existence; this is the step that throws errors such as 'Invalid object name'. Compilation generates multiple candidate plans for the query, and Optimization picks the best one and places it into the plan cache.

Now the question is, once the plan is generated for a query, will it be reused on subsequent executions? Most of the time we expect the plan to be reused, but that is not always the case. Here is an example;

This adhoc statement queries the Sales.SalesOrderDetail table in AdventureWorks2014, filtering SalesOrderDetailID to 1. Do not execute the queries together; execute them one by one without selecting the comments.

-- Cleaning buffer
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
GO

-- Simple adhoc query with a filter
SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1;

-- Checking cached plans
-- We should see one record for above query
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE p.usecounts > 0 AND
TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
ORDER BY p.usecounts DESC;


As you see, we have one cached plan for the executed query. Let's run the query again with the same filter and with some different filter values. Again, run these queries separately.

-- Executing the same 
-- and with two different values
-- Run these SELECT separately
SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1;

SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1000;

SELECT * FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 100000;

-- Check cached plans again
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE p.usecounts > 0 AND
TEXT LIKE '%SELECT * FROM Sales.SalesOrderDetail%'
ORDER BY p.usecounts DESC;


Can you see how the plan has been reused? SQL Server has reused the plan only for the first statement; for the other statements, it has created new plans even though the query is the same. How can this happen?

This is not an issue or a bug; this is how queries are compiled. When SQL Server receives a query, it checks the cache for plans by comparing the entire statement text. As you see, even though the statement is essentially the same, a new plan is created because the text does not match the previous one (because of the filter value). So keep this in mind: if the query is lengthy and complex, the compilation cost is paid again every time the filter value changes, even though you expect it to be optimized by reusing the plan. This is one of the reasons for recommending stored procedures for queries that are executed frequently with different filters.
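
If a stored procedure is not an option, parameterizing the ad hoc statement with sp_executesql gives the same reuse; here is a minimal sketch against the table used above. The statement text stays constant while only the parameter value changes, so a single plan can be reused.

EXEC sp_executesql
    N'SELECT * FROM Sales.SalesOrderDetail WHERE SalesOrderDetailID = @Id',
    N'@Id int',
    @Id = 1000;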


Sunday, March 15, 2015

SQL Server Brain Basher of the Week #003

This week's Brain Basher is based on date/time. Look at the code below and try to guess its output.

DECLARE @datetime datetime
SET @datetime = '01/01/00'
SELECT YEAR (@datetime)

SET @datetime = '01/01/30'
SELECT YEAR (@datetime)

SET @datetime = '01/01/40'
SELECT YEAR (@datetime)

SET @datetime = '01/01/50'
SELECT YEAR (@datetime)

What did you guess? Years 2000, 2030, 2040, and 2050? If so, that is wrong. Here is the output;


As you see, the last one is not 2050; it is 1950. This is based on a setting called two digit year cutoff. The setting represents the cutoff year for interpreting two-digit years as four-digit years, and its default value is 2049. This sets the default time span as 1950-2049, hence a value of 49 represents 2049 and a value of 50 represents 1950. As other examples, a value of 99 represents 1999 and a value of 01 represents 2001.

It is not advisable to change this, but the setting can be changed using sp_configure.
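
For reference, here is a sketch of viewing and changing it; two digit year cutoff is an advanced option, and the value 2050 below is only an illustration.

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
GO

-- View the current value
EXEC sp_configure 'two digit year cutoff';

-- Change it (illustration only; the default of 2049 is usually best)
EXEC sp_configure 'two digit year cutoff', 2050;
RECONFIGURE;
GO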




Saturday, March 14, 2015

Can we get the SQL Server service and Agent service automatically restarted after an unexpected stop?

We know that Windows services have an option to start automatically by configuring the Start Mode, but that only applies when Windows starts. What if you need SQL Server or SQL Server Agent started automatically when it stops unexpectedly?

This is possible. It can simply be set in the SQL Server Agent properties window as below.


** Note that the second option, Auto restart SQL Server Agent if it stops unexpectedly, needs the SQL Server Agent service account to be a member of the local Administrators group of the server. However, this is not considered a good practice, hence an external monitoring tool such as System Center Operations Manager is recommended for monitoring.

Friday, March 13, 2015

How to create a simple database-level audit using SQL Server Audit

How do you implement "Auditing" with your SQL Server database? Do you know that SQL Server Audit is the best and simplest way of implementing it? Unfortunately, we still see the oldest, traditional implementations of auditing; the "Trigger" is the most common, though it cannot capture SELECT queries. In addition to that, custom code implementations are common too.

Although SQL Server Audit was introduced with SQL Server 2008, as I see it, it is still not popular among database administrators (or, more precisely, it is an unknown item to them). Auditing is required for many reasons, especially security, hence actions performed by users, regardless of the type of action, should be captured and logged. This post speaks about "SQL Server Audit" and how to implement it for capturing actions performed by users using TSQL, though it can also be easily implemented using the GUIs provided.

SQL Server Audit is the main auditing tool available with SQL Server. In addition to it, SQL Server offers a few more ways of auditing, such as C2 auditing, Common Criteria Compliance, triggers, SQL Server Profiler, and SQL Trace. However, they do not offer as many facilities as SQL Server Audit does.

SQL Server Audit supports both server-level and database-level auditing. All editions support server-level auditing, but only the Enterprise, Developer, and Evaluation editions support database-level auditing.

SQL Server Audit is based on Extended Events, a lightweight eventing engine that has very little impact on the database being monitored. The Extended Events feature allows you to define an action for a specific event. When SQL Server executes internal code related to the specified event, it checks whether an action has been set for it and, if so, fires it and sends the details to the target. For more info on Extended Events, see this: https://msdn.microsoft.com/en-us/library/bb630282.aspx

Let's see how to create a simple audit using TSQL. In order to create an audit, the following steps have to be performed;

  1. Create an Audit with a name and a target. Additional options such as ON_FAILURE can be set in this step too.
  2. Add an Audit Specification for the created Audit. Either a server or a database specification can be added.
  3. Add Actions or Action Groups to the created Audit Specification.
  4. Enable all added items.
That is all. Here is the code for implementing auditing of the SELECT action on the Production schema of the AdventureWorks database.

-- Create a SQL Server Audit and define its 
-- target as the windows application log

USE master;
GO

CREATE SERVER AUDIT AdventureWorksLog
 TO APPLICATION_LOG
 WITH (QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);
GO


-- Create a database audit specification
-- for the Production schema in AdventureWorks

USE AdventureWorks2014;
GO

CREATE DATABASE AUDIT SPECIFICATION ProductionSpecification
 FOR SERVER AUDIT AdventureWorksLog
 ADD (SELECT ON SCHEMA::Production BY PUBLIC);
GO

-- Query the sys.server_audits system view
-- Note the is_state_enabled

SELECT * FROM sys.server_audits;
GO

-- Enable the server audit and 
-- AdventureWorks database audit specification

USE master;
GO

ALTER SERVER AUDIT AdventureWorksLog WITH (STATE = ON);
GO

USE AdventureWorks2014;
GO
 
ALTER DATABASE AUDIT SPECIFICATION ProductionSpecification
 WITH (STATE = ON);
GO

-- Generate an auditable event by querying a table
-- in the Production schema. Also execute a query
-- that should not be audited

SELECT * FROM Production.Product;
GO
SELECT * FROM Sales.Currency;
GO

Once the SELECT statements are executed, have a look at the Event Viewer. You should see the event related to the first statement logged.
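
When testing is complete, the audit can be disabled and dropped; this sketch assumes the names created above.

USE AdventureWorks2014;
GO
ALTER DATABASE AUDIT SPECIFICATION ProductionSpecification WITH (STATE = OFF);
DROP DATABASE AUDIT SPECIFICATION ProductionSpecification;
GO

USE master;
GO
ALTER SERVER AUDIT AdventureWorksLog WITH (STATE = OFF);
DROP SERVER AUDIT AdventureWorksLog;
GO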


For more info on SQL Server Audit, refer here: https://technet.microsoft.com/en-us/library/cc280505(v=sql.105).aspx

Thursday, March 12, 2015

Is the number of records in a non-clustered index leaf level the same as the number of records in the table?

This is one of the questions I got last week. To make it clearer: when we have 1,000,000 records in a table with a clustered structure, are we supposed to see 1,000,000 records in the leaf level of every non-clustered index, even when the index key holds duplicates?

The answer is yes. Here is simple code showing it. This table has 7,301,921 records, and the leaf level of the clustered index holds 7,301,921 records too.

SELECT COUNT(*) FROM dbo.InternetSales;

SELECT index_id, partition_number, index_type_desc, index_level, record_count
FROM sys.dm_db_index_physical_stats (DB_ID('SalesDB')
 ,OBJECT_ID('dbo.InternetSales'), NULL, NULL, 'DETAILED');


Now, let's create an index on OrderDateKey, which contains duplicates.


Now if I execute the following code, it shows that the number of records in both indexes is the same.

CREATE INDEX IX_Sales_OrderDateKey ON dbo.InternetSales (OrderDateKey);
GO

SELECT index_id, partition_number, index_type_desc, index_level, record_count
FROM sys.dm_db_index_physical_stats (DB_ID('SalesDB')
 ,OBJECT_ID('dbo.InternetSales'), NULL, NULL, 'DETAILED');


Sunday, March 8, 2015

SQL Server Brain Basher of the Week #002

Here is the Brain Basher of this week. As you know, if you have multiple objects to be deleted, for example 3 tables, we write three statements as below.

DROP TABLE dbo.T1;
DROP TABLE dbo.T2;
DROP TABLE dbo.T3;

Can we write one statement or one command to do this?

Yes, it is possible, as long as they are the same type of object. All we have to do is write the command with the object names separated by commas. Here is an example.

DROP TABLE dbo.T1, dbo.T2, dbo.T3;

Have a good week ahead!


Saturday, March 7, 2015

How to change the data type of a column of a large table

Here is another common question that appears in forums: how to change the data type of a column, specifically in a large table. If you search for it, the most common answer is handling this with a new column (which is the 3rd method in my code below). However, from situation to situation things can be different, hence I tested a few ways of changing the type with a table that contains 7 million records.

Here are the 3 ways I tried. The code uses 3 tables called dbo.Sales_1, dbo.Sales_2, and dbo.Sales_3. The structure and the number of records are the same for all 3 tables. Here is the test I did.

-- Method 1
-- ========

-- Check the fragmentation of the table
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_1'), null, null, 'detailed');
-- Result
-- External Fragmentation - 0.3% (Good)
-- Internal Fragmentation - 98% (Good)

-- Changing the column type of Product key from int to bigint
ALTER TABLE dbo.Sales_1
 ALTER COLUMN ProductKey bigint not null;
-- Result
-- Time spent: 4:47

-- Checking fragmentation again
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_1'), null, null, 'detailed');
-- Result
-- External Fragmentation - 99% (Bad)
-- Internal Fragmentation - 51% (Bad)

-- Rebuilding the index
ALTER INDEX PK__Sales_1 ON dbo.Sales_1 REBUILD;
-- Result
-- Time spent: 3:43

-- **
-- Total time spent: 8:30
-- **



-- Method 2
-- ========

-- Check the fragmentation of the table
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_2'), null, null, 'detailed');
-- Result
-- External Fragmentation - 0.3% (Good)
-- Internal Fragmentation - 98% (Good)

-- Rebuilding the index with fill factor 80.
ALTER INDEX PK__Sales_2 ON dbo.Sales_2 REBUILD WITH (FILLFACTOR = 80);
GO
-- Result
-- Time spent: 1:38

-- Changing the column type of Product key from int to bigint
ALTER TABLE dbo.Sales_2
 ALTER COLUMN ProductKey bigint not null
-- Result
-- Time spent: 1:13

-- Check for fragmentation
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_2'), null, null, 'detailed');
-- Result
-- External Fragmentation - 0.1% (Good)
-- Internal Fragmentation - 85% (Good)

-- **
-- Total time spent: 2:51
-- **



-- Method 3
-- ========

-- Check the fragmentation of the table
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_3'), null, null, 'detailed');
-- Result
-- External Fragmentation - 0.3% (Good)
-- Internal Fragmentation - 98% (Good)

-- Add a new column with required data type: bigint
-- Setting it as a nullable column
 ALTER TABLE dbo.Sales_3
 ADD NewProductKey bigint null

-- Changing the recovery model
ALTER DATABASE TestDatabase SET RECOVERY Bulk_Logged;

-- Updating the new column with old column value
DECLARE @Count int = 1

WHILE (@Count != 0)
BEGIN

 UPDATE TOP (500000) dbo.Sales_3 WITH (TABLOCK)
  SET NewProductKey = ProductKey
 WHERE NewProductKey IS NULL

 SET @Count = @@ROWCOUNT
END
-- Result
-- Time spent: 1:23

-- Drop the old column
ALTER TABLE dbo.Sales_3
 DROP COLUMN ProductKey

-- Rename the new column as old column
sp_rename 'dbo.Sales_3.NewProductKey', 'ProductKey', 'COLUMN';

-- Alter the current column as non-nullable
ALTER TABLE dbo.Sales_3
 ALTER COLUMN ProductKey bigint NOT NULL
-- Result
-- Time spent: 0.03

-- Check the fragmentation of the table
SELECT * FROM sys.dm_db_index_physical_stats(db_id(), object_id('dbo.Sales_3'), null, null, 'detailed');
-- Result
-- External Fragmentation - 99% (Bad)
-- Internal Fragmentation - 51% (Bad)


ALTER INDEX PK__Sales_3 ON dbo.Sales_3 REBUILD;
-- Result
-- Time spent: 2:55

-- Change back the recovery model
ALTER DATABASE TestDatabase SET RECOVERY Full;

-- **
-- Total time spent: 4:21
-- **

Here is the summary of it;
Method                                  Time Spent   Comments
1 - Just alter the column               8:30         Takes a long time
2 - Alter the column with fill-factor   2:51         Fastest, probably the best
3 - With a new column                   4:21         Good, but needs more extra work

My preference is the second method, altering the column after rebuilding the index with a fill factor, as it gives the best performance and I do not see any issue with it.

Thursday, March 5, 2015

How to load/insert large amount of data efficiently

Importing data from other sources is a very common operation with most database solutions, as not all data can be entered row by row. When a large amount of data needs to be imported, we always consider the constraints added to the table and minimal logging, expecting improved performance.

SQL Server supports a set of tools for importing data: the Import and Export Wizard, bcp (Bulk Copy Program), BULK INSERT, and OPENROWSET (BULK). Preference always goes to either bcp or BULK INSERT; however, to improve their performance, some options have to be considered and set. This post explains how to load a large amount of data using bcp with various options.

For testing, I created a database called TestDatabase and a table called dbo.Sales. I prepared a text file called Sales.txt that has 7,301,921 records matching the dbo.Sales table structure. The Recovery Model of the database was initially set to Full.

Then I loaded the table with following bcp commands. Note that each and every command was run after truncating and shrinking the log file.

--Loading without any specific option
bcp TestDatabase.dbo.Sales in C:\Temp\Sales.txt -S DINESH-PC\SQL2014,8527 -T -c

--Loading with the TABLOCK option to force locking the entire table
bcp TestDatabase.dbo.Sales in C:\Temp\Sales.txt -S DINESH-PC\SQL2014,8527 -T -c -h "TABLOCK"

--Loading with our own batch size
bcp TestDatabase.dbo.Sales in C:\Temp\Sales.txt -S DINESH-PC\SQL2014,8527 -T -c -b 3000000

--Loading with our own batch size and TABLOCK
bcp TestDatabase.dbo.Sales in C:\Temp\Sales.txt -S DINESH-PC\SQL2014,8527 -T -c -b 3000000 -h "TABLOCK"

Once the table had been loaded four times, it was loaded again under the Bulk Logged recovery model. All commands were executed just like before, making sure that the log was truncated and shrunk before each execution.

Finally here is the result.

Command                       Recovery Model   Time (ms)   Log file size after loading
With no specific option       Full             189,968     3,164,032 KB
With TABLOCK                  Full             275,500     1,475,904 KB
With batch size               Full             108,500     2,377,088 KB
With batch size and TABLOCK   Full             99,609      1,964,480 KB
With no specific option       Bulk Logged      140,422     3,164,032 KB
With TABLOCK                  Bulk Logged      239,938     26,816 KB
With batch size               Bulk Logged      121,828     2,377,088 KB
With batch size and TABLOCK   Bulk Logged      86,422      1,475,904 KB

As you see, you get the best speed by setting the batch size together with TABLOCK under the bulk-logged recovery model. If log file growth is your main concern, then the TABLOCK option alone with the bulk-logged recovery model is the best.

I believe that BULK INSERT offers similar results; however, it is yet to be tested in the same way.
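
For reference, here is a hedged BULK INSERT equivalent of the last bcp command; the terminators below assume the defaults used by bcp -c (tab between fields, newline between rows).

BULK INSERT TestDatabase.dbo.Sales
FROM 'C:\Temp\Sales.txt'
WITH (BATCHSIZE = 3000000, TABLOCK, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');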

Wednesday, March 4, 2015

SSRS Multi Value Parameters with MDX and checking whether all values are selected

It is a very common question in many forums: how can we check whether all values of a parameter are selected, and how do we add the selected values to a dynamic query that will be used for loading the main dataset? I once wrote something similar for SQL; here is the link if you need it: http://dinesql.blogspot.com/2011/09/reporting-services-parameters-adding.html

What if you need to generate a dynamic MDX statement for your main dataset and need to get the values of a parameter, formed using a dimension, into your query? I have seen many different implementations of this: some use another hidden parameter holding the same dataset and compare the selection using the Iif function to get the right set; some use the CountRows function to compare the number of records in the dataset with the number of selected values in the parameter. Most of these methods work fine, but there is another, easier way of doing it.

If you have built the MDX for the main dataset using the designer, then you know how parameters are handled by it. The designer uses the StrToSet MDX function for getting the values of multi-value parameters. This is the easiest way of handling it, and it can be used with a dynamic MDX statement too.

Here is an example:

Assume that the parameters are loaded using your own query. In that case, you need code like below for loading the dimension members.

// There are multiple ways of loading values to parameters
// Either one is okay

// 1.
// This query returns two columns; one for value and one for label
WITH 
 MEMBER [Measures].[ParameterLabel] AS [Dim PoP].[PoP].CurrentMember.Member_Caption 
 MEMBER [Measures].[ParameterValue] AS [Dim PoP].[PoP].CurrentMember.UniqueName
SELECT 
 {[Measures].[ParameterLabel], [Measures].[ParameterValue]} ON 0
 , [Dim PoP].[PoP].[Pop].Members ON 1
FROM [OLAP_Voice];
GO


// 2.
// This query returns only one column
// You need to add two additional calculated fields in the dataset
// 1. Name - ParameterLabel, Value - =Fields!Product.Value
// 2. Name - ParameterValue, Value - =Fields!Product.UniqueName

SELECT {} ON 0
 , [Dim PoP].[PoP].[Pop].Members 
 DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME on 1
FROM OLAP_Voice;

Now, if you need to check whether all values, multiple values, or only one value is selected, and then use the selected one(s) in your main query, StrToSet can easily be used without the Iif function. This requires two steps;

1. Add a parameter to the main dataset like below;

2. Include the parameter with StrToSet function where you need to have values from the parameter.

" WITH " & 
"  MEMBER [Measures].[Measure1] " & 
"   AS ..... " & 
"  MEMBER [Measures].[Measure2] " & 
"   AS ..... " & 

"  SET [MainSet] " &
"   AS NonEmptyCrossjoin({[Dim Geography].[Country].[Country].Members} " &
"     , StrToSet(@Product) " & // no need to check whether one or many have been selected
"    , { [Dim Department].[Department].[Department].Members} " &


" SELECT " & 
"  {[Measures].[Measure1], [Measures].[Measure2]} ON 0 " & 
"  , {MainSet} ON 1 " & 

" FROM   Cube;"


For more info on StrToSet, read this: https://msdn.microsoft.com/en-us/library/ms144782.aspx




Monday, March 2, 2015

Free Microsoft SQL Server preparation courses at Virtual Academy for exams 464 and 465

Microsoft Virtual Academy has added two new preparation courses for SQL Server exams. These two courses cover Developing Microsoft SQL Server Databases (464) and Designing Database Solutions for SQL Server (465). If you do not like to read articles or books, here is another way of learning the areas related to these two exams; all courses are built with videos :).

  1. Developing Microsoft SQL Server Databases (464) - 6 modules, approximately 7 hours.
  2. Designing Database Solutions for SQL Server (465) - 5 modules, approximately 6 hours.
In addition to these two, a few more preparation courses related to other SQL Server exams are available.

Sunday, March 1, 2015

SQL Server Brain Basher of the Week #001

I have been doing a series of sessions called SQL Server Brain Basher at the SQL Server Sri Lanka User Group (https://www.facebook.com/groups/sqlserveruniverse/), presenting tricky questions that we cannot answer immediately unless we have excelled in the related area. I thought to publish at least one I have used before (or am planning to use in the future) as a blog post every week; hence, here is the first one, which is a fairly easy one.

What would be result of following query?

SELECT 100, 200 '300';

If you are familiar with adding aliases to columns, you can answer immediately. This query returns two columns: the first with a value of 100 and no column name, and the second with a value of 200 and a column name of 300.



A column alias can be used to relabel a column if required. This is really useful when you need a different name for a column, rather than the one coming from the table, and for naming calculated columns. There are multiple ways of adding column aliases; here are some of them;

-- Column alias using AS keyword
SELECT 100 AS Column1, 200 AS 'Column2';

-- Column alias with an Equal sign
SELECT Column1 = 100, 'Column2' = 200;

-- Column alias following column name
SELECT 100 Column1, 200 'Column2';

Saturday, February 28, 2015

How to stop users seeing all databases by default

Here is the scenario. You have created a login called "Jane" and have not added it to any server role. You are 100% sure that she has not been given any permission on any database, and she has no default database. Here is the statement you used:

CREATE LOGIN Jane
WITH PASSWORD = 'Pa$$w0rd'
, CHECK_POLICY = OFF;

Now she logs into SQL Server as below;



And she sees the Object Explorer. Not only that, she can expand the Databases node and see the databases.



Of course, she cannot go into a database and see its content, but this is an issue in many cases; why should we let Jane see even the names of the databases?

This becomes possible by default because of the public server role. By default, the public server role has the VIEW ANY DATABASE permission, and all logins are automatically added to the public role.



If you need to stop this, change the permission as below (you can easily change this using the GUI too).

DENY VIEW ANY DATABASE TO Public;

Now Jane cannot see even the names of databases unless she has permission on them.
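
If the default behavior needs to be restored later, granting the permission back to public replaces the DENY.

GRANT VIEW ANY DATABASE TO Public;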


Friday, February 27, 2015

SQL Server Security Concepts: Securables, Principals and Permissions

When discussing and learning security related to SQL Server, it is always better to know the key concepts and terms. Some of these concepts are not only related to SQL Server; they apply to almost all applications in terms of security. Put simply, security is all about allowing someone or something to access a resource and perform actions on it. Here are some terms used when describing it:

Securables
A securable is a resource to which we can assign permissions. SQL Server has securables at multiple levels of a hierarchical architecture. The hierarchy starts at the server level, which includes securables like Endpoints, Logins, Server Roles, and Databases. These are called server-level securables, or server scope securables.

The next level of the hierarchy is the database. Database-level, or database scope, securables include items like Users, Database Roles, Certificates, and Schemas.

SQL Server includes securables at the schema level too. These are called schema scope securables, and they include resources like Tables, Views, and Procedures.

Principals
The someone or something that performs actions on securables is called a principal. There are three types of principals related to SQL Server security: Windows principals, SQL Server principals, and database principals.

Windows principals and SQL Server principals are considered server-level principals. Windows principals are generally domain or local server user accounts or groups that are used to connect to the SQL Server instance; authentication is done by either the local server or a domain controller, and SQL Server trusts the account without performing authentication itself. SQL Server principals are logins created at the SQL Server instance, and authentication is done by SQL Server itself.

Database principals include database users, database roles, and application roles.

** Some principals are also securables. As an example, a Login is a principal as well as a securable. It is a principal because it can access the SQL Server instance, and it is also a securable because there are actions that can be performed on it, such as enabling and disabling, that require permission.

Permissions
Permissions allow principals to perform actions on securables. There are two types of permissions related to SQL Server: statement permissions and object permissions.

Statement permissions refer to actions that can be performed by executing a statement. A principal creating a table using the CREATE TABLE statement, having been given the CREATE TABLE permission, is an example of this.

Object permissions refer to actions that can be performed on securables. A principal having the SELECT permission on a table is an example of this.
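
Here is a small sketch of both types, assuming a database user named Jane and the Sales.Currency table.

-- Statement permission: lets the principal execute CREATE TABLE
GRANT CREATE TABLE TO Jane;

-- Object permission: lets the principal read one specific securable
GRANT SELECT ON OBJECT::Sales.Currency TO Jane;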

This image shows how these concepts work together. Note that the image has been taken from an article in TechNet Magazine. You can refer to the article at: https://technet.microsoft.com/en-us/magazine/2009.05.sql.aspx


Tuesday, February 24, 2015

Gartner Magic Quadrant 2015 for Business Intelligence and Analytics Platforms is published

Gartner has released its Magic Quadrant for Business Intelligence and Analytics Platforms for 2015. Just like the 2014 one, it shows the business intelligence market share leaders, their progress, and their position in terms of business intelligence and analytics capabilities.

Here is the summary of it;


This is how it was in 2014;


As you see, the position of Microsoft has changed a bit, but it is still in the Leaders quadrant. As per the published document, the main reasons for Microsoft's position are its strong product vision, future road map, clear understanding of market desires, and easy-to-use data discovery capabilities. However, since Power BI is yet to be completed in terms of addressing complexities and limitations, and its stand-alone version, including some key analytics-related capabilities (such as Azure ML), is still in preview, the market acceptance rate is still low.

You can get the published document from various sites; here is one if you need it: http://www.birst.com/lp/sem/gartner-lp?utm_source=Birst&utm_medium=Website&utm_campaign=BI_Q115_Gartner%20MQ_WP_Website