Monday, August 31, 2015

SQL Server Brain Basher of the Week #027 - Business Intelligence

Three well-known functionalities provided by a business intelligence solution are drilling down, drilling through, and slicing and dicing. This week's question is based on them; here is the question;

What is slicing and dicing in a business intelligence solution?

When data in the data model is analyzed, filters are heavily used, and filtering can be done by slicing and dicing dimensions too. Slicing and dicing describe the way you navigate through dimensions for analyzing data. If you are;
  • Slicing
    It means that you are accessing a slice of the model (cube) using one particular member of one dimension. For example, if you are analyzing sales for the date 20150830 from the Date dimension, you are slicing the cube on 20150830 for viewing sales, with the All member of all other dimensions.
  • Dicing
    It means that you are accessing a single value by selecting one member from each dimension. For example, you are accessing sales for the date 20150830, the Red pen product, and the Black Smith customer from a model that has three dimensions: Date, Product and Customer.
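Although slicing and dicing are usually done against a cube with a client tool or MDX, the same idea can be sketched with plain T-SQL over a fact table. The table, column, and key values below are hypothetical, not from the post:

```sql
-- Slicing: filter on one member of one dimension (a single date),
-- leaving all other dimensions at their "All" level.
SELECT SUM(f.SalesAmount) AS Sales
FROM FactSales f
WHERE f.DateKey = 20150830;

-- Dicing: pick one member from every dimension,
-- narrowing the result down to a single cell of the cube.
SELECT SUM(f.SalesAmount) AS Sales
FROM FactSales f
WHERE f.DateKey = 20150830
 AND f.ProductKey = 101    -- Red pen (hypothetical key)
 AND f.CustomerKey = 202;  -- Black Smith (hypothetical key)
```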

Sunday, August 30, 2015

Changing authentication mode without using management studio

Microsoft SQL Server can be accessed using logins created with Windows accounts/groups or logins created with the SQL Server instance. Accessibility is determined by the authentication mode set for the instance. SQL Server supports two authentication modes: Windows authentication and Mixed authentication. Windows authentication mode allows creating logins using Windows accounts or groups and allows users to log in to the SQL Server instance using their Windows accounts. Mixed authentication mode allows the same and, in addition, allows creating SQL Server-specific logins by supplying names and passwords. Only one mode can be set at a time; it is initially set during the installation but can be changed after installation using Management Studio;

Can we change this setting only through Management Studio? No, it can simply be changed through the registry too. This is useful if you cannot access Management Studio for some reason, or if you cannot access the instance using your SQL login because the mode has been reset from Mixed to Windows but you have administrator rights to the server.

Here are the steps for accessing the registry node and changing the mode.
  1. Open the registry using regedit.exe.
  2. Go to HKEY_LOCAL_MACHINE -> SOFTWARE -> Microsoft -> Microsoft SQL Server -> the instance folder (for example, MSSQL12.MSSQLSERVER) -> MSSQLServer.
  3. You should see a value called LoginMode. You can set it to 1 for Windows authentication or can set it to 2 for Mixed authentication mode.
  4. Once changed, restart the SQL Server service.
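If you still have a sysadmin connection, the same registry value can also be changed from T-SQL with the xp_instance_regwrite extended procedure. Note that this procedure is undocumented, so treat this as a sketch and use it at your own risk; the service still has to be restarted afterwards:

```sql
-- LoginMode: 1 = Windows authentication, 2 = Mixed mode
EXEC xp_instance_regwrite
 N'HKEY_LOCAL_MACHINE',
 N'Software\Microsoft\MSSQLServer\MSSQLServer',
 N'LoginMode',
 REG_DWORD,
 2;
```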

Saturday, August 29, 2015

Why we need Analytical Data Models?

Among the key elements of a business intelligence solution, such as data sources, ETL, the data warehouse and data models, data models play a major role in the solution. One can argue that analysis can be done with data in the data warehouse without data models, but from the beginning, for most implementations, they have been considered a must. If you are also puzzled about whether you need data models for completing your business intelligence solution, here are some benefits you get from data models that can help you take the decision.
  • Data models help to create a database (or dataset) with names for entities (or dimensions) and measures that are known to business users, rather than exposing the data warehouse with its schema. This helps business users to perform analysis with their own terms without getting confused by terms used in the data warehouse schema.
  • Data models can be created specifically for an operation, process or department allowing business users to focus only on related data. This saves time and reduces the complexity without overloading users with unnecessary data.
  • Data models allow business users (or developers, if it is an IT-driven data model) to add their own business logic that increases the business value when analyzing data. A good example is the KPI (Key Performance Indicator). Users can create KPIs related to their business area for analyzing the business, applying business logic related to them.
  • A data warehouse is large and complex even if it has been configured as a data mart. However, since data models are specific and subject-oriented, they are comparatively small in size. Yes, it is true that most advanced data warehousing platforms offer high performance for all operations, but small data models will definitely offer better performance for analysis.
  • We used to create IT-driven data models as either multi-dimensional or tabular. But as you know, modern business intelligence solutions offer self-service BI, allowing business users to create their own data models using Microsoft Excel or Power BI. This reduces the burden placed on the IT department and empowers business users.
Here is an illustration that explains how data models are used with modern business intelligence.

Thursday, August 27, 2015

Free Xamarin Subscriptions for Windows Phone Developers

Are you developing apps for Windows Phone and looking for a free subscription for Xamarin? Here is a chance; grab it before 31st of August 2015.

Free Xamarin Subscriptions for Windows Phone Developers!
Offer expires on August 31st, 2015

Click on the image to get the subscription

Wednesday, August 26, 2015

Product review: CloudBerry Backup - Cloud Backup for Microsoft SQL Server

One of the most important assets of an organization is data, because it holds valuable information related to the business, such as customer info, product info, financial info and info related to operations performed. In most businesses, downtime is unacceptable, hence they take all possible measures to make sure that data is available, protected and recoverable in the event of a disaster or failure. Organizations deploy different solutions such as high-availability solutions, disaster recovery solutions and backup strategies for addressing these requirements. High-availability and disaster recovery solutions are not common to all organizations, but backup strategies are the most common implementation for protecting data.

This post speaks about one of the best products, CloudBerry Backup, which can be used for backing up Microsoft SQL Server databases to cloud storage accounts with almost all backup features offered by SQL Server.

CloudBerry Backup for SQL Server allows you to perform your backup operations as per the strategy you have planned. The plan could be a simple full backup, or a combination of full, differential, and transaction log backups based on the RPO and RTO required by the business. You may encrypt backups, you may compress them, and most importantly you may expect point-in-time restoration in case of a data error or a failure. Not only that, you may want to store your backups in a cloud storage account such as Microsoft Azure, Amazon, or OneDrive, making sure backups are secured and protected. Yes, you can satisfy some of these requirements using the functionality provided with SQL Server, but not all. For example, there is no direct way of placing backups into a cloud storage account other than Azure, and backup to Azure is available only with SQL Server 2012 SP1 or later, hence it requires additional coding if it needs to be done with a prior version. The good news is, all these requirements can be satisfied with the CloudBerry Backup product regardless of the SQL Server version. Not only that, it does not require administration knowledge, allowing even a beginner to perform backup operations as per the strategy established.
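For comparison, here is what the native Azure option mentioned above looks like. On SQL Server 2012 SP1 CU2 and later, a database can be backed up straight to Azure blob storage with BACKUP TO URL; the storage account, container, and credential names below are placeholders:

```sql
-- create a credential holding the storage account name and access key
CREATE CREDENTIAL AzureBackupCredential
 WITH IDENTITY = 'mystorageaccount',       -- storage account name (placeholder)
 SECRET = '<storage account access key>';  -- access key (placeholder)

-- back up directly to a blob container
BACKUP DATABASE AdventureWorks2014
 TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2014.bak'
 WITH CREDENTIAL = 'AzureBackupCredential', COMPRESSION;
```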

Let's see how this product works and how easily we can backup our databases.

CloudBerry Backup can be downloaded from the CloudBerry Lab site.

Once the download is completed, install it and open it. You will get the product activation window; if you have the license key, enter it and get it activated, else you can continue with the trial version by clicking the Continue Trial button.

CloudBerry Backup opens with its Welcome window. You can create your backup plan by clicking either the MS SQL Server button in the Home ribbon or the Backup SQL Server button in the welcome screen.

Before creating the plan, you need to prepare two things: the backup strategy and the cloud storage account. Let's plan on the AdventureWorks database with Microsoft Azure as the storage. If you want, you can use your own database, else download AdventureWorks2014 and restore it as AdventureWorks2014_New for testing. Assume that the backup plan as per the RPO and RTO is as follows;

  • Full database backup - every Saturday - 4:00am
  • Differential backup - week days - 14:00
  • Transaction log backup - week days - every hour - day time
If you already have an account with Microsoft Azure, use it for creating a storage account, else create an account and then create a storage account. Here is my Microsoft Azure storage configured.

Everything is ready; let's start creating the plan for the AdventureWorks2014_New database. Click on the Backup SQL Server button in the Welcome screen. It opens the Welcome screen of the MS SQL Server Backup Plan Wizard.

Click on Next to continue. The next screen is for selecting the cloud storage account. A good thing with CloudBerry Backup is that it allows us to select our preferred cloud storage account from a lengthy list, rather than offering just one platform. Let's select Azure for this.

You need to configure the Azure account with this. Selecting it opens a window for entering account details. Note that it needs a container too. Once the Display Name, Account and Shared key are entered, select the option given in the Container dropdown and give a name for it.

Click OK twice and click on Next to continue. You will get screens for naming the plan, selecting the SQL Server instance with credentials and selecting the database.

As you see with the last screen, you can select either one database or multiple databases. Since the plan is for one database, select the one you need. Click on Next to continue. The next option is for encrypting and compressing the backup. CloudBerry backups support encryption and compression on almost all versions of SQL Server, starting from SQL Server 2000.

The next screen is for selecting or configuring purging options. You can either accept the default or configure them as you want.

Click on Next to continue. This screen is for selecting the schedule with options. It allows you to take a backup without creating a schedule or a plan, schedule it for a specific time, go for a general strategy with predefined templates, or go for a customized schedule. Not only that, it has two more important options: Stop the plan if it runs for n hours and Run missed scheduled backup immediately when computer starts up. This is something we do not get with a general backup in SQL Server. Since we need a customized plan, let's select Recurring (advanced schedule).

The next screen is for executing additional operations before and after performing the backup. It allows you to select files such as executable or batch files. If you have anything to be done, add them with this screen.

The next screen is for configuring notifications. Various options are given; email notifications can be set up either using the CloudBerry facility or using a configured SMTP server. In addition to that, entries can be written to the Windows Event Log too.

Done. Click on Next to see the Summary. Then click on Next and Finish, selecting Run backup plan now.

The plan is created. It shows you the details of the plan and the Next Run Time.

It also allows you to take a backup outside the schedule. If you click on Run backup now, it will take a full backup and place it in your Azure storage.

Since we have enabled notifications, it sends us an email;

You can see how files are created and placed in your storage account. If you use Microsoft Azure, the portal can be used for seeing files in the container, or a tool like CloudBerry Explorer for Azure Blob Storage can be used.

Restoring databases with CloudBerry Backup is an easy task. It allows you to either restore from a stored backup or create a plan for restoration. Point-in-time restoration is also possible, and it will pick the right files based on the date and time given. Here are a few important screens of restore operations;

In addition to the screens above, there are screens for decryption, server selection, emailing, etc.

As you see, it is really easy to manage your SQL Server backup strategies with CloudBerry Backup. If you are still wondering why you should go for this tool, consider the following;
  • Easy management and scheduling; a plan can be created simply.
  • Any cloud storage, not just one, storing backups directly to Azure without doing any configuration on the instance.
  • Encryption and compression on any version of SQL Server without creating master keys and encryption keys.
  • Purging options
  • Pre and post actions
  • Notification facility even without SMTP configuration.
  • Backup file management in the container configured.
Try the fully functional trial; go to the CloudBerry Lab site and download it today. You will soon see how useful and worthwhile it is.

Tuesday, August 25, 2015

Who modified my tables: tracking SQL Server table changes

What is the easiest way of capturing changes made to tables? I was asked this question, and what immediately came to my mind was DDL Triggers. Though this is not a new feature of SQL Server, it is still unknown to many, and various other ways are often implemented for capturing changes.

A DDL Trigger can be implemented for capturing almost all DDL changes. Here is the code for capturing changes made to tables;

USE AdventureWorks2014;

-- Creating a table for holding changes
CREATE TABLE TableChanges
 (ChangedDate datetime,
 EventType nvarchar(100), 
 TableName nvarchar(100), 
 LoginName nvarchar(100), 
 SQLCommand nvarchar(2000));

-- Creating the DDL trigger for tables
CREATE TRIGGER CaptureTableChanges
ON DATABASE
FOR CREATE_TABLE, ALTER_TABLE, DROP_TABLE
AS
BEGIN
 DECLARE @data xml;
 SET @data = EVENTDATA();

 INSERT INTO TableChanges
  (ChangedDate, EventType, TableName, LoginName, SQLCommand)
 SELECT
  GETDATE(),
  @data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'),
  @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(100)'),
  @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(100)'),
  @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(2000)');
END;

-- Creating a login and user for testing
CREATE LOGIN Jack WITH PASSWORD = 'Pa$$w0rd'; -- sample password
CREATE USER Jack FOR LOGIN Jack;
ALTER ROLE db_ddladmin ADD MEMBER Jack;

-- Use a different window for this
-- Connect using Jack's credentials and execute
ALTER TABLE Production.Product
 ALTER COLUMN Color nvarchar(20);

-- Check captured changes
SELECT * FROM TableChanges ;
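The trigger above is scoped to the database, so it can be inspected and removed as follows; a sketch assuming the trigger and table names used in the post:

```sql
-- list database-scoped DDL triggers
SELECT name FROM sys.triggers WHERE parent_class_desc = 'DATABASE';

-- clean up the trigger and the audit table when no longer needed
DROP TRIGGER CaptureTableChanges ON DATABASE;
DROP TABLE TableChanges;
```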

Sunday, August 23, 2015

SQL Server Brain Basher of the Week #026 - Variable types

Similar to other programming languages, SQL also offers a facility to create variables for holding values temporarily. Variables are created in memory and can be accessed only within the batch in which they have been declared. Here is a simple question on variables;

What are the three types of variables that can be created with SQL Server?
  1. Local variable
  2. Global variable
  3. Temporary variable
  4. Cursor variable
  5. Table variable
Local variables are known to everyone, and they are the type used most widely. By default, NULL is assigned to all local variables if no default value is given. Values for local variables can be set at the time of declaration, using a SELECT statement, or using the SET statement.

DECLARE @i int;

-- this returns NULL
SELECT @i;

DECLARE @x int = 100;

DECLARE @y int, @z int;

SET @y = 300;
SELECT @z = 400;

SELECT @x, @y, @z;

Cursor variables are still available with SQL Server mainly for backward compatibility. Since we can just name a cursor and do all our operations with its name, there is no real necessity to declare a cursor with a variable. Anyway, here is an example;

DECLARE @cur CURSOR;

SET @cur = CURSOR FOR
 SELECT ProductID FROM Production.Product
 WHERE Color = 'Black';

OPEN @cur;

FETCH NEXT FROM @cur;

CLOSE @cur;
DEALLOCATE @cur;

Table variables are useful when we need to maintain a small dataset for temporary operations. They work just like local variables and hold data in the form of a table. Here is an example.

DECLARE @Product TABLE
 (ProductID int PRIMARY KEY
 , Name varchar(100) NOT NULL
 , Color varchar(100) INDEX ix_Product NONCLUSTERED);

INSERT INTO @Product
SELECT ProductID, Name, Color FROM Production.Product
WHERE Color IS NOT NULL AND Color != '';

SELECT * FROM @Product;

As you see, three types of variables that can be declared with SQL Server are Local variables, Cursor variables, and Table variables. There are no types called Global and Temporary variables.
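One point above worth demonstrating is batch scope: a local variable dies at the end of the batch in which it was declared, so referencing it after a GO fails:

```sql
DECLARE @v int = 1;
SELECT @v;  -- works: same batch
GO

SELECT @v;  -- fails: Msg 137, Must declare the scalar variable "@v"
```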

Friday, August 21, 2015

Deleting records based on another table: Should I use standard delete or non-standard delete statement?

When you have to delete records from a table based on a filter set with another table, what type of statement do you generally use? A DELETE based on a join, or a DELETE based on a subquery?

The standard way of deleting records for this requirement is using a DELETE based on a subquery. However, if you prefer or are more familiar with JOINs, they can be used without any issue, though it is considered the non-standard way. SQL Server executes both types of queries in the same way, hence we cannot expect any performance difference between the two.

Here is an example. The following code deletes records from the SalesOrderDetail table based on the Color attribute of the Product table. As you see, the cost is the same and the plan is the same too.

USE AdventureWorks2014;

-- non-standard way
DELETE d
FROM Sales.SalesOrderDetail d
 JOIN Production.Product p
  ON d.ProductID = p.ProductID
WHERE p.Color IS NULL;

-- standard way
DELETE FROM Sales.SalesOrderDetail
WHERE EXISTS
  (SELECT *
   FROM Production.Product p
   WHERE Sales.SalesOrderDetail.ProductID = p.ProductID
    AND p.Color IS NULL);

Thursday, August 20, 2015

Stored procedure parameters: Setting default values with functions

Stored procedures are common objects in SQL Server databases as they give us many benefits. One important thing with stored procedures is that they allow us to pass parameters for executing the procedure as we want. Parameters can be set as either input or output parameters, and input parameters can be given default values, allowing users to execute the procedure either with a value they need or without passing a value, letting the procedure use the default value. However, can we set a function as the default value? That is the question I was asked.

SQL Server allows us to pass up to 2,100 parameters per stored procedure. Defaults can be set only with constant expressions, hence functions CANNOT be used when setting default values for parameters. But we can work around it, though it is not the exact solution;

USE AdventureWorks2014;

IF OBJECT_ID('dbo.GetOrders', 'P') IS NOT NULL
 DROP PROCEDURE dbo.GetOrders;

-- creating the procedure
-- note that the default value is NULL
-- and the procedure sets the real default if no value is passed
CREATE PROCEDURE dbo.GetOrders @Year int = NULL
AS
BEGIN
 SET @Year = ISNULL(@Year, YEAR(GETDATE()));

 PRINT @Year;
 SELECT * FROM Sales.SalesOrderHeader
 WHERE YEAR(OrderDate) = @Year;
END;

-- EXEC dbo.GetOrders
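The procedure can then be called either way; with no argument, the ISNULL inside the body substitutes the current year:

```sql
-- uses YEAR(GETDATE()) as the effective default
EXEC dbo.GetOrders;

-- uses the supplied value
EXEC dbo.GetOrders @Year = 2013;
```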

Wednesday, August 19, 2015

Encrypting backups and restoring encrypted backups

Your applications cannot survive without database backups!

Database backup is a fundamental requirement, and it is the key task among maintenance tasks. It protects both the application and the data against almost all failures, and it is our duty to protect the backups themselves. They should be well secured because they hold all information related to your organization, hence you should take all possible actions for protecting them from unauthorized access. One method we use for protecting our backups is an off-site storage, making sure that we have them in the event of a disaster. In addition to that, backups can be made more secure by applying encryption. This post speaks about encrypting SQL Server database backups and restoring them.

SQL Server backup encryption uses algorithms such as AES 128, AES 192, AES 256, and Triple DES. When encrypting, it needs either a certificate or an asymmetric key for encrypting data. Here are the steps for preparing the instance and taking a backup.

  1. Create the database master key in the master database if it has not been created before. This key is a symmetric key and it will be used for protecting all other keys and certificates.

    -- creating the master key if it does not exist
    USE master;
    GO
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'MasterKeyPa$$w0rd'; -- sample password
  2. Back up the master key created and keep it in a secure location. This is required if the database is going to be restored to a different instance.

    -- Back up the database master key
    BACKUP MASTER KEY TO FILE = 'D:\SecuredLocation\master.key'
     ENCRYPTION BY PASSWORD = 'Ma$terPa$$w0rd';
  3. Create a certificate or asymmetric key for encrypting backups.

    -- Create a certificate
    CREATE CERTIFICATE AdventureWorksBackupCertificate
     WITH SUBJECT = 'Backup Encryption Certificate';
  4. Back up the certificate or asymmetric key and keep it in a secure location. This is required if the database is going to be restored to a different instance.

    BACKUP CERTIFICATE AdventureWorksBackupCertificate 
    TO FILE = 'D:\SecuredLocation\AdventureWorksBackupCertificate.cer'
    WITH PRIVATE KEY ( FILE = 'D:\SecuredLocation\AdventureWorksBackupCertificate.key' ,
         ENCRYPTION BY PASSWORD = 'CertificatePa$$w0rd');
  5. Perform the backup operation using the certificate or asymmetric key created, mentioning the algorithm for encryption.

    -- Take the initial backup
    BACKUP DATABASE AdventureWorks2014
     TO DISK = 'D:\BackupLocation\AdventureWorks2014.bak'
     WITH ENCRYPTION
      (ALGORITHM = AES_256,
       SERVER CERTIFICATE = AdventureWorksBackupCertificate);

Here are the steps for restoring the backup to the same instance.

  1. Restore the backup just like restoring a non-encrypted backup. No special steps need to be performed as long as the database master key and the certificate exist.

    -- Same Server - restore as a normal backup
    RESTORE DATABASE AdventureWorks2014
     FROM DISK = 'D:\BackupLocation\AdventureWorks2014.bak'

Here are the steps for restoring the backup to a different instance.
  1. Create a master key if it does not exist. This does not need to be the same key. You can use the backup of the master key taken earlier if this is a recovery from a complete failure.

    -- create the master key if it does not exist
    USE master;
    GO
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'NewServerMasterKeyPa$$w0rd'; -- sample password
  2. Create the certificate or asymmetric key from the backup taken from the original instance. The name should be the same.

    -- create the certificate in the new instance
    -- from backup files
    CREATE CERTIFICATE AdventureWorksBackupCertificate
    FROM FILE = 'D:\NewServerSecuredLocation\AdventureWorksBackupCertificate.cer'
    WITH PRIVATE KEY
     (FILE = 'D:\NewServerSecuredLocation\AdventureWorksBackupCertificate.key',
      DECRYPTION BY PASSWORD = 'CertificatePa$$w0rd');
  3. Perform the restoration as a normal restoration.

    RESTORE DATABASE AdventureWorks2014
     FROM DISK = 'D:\NewServerBackupLocation\AdventureWorks2014.bak';
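If you want to confirm that a backup was actually encrypted, the backup history in msdb records the algorithm and encryptor; the key_algorithm and encryptor_type columns are available from SQL Server 2014:

```sql
-- check encryption details of recent backups
SELECT database_name, backup_finish_date,
       key_algorithm, encryptor_type
FROM msdb.dbo.backupset
ORDER BY backup_finish_date DESC;
```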

Monday, August 17, 2015

SQL Server Brain Basher of the Week #025 - DMVs and DMFs

The best and easiest way of getting real-time state information and recent historical information related to SQL Server is using Dynamic Management Views and Dynamic Management Functions. This week's question is based on them; here is the question;

If I want to see DMVs and DMFs using Object Explorer, where should I look?
  • Master database
  • Resource database
  • Any user-defined database
  • MSDB database
Generally, all DMVs and DMFs are defined in the resource database and then mapped to other databases. If you want to see DMVs using Object Explorer, they can be seen in the System Views node of any database.

If you want to see DMFs using Object Explorer, they can be seen only in the Table-valued Functions node of the master database.
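Regardless of where they appear in Object Explorer, both can be queried from any database; a DMV is read like a view while a DMF takes parameters like a table-valued function:

```sql
-- DMV: current user sessions (queried like a view)
SELECT session_id, login_name, status
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;

-- DMF: index physical stats (called like a table-valued function)
SELECT index_id, avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats
 (DB_ID(), OBJECT_ID('Production.Product'), NULL, NULL, 'LIMITED');
```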

Sunday, August 16, 2015

Differences between BCP and BULK INSERT

It is not uncommon to see data being imported from other sources and exported to other sources, as not all data can be entered row by row. Generally, either import or export can be considered an ETL operation; some involve transformation (T) and some do not. If the requirement is just loading bulk data into SQL Server without transformation, the easiest and fastest way of loading data is either the bcp utility or BULK INSERT. Which is the best? Here is a comparison.

  • Type of utility - bcp: command-line utility; BULK INSERT: T-SQL statement.
  • Usage - bcp: both import and export; BULK INSERT: import only.
  • Required T-SQL knowledge - bcp: no, unless queryout is used for writing a query instead of mentioning the table name; BULK INSERT: required.
  • Constraint enforcement - bcp: does not check CHECK and FOREIGN KEY constraints unless CHECK_CONSTRAINTS is used with the -h hint; UNIQUE, PRIMARY KEY, and NOT NULL constraints are always enforced. BULK INSERT: does not check CHECK and FOREIGN KEY constraints unless CHECK_CONSTRAINTS is used; UNIQUE and PRIMARY KEY constraints are always enforced, and a blank is inserted into NOT NULL character-type columns if no value is passed for the column.
  • Explicit transaction - bcp: not possible; BULK INSERT: possible.
  • Format file - bcp: can generate and use one; BULK INSERT: cannot generate one, but can use one.
  • Source location - bcp: can refer to a location in the client environment; BULK INSERT: can refer only to a location accessible from the server.
  • Performance - bcp: fast; BULK INSERT: fastest.
  • Protocol - bcp: TCP/IP, Shared Memory, Named Pipes; BULK INSERT: in-memory.
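As a quick illustration of the two, here is the same load done both ways; the server name, table, and file path are placeholders:

```sql
-- bcp (run at a command prompt, shown here as a comment):
-- bcp AdventureWorks2014.dbo.Customer in "D:\Data\Customer.dat" -c -T -S MyServer

-- BULK INSERT (the file path must be visible to the SQL Server service account):
BULK INSERT dbo.Customer
FROM 'D:\Data\Customer.dat'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', CHECK_CONSTRAINTS);
```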

Saturday, August 15, 2015

Finding user tasks that are waiting longer than expected time

Performance tuning is one of the common operations in the database world and is always done for optimizing queries. When performing performance tuning, the first thing we generally do is check tasks that have been waiting for a long time. A question was asked on this, and as usual I thought to make a post on it: how can I see whose tasks are waiting longer than the expected time?

There are a couple of ways of seeing it. Activity Monitor can be easily used for this, but if it needs to be done using T-SQL, Dynamic Management Views and Dynamic Management Functions are the best. Here is a simple script for finding tasks that are waiting.

-- Execute this in a new window
-- Note that it updates a record
-- within an explicit transaction
USE AdventureWorks2014;
GO

BEGIN TRAN

UPDATE Production.Product
 SET Color = 'Black'
WHERE ProductID = 1;
-- leave the transaction open; ROLLBACK TRAN once the test is done


-- Execute this in a new window
-- This tries to retrieve the same record and gets blocked
USE AdventureWorks2014;

SELECT * FROM Production.Product
WHERE ProductID = 1;

-- Execute this in a new window
-- This looks for queries waiting
-- longer than 20 seconds
SELECT s.session_id
 , t.wait_type
 , t.wait_duration_ms
 , q.text
FROM sys.dm_os_waiting_tasks AS t
 INNER JOIN sys.dm_exec_sessions AS s
  ON t.session_id = s.session_id
 INNER JOIN sys.dm_exec_requests r
  ON s.session_id = r.session_id
 CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) q
WHERE s.is_user_process = 1
AND t.wait_duration_ms > 20000; -- waiting more than 20 seconds

Friday, August 14, 2015

Can I disable all constraints when loading data?

Generally, when loading a large data set, we disable constraints and drop indexes for speeding up the loading process. Constraints help us to enforce integrity rules, making sure that no inconsistent data exists, though they slow down update operations on the table. There are different types of constraints for addressing different types of integrity, such as;
  • Entity integrity - referring row level integrity
  • Domain integrity - referring column level integrity
  • Referential integrity - referring integrity with relationships
In addition to that, constraints can be used for addressing user-defined integrity too.

SQL Server supports following constraints types;
  • PRIMARY KEY constraint
  • UNIQUE KEY constraint
  • FOREIGN KEY constraint
  • CHECK constraint
  • NOT NULL constraint
  • DEFAULT constraint
However, note that we cannot disable all constraints. Only CHECK and FOREIGN KEY constraints can be disabled. For disabling other constraints, the constraints have to be either modified or dropped. For example, if you need to disable a PRIMARY KEY or UNIQUE KEY constraint, the index attached to it has to be dropped.
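Disabling a FOREIGN KEY constraint uses the same NOCHECK syntax as a CHECK constraint; the table and constraint names in this sketch are hypothetical:

```sql
-- disable the foreign key before the bulk load
ALTER TABLE dbo.OrderDetail NOCHECK CONSTRAINT FK_OrderDetail_Product;

-- ... load the data ...

-- re-enable it and re-validate existing rows so the constraint is trusted
ALTER TABLE dbo.OrderDetail WITH CHECK CHECK CONSTRAINT FK_OrderDetail_Product;
```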

Here is the code for disabling CHECK constraint.

-- Create a table with a check constraint
USE tempdb;

CREATE TABLE dbo.Customer
 (CustomerId int not null
 , CustomerName varchar(10) not null
 , CreditAmount decimal(18,2) not null
 , CONSTRAINT Customer_SalaryCheck CHECK (CreditAmount < 100000));

-- Inserting valid data
INSERT INTO dbo.Customer
 VALUES (1, 'Joe Brown', 65000);
INSERT INTO dbo.Customer
 VALUES (2, 'Mary Smith', 75000);

-- Inserting invalid data
-- This will not be possible as it violates the constraint
INSERT INTO dbo.Customer
 VALUES (3, 'Joe Brown', 110000);

-- Disabling the constraint and inserting the same again
-- Now it allows inserting a row with an invalid value
ALTER TABLE dbo.Customer NOCHECK CONSTRAINT Customer_SalaryCheck;

INSERT INTO dbo.Customer
 VALUES (3, 'Joe Brown', 110000);

-- Enabling the constraint.
-- Note that records inserted while it was disabled are not validated
ALTER TABLE dbo.Customer CHECK CONSTRAINT Customer_SalaryCheck;
-- Now the table contains invalid data.

Note that it does not validate the records inserted while it was disabled when enabling it again. If you want to check all records for integrity when enabling, use the following code.

-- If you want to make sure no invalid data exists
-- and validate all records against the constraint,
-- disable it again and enable it with the CHECK option
ALTER TABLE dbo.Customer NOCHECK CONSTRAINT Customer_SalaryCheck;

-- Enabling with CHECK now would throw an error because the current rowset has an invalid value.
-- This can be corrected by either updating or deleting the row.
DELETE FROM dbo.Customer WHERE CustomerId = 3;

-- Now the constraint can be enabled with the CHECK option
ALTER TABLE dbo.Customer WITH CHECK CHECK CONSTRAINT Customer_SalaryCheck;
-- And if you query sys.check_constraints again, you will see 0 for the is_not_trusted column.

DROP TABLE dbo.Customer;

Thursday, August 13, 2015

Ways of seeing Query Execution Plans

When it comes to troubleshooting, most of the time the first thing we do is check the query execution plan related to the query, for understanding what has happened with it: whether added indexes have been used, whether JOINs were performed as we expected, any differences between estimated and actual numbers of rows accessed, and much more. Since the query execution plan explains many things, it is always better to understand what it is, how it can be seen, and how the information given with it can be used for troubleshooting. First of all, we need to know all possible ways of seeing it, hence this post speaks about them.

SQL Server allows us to see the plan mainly in three different ways;
  • Graphical execution plan
  • Text execution plan
  • XML execution plan
Most of us are familiar with the graphical execution plan as that is what we generally use. Whichever the format, there are two types of execution plans;
  • Estimated plan - The compiled plan for the given request. The result of the request (if it is a SELECT) or the modification (if it is an action query such as INSERT, UPDATE or DELETE) will not be produced.
  • Actual plan - The compiled plan along with runtime statistics. This can be seen only by executing the request and is shown with the result.
Here are the ways of seeing these plans;
  • Estimated plan (compile)
    • SET SHOWPLAN_TEXT (deprecated)
    • SET SHOWPLAN_ALL (deprecated)
    • SET SHOWPLAN_XML
    • Graphical showplan (SSMS)
    • sys.dm_exec_query_plan, sys.dm_exec_text_query_plan
  • Actual plan (runtime)
    • SET STATISTICS PROFILE (deprecated)
    • SET STATISTICS XML
    • Graphical showplan (SSMS)
Let's see each and every way of seeing the execution plan.

SET SHOWPLAN_TEXT
This helps us to see how the engine will execute the given query, without executing it. The output is a hierarchical tree that explains the steps taken and actions performed for executing the query. This is not much usable when compared to other ways, but it is good for getting an understanding of each step and its execution. However, since this has been marked as deprecated, it is not recommended to use it.

USE AdventureWorks2014;
GO

SET SHOWPLAN_TEXT ON;
GO

SELECT *
FROM Sales.SalesOrderHeader h
 INNER JOIN Sales.SalesOrderDetail d
  ON h.SalesOrderID = d.SalesOrderID
WHERE h.CreditCardApprovalCode = '108674Vi12864';
GO

SET SHOWPLAN_TEXT OFF;
GO

SET SHOWPLAN_ALL - Estimated
This is a deprecated SET statement too. Just like the previous one, it does not show actual statistics, but the plan is shown in a tabular manner rather than as text, hence it is more readable. As you see, it is easy to see the logical and physical operations that will be used, estimated rows, estimated CPU, etc.

USE AdventureWorks2014;
GO
SET SHOWPLAN_ALL ON;
GO
SELECT *
FROM Sales.SalesOrderHeader h
 INNER JOIN Sales.SalesOrderDetail d
  ON h.SalesOrderID = d.SalesOrderID
WHERE h.CreditCardApprovalCode = '108674Vi12864';
GO
SET SHOWPLAN_ALL OFF;
GO

SET SHOWPLAN_XML - Estimated
This SET statement allows us to see the estimated execution plan as XML. Unlike the previous two, it is not deprecated; the XML version is the recommended replacement. The output can be opened as an XML document or as a graphical plan.

USE AdventureWorks2014;
GO
SET SHOWPLAN_XML ON;
GO
SELECT *
FROM Sales.SalesOrderHeader h
 INNER JOIN Sales.SalesOrderDetail d
  ON h.SalesOrderID = d.SalesOrderID
WHERE h.CreditCardApprovalCode = '108674Vi12864';
GO
SET SHOWPLAN_XML OFF;
GO



Graphical Showplan (SSMS) - Estimated
This is the simplest way of seeing the estimated execution plan without executing the query. All we have to do is select the query and click the Display Estimated Execution Plan icon (Ctrl + L) in the SSMS toolbar. The plan shown is well readable and provides a lot of information on the execution. Each step is shown with a related icon; more information can be seen using the tooltip, and full details can be seen in the Properties window.

sys.dm_exec_query_plan - Estimated
This dynamic management function allows us to see estimated plans from the plan cache. It requires a plan handle as a parameter; the required plan handle can be taken from the sys.dm_exec_cached_plans dynamic management view. sys.dm_exec_query_plan returns the showplan as XML and, just like SET SHOWPLAN_XML, it can be opened as a graphical plan or as XML.

-- Cleaning the cache for testing (should not do this in PRD)
DBCC FREEPROCCACHE;

-- Execute a sample query
SELECT *
FROM Sales.SalesOrderHeader h
 INNER JOIN Sales.SalesOrderDetail d
  ON h.SalesOrderID = d.SalesOrderID
WHERE h.CreditCardApprovalCode = '108674Vi12864';

-- Checking cached plans and get the handle for the above query
SELECT cp.plan_handle, cp.cacheobjtype, s.text
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) s
WHERE s.text LIKE '%108674Vi12864%';

-- Get the query plan using the handle
SELECT dbid, query_plan
FROM sys.dm_exec_query_plan(0x0600070027CDCA1D506292630400000001000000000000000000000000000000000000000000000000000000);

Once you have the link as shown above, it can be opened as a graphical showplan or as XML (just like SET SHOWPLAN_XML).
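As a side note, the two lookup steps above can be combined so the plan handle does not have to be copied manually: sys.dm_exec_query_plan can be applied directly to each cached plan. Here is a minimal sketch, assuming the same sample query is still in the cache;

```sql
-- Join cached plans to their XML plans in one query (no manual handle copying)
SELECT cp.plan_handle, s.text, qp.query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) s
 CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) qp
WHERE s.text LIKE '%108674Vi12864%';
```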

sys.dm_exec_text_query_plan - Estimated
Just like the previous one, this can be used for seeing the estimated plans of cached queries. The difference is that this one does not provide a clickable link for opening the graphical showplan. However, the output can be saved as a .sqlplan file and opened separately as a graphical showplan.

-- Get the query plan using the handle
SELECT dbid, query_plan
FROM sys.dm_exec_text_query_plan(0x0600070027CDCA1D506292630400000001000000000000000000000000000000000000000000000000000000, 0, -1);

SET STATISTICS PROFILE - Actual
This deprecated SET statement can be used for seeing the plan with both estimated and actual values. Not only that, it executes the query and shows the result too. The plan output is similar to the tabular output generated with SET SHOWPLAN_ALL, with additional runtime columns.

USE AdventureWorks2014;
GO
SET STATISTICS PROFILE ON;
GO
SELECT *
FROM Sales.SalesOrderHeader h
 INNER JOIN Sales.SalesOrderDetail d
  ON h.SalesOrderID = d.SalesOrderID
WHERE h.CreditCardApprovalCode = '108674Vi12864';
GO
SET STATISTICS PROFILE OFF;
GO


SET STATISTICS XML - Actual
This is similar to the previous one; the only difference is that it shows the plan as a clickable XML, which can be opened as a graphical showplan as well as XML.

USE AdventureWorks2014;
GO
SET STATISTICS XML ON;
GO
SELECT *
FROM Sales.SalesOrderHeader h
 INNER JOIN Sales.SalesOrderDetail d
  ON h.SalesOrderID = d.SalesOrderID
WHERE h.CreditCardApprovalCode = '108674Vi12864';
GO
SET STATISTICS XML OFF;
GO


Graphical Showplan (SSMS) - Actual
This is the most used way of seeing the plan. It can be enabled with the Include Actual Execution Plan toggle icon (Ctrl + M) in SSMS, and it shows the plan with actual statistics while executing the query and producing the result, so we can simply use it for understanding the query and troubleshooting issues.

In addition to these, we can use SQL Server Profiler and Extended Events for seeing plans too.
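For example, an Extended Events session can capture actual plans with the query_post_execution_showplan event. Here is a sketch; the session and file names are arbitrary, and note that this event is expensive, so it should be used carefully on busy servers;

```sql
-- Capture actual execution plans via Extended Events (names are arbitrary)
CREATE EVENT SESSION CapturePlans ON SERVER
ADD EVENT sqlserver.query_post_execution_showplan
ADD TARGET package0.event_file (SET filename = N'CapturePlans.xel');
GO
ALTER EVENT SESSION CapturePlans ON SERVER STATE = START;
GO
```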

Tuesday, August 11, 2015

Granting only SELECT permission on all tables on all databases quickly

Different requirements pop up at different times, unexpectedly. What if someone needs to access all your databases and see all records? This someone could be an auditor, a security consultant, or a risk analyzer. How do you manage it? Create a login, add a user for it to every database, and then add each user to the db_datareader role? Yes, that might be the way we are used to, but remember, there is an easier, shorter way of doing this. It was introduced with SQL Server 2014.

This type of requirement can be managed by granting the CONNECT ANY DATABASE permission to the login and then granting SELECT ALL USER SECURABLES to it. The CONNECT ANY DATABASE permission allows the login to connect to current databases and even to databases that will be created later. It does not allow anything else; it is purely for connecting.

Here is the way of doing it;

This code creates a login called Jack.

USE master;
CREATE LOGIN Jack WITH PASSWORD = 'Pa$$w0rd'; -- sample password

If Jack connects to the SQL Server instance, he can see the databases but he cannot connect to them.

If we grant him CONNECT ANY DATABASE permission;

USE master;
GRANT CONNECT ANY DATABASE TO Jack;

Then he can connect.

Even though Jack can connect, he will not be able to see any of the user-defined tables. If the requirement is to allow him to read data;

USE master;
GRANT SELECT ALL USER SECURABLES TO Jack;

Now he can read records.
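To verify without logging in as Jack, the permission set can be tested through impersonation. A minimal sketch, assuming the AdventureWorks2014 sample database and one of its tables;

```sql
-- Impersonate Jack and try reading from a user database
EXECUTE AS LOGIN = 'Jack';
USE AdventureWorks2014;
SELECT TOP (5) * FROM Sales.SalesOrderHeader;
REVERT;
```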