Saturday, October 22, 2016

How to restrict access to DMVs and DMFs for users

We heavily use Dynamic Management Views (DMVs) and Functions (DMFs) to gain insight into the inner operations of SQL Server, and they are very useful. These DMVs and DMFs are collectively known as Dynamic Management Objects (DMOs).

Generally, if a user has the VIEW SERVER STATE and VIEW DATABASE STATE permissions, the user can execute almost all DMVs and DMFs. If you are a member of the sysadmin server role, you automatically get the VIEW SERVER STATE permission, and if your login is a standard login that has been added to a database as a member of the db_owner role, you have the VIEW DATABASE STATE permission on that database. These permissions can also be granted explicitly without adding logins or users to those roles.

However, if you need to restrict access to specific DMVs or DMFs for users on one or more databases, there is an easy way of doing it. All we have to do is deny access to the specific DMVs and DMFs in the master database for the user. This makes sure that the user cannot access them regardless of the database context.

Let's assume that user Jane should not be able to execute the sys.dm_sql_referenced_entities DMF in any database. Here are the steps for doing it:

1. Create a user in the master database for Jane's login.

2. Open the properties of user Jane and click on Securables to list the required objects.

3. Click on Search to find objects. Select All objects of the types... and click OK.

4. Select Table-valued function and click OK.

5. Select the sys.dm_sql_referenced_entities DMF and set Deny permission on SELECT.

6. Click OK to save the settings.
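The same GUI steps can be sketched in T-SQL; this assumes the login is also named Jane.

```sql
USE master;
GO

-- 1. Create a user in master for Jane's login
CREATE USER Jane FOR LOGIN Jane;
GO

-- 2-6. Deny SELECT on the DMF; the deny in master applies regardless of the database context
DENY SELECT ON sys.dm_sql_referenced_entities TO Jane;
GO
```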

Now if Jane tries to execute this DMF, she will get an error as below;

Friday, October 21, 2016

How to find all tables and columns used in the Stored Procedures?

As a part of modularizing business logic, we create stored procedures and functions in SQL Server databases, and in most cases, stored procedures are used. A stored procedure is a named collection of T-SQL statements created for implementing a piece of business logic.

When implementing a stored procedure, you will use one or more tables within it, and later you may need to find out all the tables, along with the columns, that have been used in your stored procedures. Yes, you can open a stored procedure and read the code to find them, but what if you need to do it programmatically, and not just for one but for several stored procedures? Or what if you need to check whether a particular table has been used in your stored procedures? How can we find this information; in other words, how can we find out dependencies?

There are a few ways of finding them using the system views and functions given. We can use the sys.sql_expression_dependencies view for finding objects used by a user-defined object, and it can be used with stored procedures. However, since it returns one row per object, you will not be able to get the column list of a table if you use it with a stored procedure. The dynamic management function sys.dm_sql_referenced_entities can be used as well. It gives more details on referenced objects. For example, if you query it for a stored procedure, it returns all tables along with the column names used in the stored procedure.

The following code shows how to get the tables and columns used in the dbo.uspGetBillOfMaterials stored procedure. As you will see, sys.sql_expression_dependencies does not provide column details, but the sys.dm_sql_referenced_entities dynamic management function provides all the details we need.

USE AdventureWorks2014;

SELECT * FROM sys.sql_expression_dependencies
WHERE referencing_id = OBJECT_ID(N'dbo.uspGetBillOfMaterials');  

SELECT * 
FROM sys.dm_sql_referenced_entities(N'dbo.uspGetBillOfMaterials', N'OBJECT');

If you need to find out how your object is being used by other objects, we have another dynamic management function named sys.dm_sql_referencing_entities, which returns details of all referencing objects. The following shows how to use it to see how the Production.Product table is used by other objects.

SELECT * 
FROM sys.dm_sql_referencing_entities(N'Production.Product', N'OBJECT');

Sunday, October 16, 2016

Memory-Optimized table memory limit - SQL Server Brain Basher of the Week #057

Let's talk about memory-optimized tables as part of this week's Brain Basher. This question was raised during one of my workshops, and it is an obvious question that comes to mind when planning memory-optimized tables. Here is the question:

How much memory can a memory-optimized table consume at its maximum?

If the version is SQL Server 2014, there is a limit of 256 GB on durable tables and no limit on non-durable tables. However, this has been addressed in SQL Server 2016, and now there are NO LIMITATIONS on either durable or non-durable tables.
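Regardless of the limit, it is worth knowing how much memory your memory-optimized tables actually consume; here is a quick sketch using the relevant DMV.

```sql
-- Memory consumed by each memory-optimized table in the current database
SELECT OBJECT_NAME(object_id) AS TableName
 , memory_allocated_for_table_kb
 , memory_used_by_table_kb
FROM sys.dm_db_xtp_table_memory_stats
WHERE object_id > 0; -- exclude system allocations
```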

You can read more info on this enhancement at:

Here are some posts made on memory-optimized tables;
In-Memory OLTP: Inserting 3 millions records in 3 seconds
How to check tables and stored procedures for in-memory migration

Saturday, October 15, 2016

Does SQL Server need a memory-optimized file group even for non-durable memory-optimized tables?

We know that SQL Server requires an additional file group marked as MEMORY_OPTIMIZED_FILEGROUP, with a data file (container) associated with it, if memory-optimized tables need to be created. We know for sure that memory-optimized tables created as SCHEMA_AND_DATA require data files to write data to disk because they coexist with disk-based tables, but do tables created as SCHEMA_ONLY require the same?

Memory-optimized tables created as non-durable (SCHEMA_ONLY) maintain both data and indexes in memory. Therefore, theoretically, nothing needs to be written to disk. However, if you try to create a non-durable memory-optimized table without adding a memory-optimized file group, you get the following message;

Cannot create memory optimized tables. To create memory optimized tables, the database must have a MEMORY_OPTIMIZED_FILEGROUP that is online and has at least one container.

This says that even for non-durable tables, the file group is needed. Although data is not maintained on disk, the table still has to be created and its schema maintained. Therefore, regardless of the type of memory-optimized table, a file group with the MEMORY_OPTIMIZED_FILEGROUP option, along with a data file, should be added to the database before creating memory-optimized tables.
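The required setup can be sketched as follows; the database name TestDb and the folder D:\Data are illustrative assumptions.

```sql
-- Add the memory-optimized file group and a container (a folder path, not a file) to it
ALTER DATABASE TestDb
 ADD FILEGROUP TestDb_MO CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE TestDb
 ADD FILE (NAME = 'TestDb_MO_Container', FILENAME = 'D:\Data\TestDb_MO_Container')
 TO FILEGROUP TestDb_MO;
GO

-- Only after this can even a SCHEMA_ONLY table be created
USE TestDb;
GO
CREATE TABLE dbo.SessionState
(
 SessionId int NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 100000),
 Payload varbinary(2000) NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
```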

Thursday, October 13, 2016

SQL Server 2016 Partial backup operation on a READONLY database backs up all file groups

While I was writing code for a backup operation on SQL Server 2016, I noticed there is an issue with partial backups. As per the documentation, when we take a partial backup using the READ_WRITE_FILEGROUPS option, it backs up;

  • only PRIMARY, all READ/WRITE file groups, and any explicitly mentioned READONLY file groups.
  • only PRIMARY if the database is READONLY.

However, if you perform this operation with SQL Server 2016 RTM, it backs up;
  • all file groups: PRIMARY, all READ/WRITE file groups, and even READONLY file groups - which is NOT the expected behavior.
You can apply the latest cumulative update (I applied CU2), then it backs up;
  • only PRIMARY and all READ/WRITE file groups - which is still NOT the expected behavior.
Here is the code I tested; you can run the same and see;

USE master;
GO

-- creating the database
-- (the CREATE DATABASE statement was partially lost; size clauses below are reconstructed)
CREATE DATABASE [TestDatabase]
 ON PRIMARY 
( NAME = N'TestDatabase_Data_1', FILENAME = N'D:\MSSQL13.MSSQLSERVER\MSSQL\DATA\TestDatabase_Data1.mdf' 
 , SIZE = 5120KB , MAXSIZE = UNLIMITED, FILEGROWTH = 10%), 
 FILEGROUP [FileGroup_Main] 
( NAME = N'TestDatabase_Dat_2', FILENAME = N'D:\MSSQL13.MSSQLSERVER\MSSQL\DATA\TestDatabase_Data2.ndf' 
 , SIZE = 5120KB , MAXSIZE = UNLIMITED, FILEGROWTH = 10%), 
 FILEGROUP [FileGroup_History] 
( NAME = N'TestDatabase_Dat_3', FILENAME = N'D:\MSSQL13.MSSQLSERVER\MSSQL\DATA\TestDatabase_Data3.ndf' 
 , SIZE = 5120KB , MAXSIZE = UNLIMITED, FILEGROWTH = 10%)
 LOG ON 
( NAME = N'TestDatabase_Log', FILENAME = N'D:\MSSQL13.MSSQLSERVER\MSSQL\DATA\TestDatabase_Log.ldf' 
 , SIZE = 5120KB , MAXSIZE = 2048GB , FILEGROWTH = 10%);
GO

-- creating tables and inserting records
USE [TestDatabase];
GO

CREATE TABLE [dbo].[Configuration]
(
 [ConfigurationName] [varchar](100) NOT NULL,
 [ConfigurationValue] [varchar](100) NOT NULL,
 [Settings] char(4000) NOT NULL
) ON [PRIMARY];
GO

CREATE TABLE [dbo].[Sales]
(
 [SalesDate] [date] NOT NULL,
 [Amount] [decimal](16, 2) NOT NULL,
 [Settings] char(4000) NOT NULL
) ON [FileGroup_Main];
GO

CREATE TABLE [dbo].[Sales_History]
(
 [SalesDate] [date] NOT NULL,
 [Amount] [decimal](16, 2) NOT NULL,
 [Settings] char(4000) NOT NULL
) ON [FileGroup_History];
GO

-- Insert set of records
INSERT INTO dbo.Configuration (ConfigurationName, ConfigurationValue, Settings) VALUES ('A', '1', 'a');
GO 5000
INSERT INTO dbo.Sales (SalesDate, Amount, Settings) VALUES ('01/01/2016', 10000, 'a');
GO 5000
INSERT INTO dbo.Sales_History (SalesDate, Amount, Settings) VALUES ('01/01/2010', 10000, 'a');
GO 5000

USE master;
GO

-- (the backup and ALTER DATABASE statements were lost from the original post;
-- they are reconstructed here from the comments, and the backup paths are illustrative)

-- Make one FG as read-only
ALTER DATABASE TestDatabase MODIFY FILEGROUP [FileGroup_History] READONLY;
GO

-- take a full database backup
-- this takes all three files
-- backup size is 63 MB
BACKUP DATABASE TestDatabase 
 TO DISK = N'D:\MSSQL13.MSSQLSERVER\MSSQL\Backup\TestDatabase_Full.bak';
GO

-- take a backup with READ_WRITE_FILEGROUPS
-- this takes only TestDatabase_Data_1 and TestDatabase_Dat_2
-- backup size is 43 MB
BACKUP DATABASE TestDatabase READ_WRITE_FILEGROUPS
 TO DISK = N'D:\MSSQL13.MSSQLSERVER\MSSQL\Backup\TestDatabase_Partial1.bak';
GO

-- Make the database as read-only
ALTER DATABASE TestDatabase SET READ_ONLY;
GO

-- take a backup with READ_WRITE_FILEGROUPS
-- as per MSDN, this should take only PRIMARY
-- But this takes all three files with RTM and excludes read-only files with CU2
-- backup size is 63 MB
BACKUP DATABASE TestDatabase READ_WRITE_FILEGROUPS
 TO DISK = N'D:\MSSQL13.MSSQLSERVER\MSSQL\Backup\TestDatabase_Partial2.bak';
GO

Result before applying CU2;

Processed 2840 pages for database 'TestDatabase', file 'TestDatabase_Data_1' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_2' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_3' on file 1.
Processed 2 pages for database 'TestDatabase', file 'TestDatabase_Log' on file 1.
BACKUP DATABASE successfully processed 7898 pages in 0.275 seconds (224.357 MB/sec).
Processed 2840 pages for database 'TestDatabase', file 'TestDatabase_Data_1' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_2' on file 1.
Processed 2 pages for database 'TestDatabase', file 'TestDatabase_Log' on file 1.
BACKUP DATABASE...FILE= successfully processed 5370 pages in 0.191 seconds (219.611 MB/sec).
Processed 2840 pages for database 'TestDatabase', file 'TestDatabase_Data_1' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_2' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_3' on file 1.
Processed 2 pages for database 'TestDatabase', file 'TestDatabase_Log' on file 1.

BACKUP DATABASE...FILE= successfully processed 7898 pages in 0.271 seconds (227.674 MB/sec).

Result after applying CU2.

Processed 2872 pages for database 'TestDatabase', file 'TestDatabase_Data_1' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_2' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_3' on file 1.
Processed 3 pages for database 'TestDatabase', file 'TestDatabase_Log' on file 1.
BACKUP DATABASE successfully processed 7931 pages in 0.289 seconds (214.379 MB/sec).
Processed 2872 pages for database 'TestDatabase', file 'TestDatabase_Data_1' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_2' on file 1.
Processed 2 pages for database 'TestDatabase', file 'TestDatabase_Log' on file 1.
BACKUP DATABASE...FILE= successfully processed 5402 pages in 0.202 seconds (208.890 MB/sec).
Processed 2872 pages for database 'TestDatabase', file 'TestDatabase_Data_1' on file 1.
Processed 2528 pages for database 'TestDatabase', file 'TestDatabase_Dat_2' on file 1.
Processed 2 pages for database 'TestDatabase', file 'TestDatabase_Log' on file 1.
BACKUP DATABASE...FILE= successfully processed 5402 pages in 0.205 seconds (205.852 MB/sec).

Be aware of this. It may be the expected behavior of SQL Server 2016, though I see it as an issue. I am working on it and will update the post if I hear something new.

Wednesday, October 12, 2016

Find SQL Server Databases that are not accessed after last reboot

I have written a post titled When was my SQL Server Database last accessed? that talks about how to find the last accessed date and time of databases. As per one of the comments, it is important to find out databases that were not accessed after the last reboot as well; hence I wrote this query for that.

It may be helpful to you.

SELECT name DatabaseName
FROM sys.databases
WHERE name NOT IN ('master', 'tempdb', 'model', 'msdb') -- add other system DBs if you have them
EXCEPT
SELECT DB_NAME(database_id) DatabaseName
FROM sys.dm_db_index_usage_stats;

Tuesday, October 11, 2016

How to find NT Service\MSSQLSERVER and NT Service\SQLSERVERAGENT accounts?

You have installed SQL Server and it is up and running without any issue. Later, you need to change some permissions given to either the SQL Server engine or the agent. You look for the accounts, searching local users and local groups, but you cannot find them.

Are you experiencing the above issue? If yes, first understand what these accounts are. They are called Virtual Accounts, created during the installation of SQL Server. These accounts are managed by the Operating System itself; hence they are not visible when you browse the Local Users and Groups window. Similarly, there is another type of account called Managed Service Accounts, which are created at the domain level and assigned to SQL Server services.

Now, if for some reason you changed the service account of your SQL Server service to another account, and later you want to use the same Virtual Account again, this is what you have to do.

1. Get the properties of the services.

2. The easiest way is to just type the account and leave the password blank. If the instance is the default instance, type it as NT Service\MSSQLSERVER; if it is a named instance, type NT Service\MSSQL$<InstanceName>, replacing <InstanceName> with your instance name.

3. Click OK to get the service restarted. It will work as you expect.

4. Or, if you want to search for the account, click on Browse to open the Select User or Group window. Type nt service\ms in the Enter the object name to select input box and click on Check Names. If you are setting up the Agent service, look for nt service\sql instead.

5. The Multiple Names Found window opens. Select the account from the list and continue. Do not enter a password; click OK and get the service restarted.

Just like this, if you need to add these accounts to other groups for granting more permissions, for example, adding the Agent Service Account to the Administrators group (not recommended), follow the same steps.

Monday, October 10, 2016

How to enable compression on all SQL Server Database Backups

Backup compression is not something new to SQL Server; it has been there for a few versions. However, many are unaware that compression can be set as the default for all backups, rather than enabling it explicitly for specific backups.

How can we instruct SQL Server to compress all backups by default?
There is a server configuration option called backup compression default that can be set at the instance level. If you enable it, all backups will be compressed by default.

Check the following code;

-- Taking a backup before enabling compression
BACKUP DATABASE AdventureWorks2014
 TO DISK = 'D:\MSSQL13.MSSQLSERVER\MSSQL\Backup\AdventureWorks_BeforeCompressionEnabled.bak'

-- Enabling compression
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;

-- Taking a backup after enabling compression
BACKUP DATABASE AdventureWorks2014
 TO DISK = 'D:\MSSQL13.MSSQLSERVER\MSSQL\Backup\AdventureWorks_AfterCompressionEnabled.bak'
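To verify the difference the setting makes, the sizes recorded in the backup history can be compared; a minimal sketch querying msdb:

```sql
-- Compare raw and compressed sizes of recent backups of the database
SELECT bs.database_name
 , bmf.physical_device_name
 , bs.backup_size / 1024 / 1024 AS BackupSizeMB
 , bs.compressed_backup_size / 1024 / 1024 AS CompressedSizeMB
FROM msdb.dbo.backupset bs
 JOIN msdb.dbo.backupmediafamily bmf ON bs.media_set_id = bmf.media_set_id
WHERE bs.database_name = 'AdventureWorks2014'
ORDER BY bs.backup_start_date DESC;
```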

Sunday, October 9, 2016

Determine the backup strategy - SQL Server Brain Basher of the Week #056

Determining the backup strategy for a SQL Server database is not an easy task, as there are many things to be considered. In my experience, many implement a strategy without considering anything; for example, just a daily full database backup, without even asking about the business requirement.

Generally, we consider two things: RTO and RPO. RTO (Recovery Time Objective) refers to the time it takes to recover; in other words, how long the business can survive without the system. RPO (Recovery Point Objective) refers to the acceptable data loss at a recovery; in other words, how much work the business accepts to repeat for a certain time period.

This week's question is based on backup strategies. Here is the question:

You have a database which requires a recovery (or backup) strategy based on the given requirements;
  1. Size of the database is 25 GB.
  2. The database must never be unavailable for longer than 8 hours.
  3. At the recovery from a failure, no more than 1 hour of transactions may be lost.
  4. Total backup throughput is 100 MB/minute.
  5. Total restore throughput is 80 MB/minute.
  6. Average rate of changes during office hours is 500 MB/hour.
  7. Average amount of new data is 200 MB /day.
  8. Office hours - 09:00 - 17:30.
What would be your plan for the requirements given?

How do you start this? You should start with #2 and #3 because they talk about RTO and RPO. Whatever backup strategy we plan, we must make sure the restoration can be done within 8 hours. Not only that, we have to make sure that we do not lose more than 1 hour of data. Okay, let's work on this.

We cannot survive without a full database backup. Let's calculate and see how much time it needs for backing up and restoring the database.

Database size = 25 GB = 25,600 MB

Backup throughput = 100 MB/minute
Time it takes to back up the database = 25600/100 = 256 minutes = 4 hours 16 minutes

Restore throughput = 80 MB/minute
Time it takes to restore the database = 25600/80 = 320 minutes = 5 hours 20 minutes

As you see, backing up takes over 4 hours, and we can easily schedule it during off-peak hours. However, restoring takes over 5 hours, which is something we need to consider.

Since the RPO is 1 hour, we need to make sure that the database Recovery Model is Full and that a Transaction Log backup is taken every hour. Let's calculate the time it needs.

Changes per hour = 500 MB
Time it takes to back up the log = 500/100 = 5 minutes
Time it takes to restore the log = 500/80 = 6.25, roughly 7 minutes

Let's see whether we can have a strategy like a daily full backup plus hourly Transaction Log backups during office hours. If we schedule the full backup at 00:00, it can complete before 05:00. And we can have hourly Transaction Log backups starting at 10:00 and ending at 18:00. In the worst case, if something happens at 17:30, we have to restore the full backup and 8 log backups. Let's calculate the time and see.

Restoration time for the full database backup = 5 hours 20 minutes
Restoration time for log backups = 7 x 8 = 56 minutes
Total duration = 5 hours 20 minutes + 56 minutes = 6 hours and 16 minutes

Since we have been given an RTO of 8 hours, the strategy we checked works fine with it. However, we need to think about growth as well. We add approximately 200 MB daily, roughly 1 GB weekly. This increases the time it takes to back up and restore the database. Assuming the database becomes 30 GB after a few months;

Time it takes to restore the database = 30720/80 = 384 minutes = 6 hours 24 minutes

If we recalculate the time;
Total duration = 6 hours 24 minutes + 56 minutes = 7 hours and 20 minutes

Now the total has come close to the given RTO. As you see, within a few months this strategy will no longer be able to meet the requirements. If you see something like this, then you cannot handle it with traditional backup types; you might have to go for either disk-level backups or partial backups. Let's talk more on that in another post.

Saturday, October 8, 2016

How to create a SQL Server job quickly

SQL Server Agent jobs are commonly used for automating administrative and other routine tasks. For creating a job, you need to either use the GUI that comes with SSMS or use the stored procedures given. Yes, it takes time to add a job and then add the steps, but we can get some jobs created automatically using a simple trick.

Assume that you need to create an Agent job for backing up one of your databases. Rather than going through SQL Server Agent -> Jobs in the Object Explorer, you can open the Backup interface just like when you manually take a backup, and configure everything as you want. Once the configuration is done, without clicking OK, click on the down-arrow next to the Script button and click Script Action to Job.

This action opens the window used for creating Agent jobs. The good thing is, it adds all we need, including the steps; all you have to add is a schedule.

Using this technique, jobs can be easily created for most of the tasks.

Friday, October 7, 2016

How to check success and failure login attempts in Azure SQL Database

Who tried to access my Azure SQL Database? Were they successful? Can I get more details on the attempts?

If you are maintaining an Azure SQL Database, you should be asking yourself these questions. Or, practically, you need to ask these questions in order to protect your database. Generally, with an on-premises instance, we get this information using either Audit or Logon Triggers. Can we get the same from Azure SQL Database?

Azure SQL Database has Auditing, which can be easily enabled via the portal. It does not support Logon Triggers, and notifications cannot be sent via email as it does not support Database Mail. However, Auditing captures everything you need, and there are a few ways of seeing the captured records.

In order to see the required information for my databases, I enabled Auditing and then used Azure Storage Explorer to see all the details, because it is the easiest way. If you need the same, here is the way of doing it;

1. Open the Azure Web Portal and open SQL Server blade.
2. Click on the server you need to audit and select Auditing and Threat detection.
3. Enable Auditing. Select Table as the Auditing type because it is easy to read.
4. Create a new storage for storing audit records.
5. Set the Retention (Days) as you need and set the Table name as you like.

6. Select Audited Events and check both Success and Failure check-boxes.
7. Select all Threat Detection Types.

8. Click on Save to save all settings.

As I mentioned above, there are multiple ways of getting audit records into our environment. Let's use Azure Storage Explorer and see how we can view them. Let's see how we can use other channels like Power BI in another post later.

1. Open Azure Storage Explorer and configure the connection for the storage you use. Read more on this at:
2. You should be able to see all required details related to login attempts.

Thursday, October 6, 2016

What is a runaway query and how to handle it with SQL Server

Initially I thought that a runaway query is another name for an ad-hoc query, but it is NOT. It actually refers to a long-running query that takes much longer than you expected, or longer than it previously took.


There can be many reasons for runaway queries: poor statistics, poorly written code, blocking and locking, or even heavy resource consumption. For example, a report that generally takes 1-2 minutes to open suddenly takes 15-30 minutes, or a user runs a query expecting the result in a few minutes but it takes hours.

How can we handle this with SQL Server?
You can monitor runaway queries easily and stop the execution by killing the process. You can use Activity Monitor (the Active Expensive Queries window, Elapsed Time column) or the sys.dm_exec_query_stats dynamic management view. However, rather than killing the session, it is advisable (or always better) to detect runaway queries and limit their resource consumption, making sure other standard operations are not disturbed. Even though it cannot fully handle them, the Resource Governor can manage this to some extent.

Remember, the Resource Governor does not detect runaway queries. It can only limit resources for requests by identifying certain properties of the request, such as application name, user name, etc. For example, if you want to limit resources for one of the reporting applications that has lengthy reports whose requests have no high priority, the application name can be tracked via a Classifier User-Defined Function, which we specifically write when implementing the Resource Governor, and the request can be assigned to the appropriate workload group that uses the least resources. Read the following articles for understanding the Resource Governor and how to use it for limiting resources.
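The pieces described above can be sketched as follows; the pool limits, group name, and application name are illustrative assumptions.

```sql
-- Run in master: resource pool and workload group for low-priority reporting requests
CREATE RESOURCE POOL ReportPool WITH (MAX_CPU_PERCENT = 20, MAX_MEMORY_PERCENT = 20);
CREATE WORKLOAD GROUP ReportGroup USING ReportPool;
GO

-- Classifier function: routes incoming requests based on the application name
CREATE FUNCTION dbo.fn_Classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
 RETURN CASE WHEN APP_NAME() = 'ReportingApp' -- illustrative application name
             THEN N'ReportGroup' ELSE N'default' END;
END;
GO

-- Register the classifier and enable the Resource Governor
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_Classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```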

Wednesday, October 5, 2016

How to use SQL Server templates?

Unfortunately, many developers are unaware of the available templates. SQL Server provides templates for almost all operations you can perform with relational and multidimensional databases; hence you do not need to remember all the commands and the way the code should be written.

For example, if you need to create a Linked Server object for an external database, you can simply take the available template and modify it with the related values.

You can find all templates using Template Explorer. Use either View -> Template Explorer menu or press Ctrl+Alt+T for viewing templates.

Expand the relevant node and double-click the template. The image below shows the template Add Linked Server Access MDB opened.

You can easily change the related values by pressing Ctrl+Shift+M. It opens a window that you can use for setting the values.

Tuesday, October 4, 2016

What are the dbmanager and loginmanager database roles in an Azure SQL Database instance?

Have you noticed two new database roles that appear in the master database? You might have, you might not have, because these two appear only in an Azure SQL Database instance. What are they and what is their usage?

dbmanager database role
An Azure SQL Database instance does not offer any server-level roles, and no user can be created similar to sa that we have seen with on-premises instances. Since we do not have administrative control over the Azure SQL Database instance, the dbmanager role is given to us inside the virtual master database. Generally, when the instance is created using the portal, the account added becomes the admin of the instance and is added to the master database. If you need more administrators who can create databases, then you need to create a new user and add the user to this role.

Here are the steps. You need to make sure that you are connected to the master database.

CREATE USER Jane WITH PASSWORD = 'StR0ng_Pa$$w0rd'; -- placeholder password
ALTER ROLE dbmanager ADD MEMBER Jane; 

loginmanager database role
Just like the above, if you need users who can perform operations related to user accounts, such as creating and altering users, create a user and add it to this role.

ALTER ROLE loginmanager ADD MEMBER Jack; 

However, this is not as required and important with Azure SQL Databases, as almost all databases are treated as Contained Databases. Therefore, the required permissions can simply be granted to a contained user as below, without creating user accounts in master and adding them to the loginmanager role.
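The contained-database alternative mentioned above might look like this; the user name, password, and permission are illustrative assumptions.

```sql
-- Inside the user database (not master): a contained user with only the permissions it needs
CREATE USER ReportUser WITH PASSWORD = 'StR0ng_Pa$$w0rd!'; -- placeholder password
GRANT ALTER ANY USER TO ReportUser;
```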


Monday, October 3, 2016

SQL Server Data Masking - Filtering masked values

SQL Server 2016 introduced Dynamic Data Masking for obfuscating certain values in records; not exactly a security measure, but a way of hiding sensitive data. While this was being discussed, a question was raised: can we filter on a column even when its data is masked and we have no permission to unmask it?

Generally, unless you are a member of the db_owner role, you cannot read masked data; if needed, you should be granted the UNMASK permission. Now, can you filter masked data? Can you group masked data? Can you do these even if you have no permission to unmask?

Yes, it is possible. Let's have a look at the code below;

-- Create a test database
CREATE DATABASE Sales;
GO

-- Connect with it
USE Sales;
GO

-- Create a sample table with masked columns
CREATE TABLE dbo.RetailCustomer
(
 RetailCustomerId int identity(1,1) PRIMARY KEY
 , Name varchar(100) NOT NULL
 , EmailAddress varchar(100) MASKED WITH (FUNCTION = 'email()') NOT NULL
 , CreditLimit decimal(16,2) MASKED WITH (FUNCTION = 'random(1, 10000)') NOT NULL
 , SecurityTokenCode char(10) MASKED WITH (FUNCTION = 'partial(2, "xxxx", 0)') NOT NULL
);
-- Insert set of records
INSERT INTO dbo.RetailCustomer
 (Name, EmailAddress, CreditLimit, SecurityTokenCode)
VALUES
 ('Customer1', '', 10000, 'EX12345678')
 , ('Customer2', '', 120000, 'EX11223344')
 , ('Customer3', '', 8000, 'TX11223344');

-- Checking records;
SELECT * FROM dbo.RetailCustomer;

As you see, I do not see the data as masked because I am a member of the db_owner role. But if you check with a different account, that user will see masked data. See the code below;

-- Creating a user (no login needed for this test)
CREATE USER Peter WITHOUT LOGIN;

-- Grant read permission to Peter
-- Note that this does not mean that he can unmask
GRANT SELECT ON dbo.RetailCustomer TO Peter;

-- Check with his account
EXECUTE AS USER = 'Peter';
SELECT * FROM dbo.RetailCustomer;
REVERT;

We can grant the UNMASK permission to Peter, but let's first see whether he can do some aggregation and filtering with masked data.

-- Run the following as Peter
EXECUTE AS USER = 'Peter';

-- Can he aggregate masked values?
SELECT SUM(CreditLimit) FROM dbo.RetailCustomer;

-- Can he perform operations against masked values?
SELECT LEN(EmailAddress) FROM dbo.RetailCustomer;
SELECT LEFT(EmailAddress, 2) FROM dbo.RetailCustomer;

-- Can he filter records with masked columns?
SELECT * FROM dbo.RetailCustomer WHERE CreditLimit = 120000;
SELECT * FROM dbo.RetailCustomer WHERE EmailAddress = '';

REVERT;


See the result. He cannot do some operations, but he can do filtering without any issue. Looks insecure? Do not worry; that is why I mentioned that Dynamic Data Masking is not for securing or encrypting data, but for obfuscating values.

Sunday, October 2, 2016

SQL Server Brain Basher of the Week #055 - Security

You are the Database Administrator and you maintain a database with following characteristics;

  • Database contains sensitive data.
  • Database authentication is set to Mixed Mode.
  • Users can access only tables they are authorized. Authorization is completely handled with SQL Server.
  • Tables related to finance data can be accessed only by users who are in Finance User-Defined Database Role.

Sales team uses their own application to access the database. There is a new requirement from the Sales team as below;
  • Sales team needs to see some financial data through the application.
  • Sales team members should NOT be able to execute impromptu SELECT, INSERT, UPDATE, and DELETE statements against financial tables.
  • The application will prompt for a second password when the screen related to finance data is opened. The salesperson needs a supervisor to input the second password to see the data.
Here is the question of the week;

Being a DBA, what is your suggestion to achieve above requirement?

There are a couple of things you need to consider. First, although users access the database via the application, their credentials are used for authorization, meaning users need permissions on the finance data in order to access it. But, secondly, if users are granted permissions, they will be able to execute impromptu queries against the tables, which should be avoided. How can you grant permission on these specific tables without granting explicit permissions to users or roles?

If you are a smart DBA, you should now be thinking about Application Roles.

An Application Role is a database-level role through which permissions can be granted to an application, without setting permissions for users or other roles. You create it with a password, grant permissions as needed (in this case, on the two tables related to finance), and then activate the Application Role when the two tables need to be accessed.

Since the supervisor enters the password, it can be used as the Application Role password. Once the application accepts the password, it calls a special stored procedure called sp_setapprole with the name of the Application Role and the password. This creates a new security context for the application connection, and the current user's context becomes inactive. The user's security context will only come back once sp_unsetapprole is called; until then, only the Application Role and public tokens are available for accessing resources.
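The flow described above can be sketched as follows; the role name, password, and table name are illustrative assumptions, not from the original scenario.

```sql
-- One-time setup: create the role and grant it access to the finance tables only
CREATE APPLICATION ROLE FinanceAppRole WITH PASSWORD = 'Sup3rv1sor_Pa$$'; -- placeholder
GRANT SELECT ON dbo.FinanceSummary TO FinanceAppRole; -- illustrative finance table
GO

-- At runtime, after the supervisor types the password, the application activates the role
DECLARE @cookie varbinary(8000);
EXEC sp_setapprole 'FinanceAppRole', 'Sup3rv1sor_Pa$$'
 , @fCreateCookie = 1, @cookie = @cookie OUTPUT;

-- ... the application reads the finance screen here, under the role's permissions ...

-- Deactivate the role and restore the user's own security context
EXEC sp_unsetapprole @cookie;
```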

Note that once the Application Role is active, the user can access other databases only via the guest account.
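To make this concrete, here is a minimal sketch of the flow described above; the role name, password, and table names are all hypothetical placeholders.

```sql
-- Create the Application Role and grant it access to the finance tables
-- (role name, password, and table names here are hypothetical)
CREATE APPLICATION ROLE FinanceAppRole WITH PASSWORD = 'Str0ngSup3rv1sorPwd!';
GRANT SELECT ON dbo.FinanceSummary TO FinanceAppRole;
GRANT SELECT ON dbo.FinanceDetail TO FinanceAppRole;

-- When the supervisor enters the password, the application activates the role
DECLARE @cookie varbinary(8000);
EXEC sp_setapprole
   @rolename = 'FinanceAppRole'
 , @password = 'Str0ngSup3rv1sorPwd!'
 , @fCreateCookie = true
 , @cookie = @cookie OUTPUT;

-- ... the finance tables are accessible under the role's security context ...

-- Restore the original user's security context
EXEC sp_unsetapprole @cookie;
```

Requesting the cookie and passing it to sp_unsetapprole is what lets the application switch back to the user's own context without closing the connection.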

For more info on Application Role, read this:

Saturday, October 1, 2016

SQL Server Audit shows wrong date and time?

It was a bit funny; I could not immediately figure out what had happened: the records related to the Audit were saved with the wrong date and time.

SQL Server Audit is the primary auditing tool in SQL Server and it is based on Extended Events, the event-driven monitoring engine. It is used for tracking actions performed against the database and the server instance, mainly for auditing.

I have written three posts on it; have a look if you are new to SQL Server Audit;

While discussing the topic, we witnessed that SQL Server had saved audit records with the wrong date and time. It took a few minutes to realize that it is NOT wrong at all; records are saved in the UTC time standard. If you need to get the right date and time, you need to convert it to the time based on your time zone.

In my case, in order to see the right date and time for my audit records, I have to convert the UTC to +05:30 time zone like below.

SELECT convert(datetime, SWITCHOFFSET(event_time, '+05:30')) AS event_time
 , object_name, user_defined_event_id, statement, user_defined_information
FROM sys.fn_get_audit_file ('c:\temp\*', default, default);
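In SQL Server 2016, the same conversion can also be written with the AT TIME ZONE clause; the Windows time zone name below is an assumption matching my +05:30 zone, so adjust it to yours.

```sql
-- Treat event_time as UTC, then shift it to the local zone
-- ('Sri Lanka Standard Time' is my zone; replace with yours)
SELECT event_time AT TIME ZONE 'UTC' AT TIME ZONE 'Sri Lanka Standard Time' AS event_time_local
 , object_name, statement
FROM sys.fn_get_audit_file('c:\temp\*', default, default);
```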

Now the question is how to see the same if the Audit is recorded in the Windows Application Log or Security Log. I am not sure about the way but will check and see. If you have already found a solution, please share.

Friday, September 30, 2016

How to check my permissions on a SQL Server object

How do you check whether you have permission on a specific object and what permission types you have been granted? This requirement comes up whenever you experience a permission issue, and it might not be easy to get the relevant information from the relevant department or person. However, SQL Server offers a function for this: fn_my_permissions. This function can be used for seeing all permissions you have been granted on a particular object;

SELECT * FROM fn_my_permissions (NULL, 'DATABASE');

USE AdventureWorks2014;
GO

SELECT * FROM fn_my_permissions('Sales.SalesOrderHeader', 'OBJECT');

This function accepts a securable and a securable class. There are multiple classes such as DATABASE, SCHEMA, OBJECT, etc.
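The same function can also show another principal's permissions if you impersonate that principal first. A small sketch, assuming a user named Jane exists and you hold IMPERSONATE permission on her:

```sql
-- Check Jane's permissions instead of your own ('Jane' is a hypothetical user)
EXECUTE AS USER = 'Jane';
SELECT * FROM fn_my_permissions('Sales.SalesOrderHeader', 'OBJECT');
REVERT;  -- switch back to your own security context
```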

Thursday, September 29, 2016

How to configure SQL Database Mail to send emails using Office 365

I have been using SQL Server Database Mail for sending mails and have used various email providers. However, while troubleshooting and helping a client, I found that it does not work with Office 365 as expected and the configuration is not the same. I read many posts, contacted fellow MVPs, and of course, with their guidelines, finally configured it on my machine.

We initially tried this with one of the client's servers; since it did not work with the default configuration, I tried it on my machine, which runs Windows 8.1.

If you are trying to configure Office 365 for SQL Server Database Mail, there are a couple of things you need to consider;
  • .Net Framework 3.5
  • SSL and TLS configuration
  • Patches needed for TLS support on SQL Server
.Net Framework 3.5
You need to make sure that .Net Framework 3.5 is enabled in the OS because SQL Server Database Mail works with 3.5. If you have not enabled it, then enable it.

SSL and TLS configuration
Remember, Office 365 uses TLS, not SSL. This is the biggest issue for SQL Server. Microsoft SQL Server 2016 is shipped with the necessary components related to TLS 1.2, but you need to disable SSL 3.0 if you have already configured it. In most cases, you need to disable SSL for the server, but you can first try disabling SSL only for client software. However, in my case, I did not have SSL 3.0 configured, hence I did not disable anything. If you have, follow the below URL for instructions on disabling SSL 3.0. But first try without disabling.

Patches needed for TLS support on SQL Server
Then you need to make sure the required patches are installed. Since I use SQL Server 2016, I did not need to install anything on SQL Server itself, but I had to install a patch on the OS. All patches related to this are listed here:

Once everything required is done, I configured my Database mail again;
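The configuration can also be scripted instead of using the wizard. Here is a minimal sketch using the msdb Database Mail stored procedures; the account name, e-mail address, and password below are placeholders for your own values.

```sql
-- Create the Database Mail account pointing at Office 365
-- ('Main', the address, and the password are placeholders)
EXEC msdb.dbo.sysmail_add_account_sp
   @account_name = 'Main'
 , @email_address = 'someone@yourdomain.com'
 , @mailserver_name = 'smtp.office365.com'
 , @port = 587
 , @enable_ssl = 1
 , @username = 'someone@yourdomain.com'
 , @password = 'YourPassword';

-- Create a profile and attach the account to it
EXEC msdb.dbo.sysmail_add_profile_sp @profile_name = 'Main';

EXEC msdb.dbo.sysmail_add_profileaccount_sp
   @profile_name = 'Main'
 , @account_name = 'Main'
 , @sequence_number = 1;
```

Note that port 587 with SSL/TLS enabled is what Office 365's SMTP submission endpoint expects.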

And yes, it worked without any issue.

EXEC msdb.dbo.sp_send_dbmail 
 @profile_name = 'Main'
 , @recipients = ''
 , @body = 'Test'
 , @subject = 'Test';

SELECT * FROM msdb.dbo.sysmail_allitems;
SELECT * FROM msdb.dbo.sysmail_sentitems;
SELECT * FROM msdb.dbo.sysmail_unsentitems;
SELECT * FROM msdb.dbo.sysmail_faileditems;

Wednesday, September 28, 2016

How to get the most accessed tables from SQL Server database

While discussing with one of my clients, this question was raised, and I knew that there is no direct way unless we use something related to indexes. The query is simple; all we have to do is get the total count of all scans, seeks, and lookups for the indexes on each table;

SELECT t.name AS TableName,
 SUM(u.user_seeks + u.user_scans + u.user_lookups) AS NumberOfTimeAccessed
FROM sys.dm_db_index_usage_stats u
 INNER JOIN sys.tables t 
  ON t.object_id = u.object_id
WHERE u.database_id = DB_ID()
GROUP BY t.name
ORDER BY SUM(u.user_seeks + u.user_scans + u.user_lookups) DESC;

Is this 100% accurate? As per my understanding, it is not. You might get larger numbers when a table has multiple indexes, but if all you want is to know which tables get hit more, it does a reasonable job.

Does this capture tables that are based on a heap structure? Yes, of course; you still see the count for heap tables too.

Tuesday, September 27, 2016

How many times my SQL Server files have grown?

If you have not changed the default Growth Rate settings of SQL Server files, then the default will be applied. SQL Server 2016 is smart; it configures the Growth Rate as 64MB, but older versions set this to a very low value, resulting in too many small growths. This slows down the system because growing a file is a very expensive operation.

How do you check whether your SQL Server has experienced too many growths? It is simple with the standard reports given. The report Disk Usage shows all file growth events if SQL Server has grown the files since the last service restart.

This is what you see.

If you see many growths within a short period, you need to change the Growth Rate immediately.
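If you prefer a query over the report, the same information can be read from the default trace, which is where the Disk Usage report gets its auto-growth events. A sketch, assuming the default trace is still enabled on your instance:

```sql
-- Read auto-growth events from the default trace
DECLARE @path nvarchar(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT te.name AS EventName, t.DatabaseName, t.FileName, t.StartTime
FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
 INNER JOIN sys.trace_events AS te
  ON t.EventClass = te.trace_event_id
WHERE te.name IN ('Data File Auto Grow', 'Log File Auto Grow')
ORDER BY t.StartTime DESC;
```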

Monday, September 26, 2016

Data Science - Types of Analytics

While discussing with a few data enthusiasts, the topic Types of Data Analytics came up and various types were discussed. There were a few arguments on categorization, but it is clear that the types of Data Analytics vary and can be defined with many classifications.

However, if someone needs to understand the main types, what would be the way of categorizing them? As per the discussion we had and the articles I have read, I believe it can be divided into three;

If we are to summarize this;

Descriptive Analytics is something we always do and always implement. We try to gain insight based on historical data. You might analyze your last 6 months of supermarket bills and try to understand how you have spent: whether you have spent on unnecessary items, whether your spending was worth it, etc. The owner of the supermarket would do the same. He will check and see how his customers have bought items, any relationships among purchased items, which items give more profit, which do not move, etc. Generally this is the first part of any Business Intelligence implementation and it is what is most widely used.

Predictive Analytics helps us to see the future. This is also based on historical data, but it actually forecasts values for the future. This analytics uses historical data along with statistics related to all entities. It is not only for forecasting values; it can be used for other processes related to the business. For example, we can use this analytics for finding associations between products and keep those products closer together to increase sales. That is why we see some other names for this, such as Market Basket Analysis and Association.

Prescriptive Analytics is a combination of both of the above, but it shows a few things additionally. It tells you what might happen and what should be done. This analytics is a bit rare to see in standard BI applications, but many try to implement it. This is referred to as Describe - Predict - Prescribe.

Sunday, September 25, 2016

SQL Server Brain Basher of the Week #054 - Guest Account

Here is the question of the week. I am sure that you have seen the guest user account in your databases. This week's question is based on it;

What is the Guest User Account and what is its usage?

The Guest User Account is a system user account that is available in all user-defined databases and it cannot be dropped. By default, it is disabled; it can be enabled using the following code.
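Here is a minimal sketch of enabling and disabling it; the database name used is hypothetical.

```sql
-- Enable the guest user in a database ('Sales' is a hypothetical database name)
USE Sales;
GRANT CONNECT TO guest;

-- Disable it again when it is no longer needed
REVOKE CONNECT FROM guest;
```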


This user account has no related Login Account. SQL Server uses this account when;
  • A user connects with SQL Server using a Login and tries to access a database in which he has no User account created, and
  • the Guest User Account is enabled in that database.
In a way, this is good because it allows users to connect with SQL Server using Logins and access databases without creating Users, but it might introduce some security threats as well. Therefore, it is better to keep it disabled. If you need to enable it, make sure it has the least privileges on resources in the database.

Saturday, September 24, 2016

Are you experiencing "Double-Hop" problem with SQL Server?

Without noticing, you may experience the double-hop problem with your SQL Server and be clueless about it. All you see is that some requests made to SQL Server are not being satisfied and you are getting permission-related errors. Let's try to understand what double-hop is and how it can be sorted out with SQL Server.

See the below diagram;

The diagram shows how the user connects with the database using the application. He sends a request via the application and the application server satisfies his request using the database hosted on the same machine. In this case, the application needs to use the user's credentials for connecting with the SQL Server hosted on Server 01, and it impersonates the user's account for that. This is the first hop and it works fine.

Now the second hop. SQL Server 01 needs some info from SQL Server 02, hence it forwards the request with the user's credentials, but, by default, the environment does not allow it. Server 01 has no way of forwarding the user's credentials to Server 02 and you will get an error.

How to sort this out?
This requires Delegation to be enabled. Delegation allows Server 01 to forward the user's credentials to Server 02, and then Server 02 can execute the code as a distributed query.

You need the following to be enabled/configured for Delegation.
  • The user must have permission on both Server 01 and Server 02.
  • Either TCP/IP or Named Pipes must be the protocol for accessing SQL Server.
  • The Active Directory property called Account is sensitive and cannot be delegated must not be checked for the user's account.
  • Both Server 01 and Server 02 require a registered Service Principal Name (SPN).
  • SQL Server service accounts must be trusted for Delegation.
  • Server 02 must be configured as a Linked Server in Server 01.
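Once everything above is in place, a distributed query like the following, run from Server 01, can reach Server 02 under the user's delegated credentials; the linked server, database, and table names here are hypothetical.

```sql
-- A distributed query against the linked server SERVER02
-- (SERVER02, SalesDB, and the table names are hypothetical)
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
 INNER JOIN SERVER02.SalesDB.dbo.Customers AS c
  ON c.CustomerID = o.CustomerID;
```

If delegation is not configured, a query like this typically fails with a login error for NT AUTHORITY\ANONYMOUS LOGON, which is the classic symptom of the double-hop problem.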

Friday, September 23, 2016

SQL Server 2016 - New Releases

There have been a few new releases related to Microsoft SQL Server in the last few days. Here is the list; get them downloaded and installed as you need.

Cumulative Update #2 for SQL Server 2016 RTM

SQL Server Management Studio -16.4.1 Release

SQL Server Data Tools 16.4 Release