Monday, September 1, 2014

Cumulative Update 1 for SQL Server 2012 SP2

Microsoft has released Cumulative Update 1 for SQL Server 2012 Service Pack 2 and it is available for download.

Here is the link for downloading:

Here is the link for the list of bugs fixed:

Read this for understanding versions and service packs released for all SQL Server versions:

SQL Server 2014 Sample Databases and applications

Are you looking for the SQL Server 2014 AdventureWorks databases and applications? They are now available, and here is the link:

Tuesday, June 24, 2014

What if SQL Server Service is paused?

If you have gone through Windows services, specifically SQL Server services, either through SQL Server Configuration Manager or the Windows Services applet, you might have noticed that some services can be paused in addition to being stopped or restarted. What happens if a service is paused?

If you pause the SQL Server service, existing connections are not affected, but new connections are refused. Users who have been working with databases can continue until they disconnect. This is generally useful when you need to shut down the service without disturbing existing users’ activities. Since it does not allow new connections, no user can establish a new connection, and once all existing connections are disconnected, the server can be safely shut down.

Here is an example;

User makes a connection and runs a query:

Administrator pauses the service:

User still runs queries:

New user cannot establish the connection:
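For completeness, the same pause and resume operations can also be performed from an elevated command prompt; the commands below assume the default instance, whose service name is MSSQLSERVER (named instances use MSSQL$InstanceName):

```
net pause MSSQLSERVER
net continue MSSQLSERVER
```

After `net pause`, existing sessions keep running while new login attempts fail; `net continue` resumes accepting new connections.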

Monday, June 23, 2014

Service Pack 2 is available for SQL Server 2012

Microsoft has released SQL Server 2012 Service Pack 2 and it is available for download.

Here is the link for downloading:

Here is the link for the list of bugs fixed:

Read this for understanding versions and service packs released for all SQL Server versions:

Tuesday, June 17, 2014

What are the advantages of a SQL Server database with multiple files located on one physical drive?

This is a common question I get in my classes and workshops. Do we really get any benefit out of a database with multiple data files on one physical drive? Obviously there are no performance benefits, but in certain situations it offers some advantages;

  1. Taking partial backups, file backups, file group backups
    Although the files are not distributed across multiple physical drives, partial backups, file backups and filegroup backups are still possible. Assume that your database is very large and some tables are infrequently updated. You can then separate the tables into two filegroups, using multiple data files: one group for tables that are frequently updated and the other for tables that are infrequently updated (or read-only tables). This allows you to have a time-saving and efficient backup strategy, setting different backup plans for the two filegroups.
  2. Restoration into smaller multiple physical drives
    Assume that you maintain a database with a single 100GB data file. What if the existing drive crashes and you get two 50GB physical drives for restoring the database from one of the backups? As you know, that is not possible; but if the database is maintained with multiple data files, even on a single physical drive, a scenario like this can be easily addressed.
  3. Moving files into another drive
    For various reasons, the entire database or part of it may need to be moved to different physical drives. Some of the reasons could be: moving from a test environment to a production environment, moving some filegroups to optimized physical drives, or moving the entire database to a different location. If the database is constructed with multiple files, moving files in the above scenarios can be done with less effort.
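As a simple illustration, the following statement creates a database with two filegroups, each with its own data file, even though all files sit on the same drive (the database name, file names and paths here are just examples):

```sql
CREATE DATABASE SalesDB
ON PRIMARY
    (NAME = SalesDB_Data1, FILENAME = 'C:\Data\SalesDB_Data1.mdf'),
FILEGROUP FG_Archive
    (NAME = SalesDB_Archive1, FILENAME = 'C:\Data\SalesDB_Archive1.ndf')
LOG ON
    (NAME = SalesDB_Log, FILENAME = 'C:\Data\SalesDB_Log.ldf');

-- FG_Archive can now hold infrequently updated tables
-- and be backed up on its own schedule, for example:
-- BACKUP DATABASE SalesDB FILEGROUP = 'FG_Archive'
--     TO DISK = 'C:\Backup\SalesDB_FG_Archive.bak';
```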

MDX: Getting name of the user connected

Are you looking for a way of adding the name of the account used for the session to the result set? Here is the way of adding it;
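For example, a calculated member based on the MDX UserName function returns the account used for the current session; the cube name below is just a placeholder:

```mdx
WITH MEMBER [Measures].[Connected User] AS UserName
SELECT [Measures].[Connected User] ON COLUMNS
FROM [Adventure Works]
```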


Thursday, June 12, 2014

What is the “sysname” data type used for object names?

You might have noticed that the data type of object names, such as database names, is “sysname”, which is not listed in the data type list when creating a table via the table designer. This data type can be clearly seen when creating objects using templates (e.g., the CREATE DATABASE template). What is this sysname data type? Can we use it? Should we use it?

Sysname is a built-in system data type defined as nvarchar(128). Initially, in older versions (up to 6.5 if I am not mistaken, TBV), it was equivalent to varchar(30). It is a non-nullable data type mainly used to store object names. It is not listed in the data-type list that appears when creating tables via the table designer; however, it can be used for defining columns when tables are created using T-SQL. Have a look at the following code;

-- using sysname for variables
DECLARE @TestVariable sysname = 'Test Value';
SELECT @TestVariable;

-- using sysname for creating tables
CREATE TABLE dbo.TestTable
(
    Id int PRIMARY KEY
    , Name sysname
);

If your system is configured with case-sensitive collation, make sure you type the word in lowercase.

As you see, it can be used in our code. However, many recommend avoiding it because (1) nvarchar(128) can simply be used instead, and (2) its definition could change in a future version. Therefore, even though it is possible to use the sysname data type in T-SQL, it is better to avoid it.

Connecting with Windows Azure SQL Server database via Power Pivot

While trying to get some records from a database configured on a Windows Azure machine (NOT SQL Azure), I noticed that the connection can be made using Power Query but not Power Pivot. Power Query has an option for connecting to a Windows Azure database using “From Windows Azure SQL Database”, as shown in the images below;



However, there is no such option with Power Pivot. If you try to connect using “From SQL Server”, you will get an error as below;



Unable to retrieve list of databases. Reason: Failed to connect to the server. Reason: Login timeout expired
A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.
Named Pipes Provider: Could not open a connection to SQL Server [53].

The error indicates that Excel tries to establish the connection using the Named Pipes protocol, which is not valid in this context. Though that can be considered the reason for the error, there is no option for changing the protocol to TCP/IP, which makes establishing the connection impossible. However, I found a workaround for connecting to the Windows Azure SQL database: instead of the default provider, which is “SQL Server Native Client 11.0”, use “Microsoft OLE DB Provider for SQL Server”. All we have to do is open the Advanced window by clicking the “Advanced” button and change the Provider to Microsoft OLE DB Provider for SQL Server.
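In effect, the Advanced window ends up producing a connection string along the lines of the following sketch (the server address, database name and credentials are placeholders, not values from the original setup):

```
Provider=SQLOLEDB.1;Data Source=myazurevm.cloudapp.net,1433;
Initial Catalog=SalesDB;User ID=sqluser;Password=********;
```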


I still cannot figure out the exact reason for this, or how to connect using SQL Server Native Client. If you have done it, please share.

Related post:

Tuesday, June 10, 2014

SQL Server Installation: Are Shared components getting upgraded with Side-by-side Upgrade?

As you know, some components such as Management Studio (SSMS) and Configuration Manager are shared components of SQL Server instances, and they are upgraded when an in-place upgrade is performed. However, they are not supposed to get upgraded with a side-by-side upgrade unless a certain criterion is met.

In-place and Side-by-side upgrades
An in-place upgrade refers to an installation of a newer version of SQL Server that completely replaces an existing instance, including shared components. The key advantage is the lower requirement for additional hardware resources; the key disadvantage is the inability to roll back if the installation fails. A side-by-side upgrade refers to an installation of a newer version of SQL Server as a new instance, without replacing the older version installed. This is the safest way of upgrading a system because it reduces the risk associated with upgrades, leaving the original system in place and intact. However, there are a few disadvantages, such as having to copy user-defined databases, configure security, and re-create jobs manually.

A side-by-side upgrade is not supposed to replace the shared components of the older version installed. However, a replacement can happen if the major build number of the old instance and the major build number of the new instance are identical. This condition rarely occurs, but it has been seen with SQL Server 2008 and 2008 R2. If you perform an installation of SQL Server 2008 R2 (version 10.50) as a side-by-side upgrade of SQL Server 2008 (version 10.00), though it is not an in-place upgrade, the new installation will replace all shared components of the SQL Server 2008 instance.
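You can check the version (and hence the major build number) of an existing instance before upgrading; for example:

```sql
SELECT SERVERPROPERTY('ProductVersion') AS Version;
-- SQL Server 2008 reports 10.0.x.x and SQL Server 2008 R2 reports 10.50.x.x,
-- so both share the major build number 10.
```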

Monday, June 9, 2014

What is the fastest/quickest way of getting the total number of records (COUNT) in a SQL Server table?

Are you trying to get the total number of records in one of your tables? If yes, what is the best, or what is the quickest way of getting the number?

Generally, we use the COUNT aggregate function for retrieving the record count of a table. This does not take much time if the table contains a small number of records. However, when COUNT is run against a table that contains millions of records, it can take a noticeable time. In such a situation, if the table is a clustered table, the required result can be retrieved in less time using the sys.sysindexes system catalog view.

The following query runs against a table called dbo.FactSales in the ContosoRetailDW database (one of Microsoft's sample databases). This is a clustered table containing about 3.4 million records. Have a look at the query and the result;

USE [ContosoRetailDW]
DECLARE @CurrentTime AS datetime = GetDate()
SELECT Count(*) NumberOfRecords 
FROM dbo.FactSales;
SELECT Cast(DateDiff(SECOND, @CurrentTime, GetDate()) AS varchar)
        + ','
        + Cast(DateDiff(MILLISECOND, @CurrentTime, GetDate()) AS varchar)
        + ','
        + Cast(DateDiff(MICROSECOND, @CurrentTime, GetDate()) AS varchar)
    AS ExecutionTime
SET @CurrentTime = GetDate()
SELECT [rows] NumberOfRecords 
FROM sys.sysindexes 
WHERE id = OBJECT_ID('dbo.FactSales') and indid =1 
SELECT Cast(DateDiff(SECOND, @CurrentTime, GetDate()) AS varchar)
        + ','
        + Cast(DateDiff(MILLISECOND, @CurrentTime, GetDate()) AS varchar)
        + ','
        + Cast(DateDiff(MICROSECOND, @CurrentTime, GetDate()) AS varchar)
    AS ExecutionTime


As you see, the second method produces the result almost instantly, while the first method takes about 76 milliseconds. This shows that querying sys.sysindexes is more efficient than COUNT(*) when querying a large clustered table. However, there is a chance of seeing an inaccurate number with this method if statistics are not updated. In addition, if the table is a heap, the same performance may not be seen with the second method.
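For reference, sys.sysindexes is a deprecated compatibility view; on SQL Server 2005 and later, the same figure can be read from sys.partitions. A sketch of that alternative, assuming the same dbo.FactSales table:

```sql
SELECT SUM(p.rows) AS NumberOfRecords
FROM sys.partitions p
WHERE p.object_id = OBJECT_ID('dbo.FactSales')
  AND p.index_id IN (0, 1);  -- 0 = heap, 1 = clustered index
```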

Friday, June 6, 2014

Connecting with SQL Server virtual machine on Azure using Management Studio

If you have a requirement for connecting to a SQL Server virtual machine configured on Windows Azure, there are a few things to be done. The following link explains, step by step, the tasks necessary for connecting via your local Management Studio.

Connection I

Tuesday, June 3, 2014

What is the LRU (Least Recently Used) algorithm?

The LRU, or Least Recently Used, algorithm determines which data pages are dropped from the cache when the Data Cache section, the largest part of the Buffer Pool, needs space.

The Data Cache section is used for caching database data pages. In order to satisfy users’ requests, SQL Server loads data pages into this section and then fulfills operations such as SELECT, INSERT, UPDATE and DELETE. No modifications are performed directly on the data files. Once pages are cached, subsequent requests are satisfied using the cached pages as well.

In addition to the Data Cache section, the Buffer Pool comprises two more sections: Free Pages and Stolen Pages. The first contains pages that are not yet used and are kept for handling new requests; the second holds pages used by other SQL Server components.

The LRU policy determines how long data pages stay in the Data Cache after they have been flushed to disk by the Checkpoint process. The process that drops pages from the cache is called the Lazy Writer. Pages are selected for dropping based on a counter maintained in the page header, which is decreased if the page has not been referenced for a while; pages with the lowest counter are dropped first by the Lazy Writer.

The maximum amount of memory for SQL Server is 8TB; will it really consume 8TB?

“As per MSDN, the maximum memory that can be consumed by SQL Server is 8TB. Will SQL Server really use 8TB if the box is configured with 8TB?” This was a question raised by an enthusiast.

“It will, but currently it is impossible” was my answer. Do you agree with me? Let me explain.

Virtual Address Space, commonly called VAS, refers to the total amount of memory that can be used by a Windows process or an application such as SQL Server. The size of the VAS varies according to the processor architecture (32-bit or 64-bit) and the edition of the operating system. If the processor architecture is 32-bit, the VAS is 4GB. Under a standard installation, 2GB is reserved for the system (kernel mode address space) and 2GB for the application (user mode address space). This proportion can be changed by adding the /3GB switch to boot.ini, setting 3GB for the user mode address space.

This is a bit different when a 32-bit version of SQL Server is installed on a 64-bit version of Windows. Such an installation provides the full 4GB to SQL Server using the WOW64 emulation technology.

The figure of 8TB comes with the 64-bit processor architecture, which offers a full 8TB for applications; hence a 64-bit version of SQL Server running on 64-bit Windows should be able to use 8TB without any issue. However, this is not possible yet because no Windows operating system supports that much memory. The maximum amount of RAM supported by Windows Server 2012 is 4TB. Therefore, we need to wait to see SQL Server using the full 8TB user address space :).
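If you are curious how much virtual address space your instance has actually reserved and committed, the sys.dm_os_process_memory DMV (available from SQL Server 2008 onwards) exposes it:

```sql
SELECT virtual_address_space_reserved_kb / 1024.0 AS ReservedMB,
       virtual_address_space_committed_kb / 1024.0 AS CommittedMB
FROM sys.dm_os_process_memory;
```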

Monday, May 19, 2014

Power BI – Power Query Update

A couple of new features have been added to Power Query and they are available with the latest version of Power Query: 2.11.3625.

Here is the download link for Power Query:

New features added;

  • New Data Sources
  • Improvements to Query Load Settings
  • Improvements to Query Refresh behaviors in Excel
  • New Transformations available in the Query Editor
  • Other Usability Improvements

This blog post describes updates available with new version:

Thursday, May 15, 2014

SQL Server Data Tools (SSDT) / Business Intelligence Studio (BIDS) for 2014

Are you searching for SQL Server Data Tools or Business Intelligence Development Studio after installing SQL Server 2014? If so, you are reading the right post.

SQL Server Data Tools, formerly Business Intelligence Development Studio, is used for creating projects for Integration Services, Analysis Services, Reporting Services, etc. Usually this gets installed automatically when SQL Server is installed, but not with SQL Server 2014; it needs to be downloaded and installed separately.

This is basically a set of Visual Studio project templates that allows us to work with the Microsoft SQL Server business intelligence product suite. With SQL Server 2008 and 2008 R2, it was named Business Intelligence Development Studio (BIDS), and with SQL Server 2012, Microsoft renamed it SQL Server Data Tools (SSDT). Confusingly, however, there was another tool available under the same name, used for creating database projects with Visual Studio. To clear up the confusion, with SQL Server 2014 it has been named Microsoft SQL Server Data Tools – Business Intelligence for Visual Studio (SSDTBI).

You can download Microsoft SQL Server Data Tools – Business Intelligence for Visual Studio 2012 from here:

This is still not ready for Visual Studio 2013. Microsoft initially made it available, but the release was called off due to some side-by-side issues with SSDT.
(Updated 02-Jun-2014). Microsoft SQL Server Data Tools – Business Intelligence for Visual Studio 2013 is now available. Read more on this from:

Sunday, May 4, 2014

Cumulative Update packages for SQL Server 2008 R2, 2012, and 2014

Cumulative update #12 is available for SQL Server 2008 R2 SP2. Refer to the following link for downloading it and understanding the fixes made.

Cumulative update #9 is available for SQL Server 2012 SP1. Refer to the following link for downloading it and understanding the fixes made.

Cumulative update #1 is available for SQL Server 2014. Refer to the following link for downloading it and understanding the fixes made.

For more info on SQL Server versions and service packs, refer:

Sunday, April 6, 2014

Sri Lanka crowned T20 champs - 2014

We did it again! Thanks Sanga, Mahela and the team, proud to be a Sri Lankan.

Reasons for adding a surrogate key for dimension tables

As a best practice, we always add a new key, known as a surrogate key, to dimension tables for identifying records uniquely. Generally, the unique business key becomes the primary key of the relational tables related to business entities; but once the surrogate key is added to a dimension table, we make it the primary key instead of reusing the business key. Here is an example from the AdventureWorksDW data warehouse.


A surrogate key is an integer that is assigned sequentially when records are added. There are many other names for it, such as artificial key, synthetic key, meaningless key, integer key or non-natural key. Do we really need it? Does it seem inappropriate to use business keys as primary keys? Reusing them may seem sensible, but here are some valid reasons for using surrogate keys instead of business keys;

  1. Dimension is populated from multiple sources
    In many cases, dimensions are populated from multiple sources. This often introduces incompatibilities between the data types of business keys originating from different sources. Moreover, the uniqueness of the business key is not always guaranteed.
  2. Business key is reassigned when an object becomes obsolete
    OLTP systems always focus on current operations, giving less importance to history. If an object becomes inactive or obsolete (example: the organization decides to discontinue a product), there is a high probability that its business key gets reassigned to a new object. If the business key is used as the key of the dimension, this becomes an issue because the dimension already holds an object with the same business key.
  3. Business key is a complex string or GUID
    Business keys used in source systems may be formed by combining numbers and other characters, or may be GUIDs. In such a scenario, there is no harm in using the business key as the primary key of the dimension table as long as uniqueness is guaranteed. However, in terms of performance, an integer surrogate key is more effective than a complex string.
  4. Integer improves the performance
    As mentioned in reason 3, it is always better to have an integer key as the key of the dimension, even when the business key is a simple string. This makes the table more efficient and speeds up data retrieval when joining with facts.
  5. Objects without business keys
    When populating dimension records, there can be situations where an object needs to be added without a business key. A good example is holding a newly introduced product in the Product dimension, before extracting it from the source, to support a transaction that already exists in the fact table. Another example is holding an anonymous customer in the Customer dimension to support promotion events recorded in the fact table. Holding these objects in dimension tables as “Unknown”, “Not applicable”, or “Anonymous” becomes possible when a surrogate key is used.
  6. Objects are versioned in dimension tables
    In order to maintain history in dimension tables, the same object is versioned by adding multiple records (these dimensions are called Slowly Changing Dimension Type 2). Example: a customer is recorded with Sri Lanka as his country. Later, the customer becomes a citizen of Australia. In an OLTP system, this is just a change to the country column; but in data warehousing, to maintain history for the events recorded (facts), a new version (a new record) of the customer is added with Australia as the country. This is not possible if the business key of the customer is used as the key of the dimension.

Of course, the maintenance cost increases, and a mechanism has to be introduced for generating surrogate keys and linking them with foreign keys in fact tables. But it is not as difficult as many think, because there are many built-in facilities available in DBMSs and ETL tools.
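A minimal sketch of such a dimension table, loosely modeled on AdventureWorksDW (the column names and sizes here are illustrative):

```sql
CREATE TABLE dbo.DimProduct
(
    ProductKey int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- surrogate key
    ProductAlternateKey nvarchar(25) NULL,             -- business key from the source
    EnglishProductName nvarchar(50) NOT NULL,
    StartDate datetime NOT NULL,                       -- SCD Type 2 validity period
    EndDate datetime NULL                              -- NULL = current version
);
```

Note that the business key is kept as an ordinary, nullable attribute, which is what allows multiple versions of the same product and “Unknown” rows to coexist.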

Saturday, April 5, 2014

Free eBook: Introducing Microsoft SQL Server 2014

Want to know about the major features added to SQL Server 2014? Here is an easy way. This book explains how SQL Server 2014 leverages in-memory technology to improve the performance of OLTP and data warehousing solutions, how easily on-premises solutions can be moved to the cloud with the added support for hybrid environments, etc. If you are a DBA, DBE or an engineer who wants to know the new features and capabilities, this is the book to read.

Download links:

Chapters in this book;

  • Chapter 1: SQL Server 2014 editions and engine enhancements
  • Chapter 2: In-Memory OLTP investments
  • Chapter 3: High-availability, hybrid-cloud, and backup enhancements
  • Chapter 4: Exploring self-service BI in Microsoft Excel 2013
  • Chapter 5: Introducing Power BI for Office 365
  • Chapter 6: Big data solutions

Wednesday, April 2, 2014

Easy way of adding your Microsoft Certification to LinkedIn Profile

If you are maintaining a profile with the world's largest professional network (I'm sure you are) and you have completed a Microsoft certification, there is an easier way of adding the certification to your LinkedIn profile than entering it manually. The new facility has been introduced by Microsoft teaming up with LinkedIn, and it is offered to you via an email, which you receive with all the details once an exam is done. All you have to do is hit the button, which comes populated with all the required details (since I have not done a certification recently, I have no idea exactly how this appears in the mail; I believe it is similar to the image below).

In order to receive the mail, make sure you have enabled “Promotional email” in your MCP profile settings and that you are using the same email address for both your MCP profile and your LinkedIn profile.

For more details, read this post published by Carrie Francey (MS):