Wednesday, September 30, 2015

SQL Server 2014 and 2012 new releases

The Microsoft SQL Server team has made a few releases related to SQL Server 2014 and 2012 in the last few months. You can download them and upgrade your instances if required;

If you want to understand Cumulative Updates, please read this;
What are hotfixes, Cumulative Updates and Service Packs? 


Here are all the update releases;

Version: 2014, Code Name: Hekaton, Initial Version: 12.0.2000.8
 CU1: 12.0.2342.0
 CU3: 12.0.2402.0
 CU4: 12.0.2430.0
 CU5: 12.0.2456.0
 CU6: 12.0.2480.0
 CU7: 12.0.2495.0
 CU8: 12.0.2546.0
 CU9: 12.0.2553.0
 SP1: 12.0.4100.1
 SP1 CU#1: 12.0.4416.0
 SP1 CU#2: 12.0.4422.0
Version: 2012, Code Name: Denali, Initial Version: 11.0.2100
 SP1: 11.0.3000
 SP1 CU#7: 11.0.3393
 SP1 CU#8: 11.0.3401
 SP1 CU#9: 11.0.3412
 SP1 CU#16: 11.0.3492.0
 SP2: 11.0.5058
 SP2 HotFix: 11.0.5522
 SP2 CU#1: 11.0.5532
 SP2 CU#6: 11.0.5592.0
 SP2 CU#7: 11.0.5623.0


SQL Server Management Studio - September 2015 Release

The latest Preview of Microsoft SQL Server Management Studio has been released and it can be downloaded from;


Here is the list of all updates for the last three months (quoted directly from the site);
  • SSMS August 2015 Preview
    • Object explorer updates to reduce loading delay when there are a large number of databases.
    • Improvements for installation of SSMS on Windows 10 machines.
    • Bug fixes and updates to SQL Server Configuration manager, and SSMS user reports user interface
  • SSMS July 2015 Preview
    • Database diagrams for Azure SQL Database (V12).
    • Improved IntelliSense support for new temporal table syntax.
    • Updated DacFx library to support latest Azure SQL databases features including row-level security and Azure Active directory authentication.
    • Bug fixes (updated 'check for update' UI, UI fix in 'compatibility level' list and more).
  • SSMS June 2015 Preview
    • New SSMS light-weight web installer.
    • Automatic check for updates.
    • SSMS now has full-text-search support for Azure SQL Database (V12).
    • Top customer requests addressed:
      • 'Edit Top 200 Rows' enabled for tables and views in Object Explorer.
      • Table designer enabled for Azure SQL Database (V12).
      • Database and table property dialogs enabled for Azure SQL Database (V12).
      • New option to skip prompt to save T-SQL files.
      • Import/Export wizard support for new Azure SQL Database service tiers (Basic, Standard, Premium).
      • Numerous bug fixes (scripting scenarios, enabling change tracking for SQL databases, and more).

Monday, September 28, 2015

SQL Server Brain Basher of the Week #031 - SQL Server Editions

Many have been working with Microsoft SQL Server for years and most have worked with either Enterprise or Standard edition. Here is a question on it;

What are the editions available with SQL Server 2014?

As I mentioned above, everyone is familiar with Enterprise and Standard, but the Business Intelligence edition is still unknown to many, and its implementations are not commonly seen either. Here are all available editions;
  • Enterprise - The premium offering; provides the highest level of reliability for demanding workloads.
  • Business Intelligence - Similar to Standard plus business intelligence offerings.
  • Standard - Delivers a complete, reliable data management platform without the enterprise features.
  • Express - Free; available for light-weight applications.
  • Compact - Free; can be used as an embedded database.
  • Web - Specifically for internet-facing workloads and organizations that work heavily with websites and services.
  • Developer - Similar to Enterprise edition, but for building, testing, and demonstrating functionality.
  • Azure SQL Database - Microsoft's cloud offering; a relational database-as-a-service.
  • Microsoft Analytics Platform System - An appliance combining Microsoft's massively parallel processing (MPP) data warehouse technology, the SQL Server Parallel Data Warehouse (PDW), together with HDInsight.
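If you want to check which edition and version an instance is running, the SERVERPROPERTY function can be used;

-- Checking the edition and version of the current instance
SELECT SERVERPROPERTY('Edition') AS Edition
 , SERVERPROPERTY('ProductVersion') AS ProductVersion;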
Read more on;
http://www.microsoft.com/en-us/server-cloud/products/sql-server-editions/
https://msdn.microsoft.com/en-us/library/azure/ee336279.aspx
http://www.microsoft.com/en-us/server-cloud/products/analytics-platform-system/



Saturday, September 26, 2015

Configuring HDInsight Hadoop Cluster and running a sample MapReduce Job

Apache Hadoop is an open source solution for distributed data processing and storage that consists of a cluster of servers holding data in a distributed file system named HDFS (Hadoop Distributed File System). This is the solution widely used for handling Big Data, and many software vendors offer it with their platforms. Microsoft offers it as HDInsight.

HDInsight is a cloud-based distribution of Hadoop that is fully compliant with the Hadoop open source project. It comes as a service in Microsoft Azure, and provisioning and decommissioning nodes of the cluster as required is not as difficult as with on-premises clusters.

This post explains how to create an HDInsight cluster with the new Azure portal and run a sample MapReduce job for testing, using the remote machine. If you need more details on Hadoop, HDFS, and the related project called Hive, read the following;


Let's start with the new Azure portal. Go to https://portal.azure.com and log in to your portal. Once logged in, open Storage Accounts (not Storage Accounts (classic)) and create a new storage account. An HDInsight cluster requires at least one storage account for HDFS; if required, more than one storage account can be attached to the cluster. The best practice with storage accounts for Hadoop is to create one and use it when the cluster is created. This allows us to maintain the storage, create the cluster when required, and delete the cluster when it is not required without deleting the storage, retaining the data files placed in it.



The next step is creating the cluster. Open HDInsight Clusters and create a new one using the storage account created. Note that the user name for accessing the cluster defaults to admin (which can be changed if you want), and enabling Remote Desktop allows accessing the cluster using Remote Desktop Connection.



You can use the Node Pricing Tiers section for setting up the nodes required for the cluster, deciding the tier for the worker nodes and the head node.



Generally, it takes about 10-15 minutes for setting up the cluster. Once it is done, navigate through it and open it using Remote Desktop Connection.



When the Remote Desktop session is open, HDFS commands can be used for navigating and manipulating folders and files, running MapReduce jobs, and executing commands for related projects such as Pig and Hive. Double-click the shortcut called Hadoop Command Line to open it.



Here is an example for seeing HDFS file system.
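For instance, folders can be listed with the file system shell (the /example/data folder is an assumption based on the sample data shipped with HDInsight);

hadoop fs -ls /
hadoop fs -ls /example/data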



Here is an example for executing a jar file.
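A sketch using the hadoop-mapreduce-examples jar that ships with the cluster (the jar location and the input file are assumptions; verify the actual paths on your cluster);

hadoop jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples.jar wordcount /example/data/gutenberg/davinci.txt /example/output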



Here is an example for seeing an output of the executed job.
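Assuming the job above wrote its output to /example/output, the result can be viewed with;

hadoop fs -cat /example/output/part-r-00000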



Refer to this page for the full list of Hadoop file system commands:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/FileSystemShell.html



Tuesday, September 22, 2015

Loading Power Query data to Power Pivot: Power Query Default Query Load Setting

This concern can be seen everywhere: why does my Power Query data sometimes appear in Power Pivot and sometimes not? This is not a bug or an issue; it is all about Default Query Load Settings. When data is loaded using Power Query, the placement of the loaded data is decided by Power Query using the Default Query Load Settings, which have two main options;
  • Use standard load settings
  • Specify custom default load settings
    • Load to worksheet
    • Load to Data Model

By default, Use standard load settings is selected, and its behavior changes based on the way you load data. It loads data to a worksheet if only a single table is selected. It loads data to the data model when multiple tables are selected or when the loaded data set does not fit in a worksheet (row limitation). If the setting is changed to Specify custom default load settings, either one of the options (Load to worksheet or Load to Data Model) can be set, or both can be selected.

This is how it works;

With the Use standard load settings selected, when a single table is selected from the data source;



and if it fits in the sheet, it will be loaded to the worksheet, not to the model.



When a table is selected with Select multiple items checked;



data is loaded to the model.



The same behavior applies to other data sources too;



If Specify custom default load settings is set, and both Load to worksheet and Load to Data Model are checked;
 


data will be loaded to both worksheet and model.


In addition to that, placement can be decided with the Load button in the Power Query loading window too;


Note that this new configuration was introduced with the latest update of Power Query. It can be downloaded from: http://www.microsoft.com/en-us/download/details.aspx?id=39379&WT.mc_id=blog_PBI_General_DI

And you can read more on this update at: http://blogs.msdn.com/b/powerbi/archive/2014/05/06/new-updates-for-power-query.aspx

SQLServerAgent Error: The target server cannot establish an encrypted connection to the master server - MsxEncryptChannelOptions

SQL Server supports multi-server administration, which allows us to distribute jobs across our enterprise, basically across our SQL Servers. Configuration of multi-server administration is done through SQL Server Agent; one server needs to be configured as the master and the other servers have to be configured as targets. When setting it up, you might get the following error at the end of the wizard;

The enlist operation failed (reason: SQLServerAgent Error: The target server cannot establish an encrypted connection to the master server. Make sure that the MsxEncryptChannelOptions registry subkey is set correctly on the target server.) (Microsoft SQL Server, Error: 22026)


What is the reason for this?

The reason is that SQL Server Agent tries to establish the connection between the master and targets over a secure channel with full SSL encryption, because that is what your systems request by default.

If SSL encryption is not enabled between the servers, then this setting has to be changed on the target servers using the registry. The key MsxEncryptChannelOptions (REG_DWORD) is available under \HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\<instance name>\SQLServerAgent for changing the default setting, which is 2.


It accepts three values:
  1. 0 - disables encryption between the master and target.
  2. 1 - enables encryption between the master and target but no certification validation is required.
  3. 2 - enables full SSL encryption between the master and target.
If you do not need an encryption channel between the master and target, then set the value as 0 and configure multi-server administration.
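If you prefer making this change with T-SQL instead of regedit, the undocumented xp_instance_regwrite procedure can be used on each target server. This is only a sketch; since the procedure is undocumented, test it before relying on it;

-- Setting MsxEncryptChannelOptions to 0 (no encryption) on the target
-- (xp_instance_regwrite resolves the instance-specific part of the key path)
EXEC master.dbo.xp_instance_regwrite
 N'HKEY_LOCAL_MACHINE'
 , N'SOFTWARE\Microsoft\MSSQLServer\SQLServerAgent'
 , N'MsxEncryptChannelOptions'
 , N'REG_DWORD'
 , 0;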

Monday, September 21, 2015

SQL Server Brain Basher of the Week #030 - TSQL

Here is the question of the week; it is based on SELECT statements and the join mechanisms that can be used.

Assume that you have two tables like below with data inserted;

USE tempdb;
GO

CREATE TABLE dbo.Customer
(
 CustomerId int identity(1,1) PRIMARY KEY
 , LastName varchar(50) NOT NULL
 , Country varchar(50) NULL
)
GO

CREATE TABLE dbo.Employee
(
 EmployeeId int identity(1,1) PRIMARY KEY
 , LastName varchar(50) NOT NULL
 , Country varchar(50) NULL
)
GO

INSERT INTO dbo.Customer
 (LastName, Country)
VALUES
 ('Knight', 'United Kingdom')
 , ('Jack', 'United States')
 , ('Brown', NULL)
 , ('Hamilton', 'Italy');

INSERT INTO dbo.Employee
 (LastName, Country)
VALUES
 ('Khanna', 'India')
 , ('Brown', NULL)
 , ('Jack', 'United States')
 , ('Philips', 'Italy');


And for getting records that exist in both tables, the following queries are written;

SELECT c.LastName, c.Country
FROM dbo.Customer c
 INNER JOIN dbo.Employee e
  ON c.LastName = e.LastName
   AND c.Country = e.Country;

SELECT c.LastName, c.Country
FROM dbo.Customer c
WHERE EXISTS (SELECT * FROM dbo.Employee
   WHERE LastName = c.LastName
    AND Country = c.Country);

SELECT LastName, Country
FROM dbo.Customer 
INTERSECT
SELECT LastName, Country
FROM dbo.Employee ;


What would be the output of each query?

  • The statement with INNER JOIN returns only one record, belonging to Jack; the other two queries return records related to both Jack and Brown.
  • The statements with INNER JOIN and EXISTS return one record related to Jack, and the statement with INTERSECT returns records related to both Jack and Brown.
  • All statements return records related to both Jack and Brown.
  • All statements return only the record related to Jack.
For answering this, you need to understand how NULL values are compared by JOINs, predicates, and set operators. Generally, set operators consider two NULL values as equal, but comparing them with the other methods yields unknown. Considering this fact, the correct answer is the second one: INNER JOIN and EXISTS return only one record, but INTERSECT returns two records, including the record that has NULL for Country.

Sunday, September 20, 2015

What are Spreadmarts in Business Intelligence solutions?

Have you heard about a term called Spreadmart? If you have not, then it is something you should know because your business users might be using spreadmarts and you need to consider them when creating or enhancing your business intelligence solution.

Image was taken from: http://www.startupdonut.co.uk/blog/2012/08/dangers-using-spreadsheets-accounting

What is a spreadmart? Is it another type of data mart, or is it a data warehouse? In a way, it can be considered a data mart because it is something created specifically for one business process or department by an individual. It is a spreadsheet, for example, an Excel workbook maintained by a business user for performing their own analysis with their unique dataset. Business users use spreadmarts for combining data from multiple sources, including existing reports, and produce results for their analysis.

Although this gives more flexibility to business users, it introduces some issues with collaboration. Therefore, it is always better to maintain IT-driven data marts and ask users to use them for creating their data models, either as spreadmarts or using a modern way like Power BI.

Saturday, September 19, 2015

How to write SELECT for Analysis Services Tabular using DAX

DAX, Data Analysis Expressions, is a language that can be used with Microsoft Analysis Services tabular data models. It was initially introduced with the Power Pivot Excel add-in and later added to Analysis Services tabular models. The DAX language holds a collection of functions, operators, and constants that can be used for creating calculated columns and measures, extending tabular models.

Since we always write SELECT queries against the database engine and Analysis Services multi-dimensional models, the requirement for writing the same is not uncommon with the tabular model either. How can we write a SELECT-like query with DAX?

Retrieving data from tabular model tables can be done with the EVALUATE keyword, which is functionally equivalent to the SELECT statement. Although it is not as rich as the TSQL SELECT, it can be used for getting data for general reports.

Here are some examples;

-- Getting data from Internet Sales
EVALUATE 'Internet Sales';

-- With filtering
EVALUATE FILTER('Internet Sales', 'Internet Sales'[Product Standard Cost] > 1000);

-- With ordering
EVALUATE FILTER('Internet Sales', 'Internet Sales'[Product Standard Cost] > 1000)
ORDER BY 'Internet Sales'[Order Date] DESC;
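Grouping and aggregating, similar to a TSQL GROUP BY, can be done with the SUMMARIZE function. Here is a small sketch; the 'Product'[Product Name] and 'Internet Sales'[Sales Amount] columns are assumptions based on the Adventure Works tabular sample, so adjust the names to your model;

-- Grouping Internet Sales by product, similar to GROUP BY
EVALUATE
SUMMARIZE(
 'Internet Sales'
 , 'Product'[Product Name]
 , "Total Sales", SUM('Internet Sales'[Sales Amount])
);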


Friday, September 18, 2015

Disable Report Builder from Reporting Services Report Manager

Report Builder is a report authoring tool specifically designed for business users who are familiar with the ribbon-based interface of Microsoft Office. It is the primary tool that addresses self-service reporting scenarios, and a user-friendly one.

Report Builder has two versions, functionally equal, but the way we open them for business users is different. One version comes as a ClickOnce application that can be opened using Report Manager, and the other comes as a stand-alone application that can be downloaded from the Microsoft Download Center. When you open Report Manager, the button shown can be clicked for downloading Report Builder to the client machine.


Now, what if you want to disable this facility and make sure users cannot download Report Builder from Report Manager? What you have to do is connect to Reporting Services using Management Studio and open Advanced Properties. It has a property called EnableReportDesignClientDownload which can be used for enabling and disabling it.


Once it is set to false, the Report Builder button will no longer appear.


Thursday, September 17, 2015

Power BI Desktop with a Live Connection - Creating Analytical Reports without saving data in Power BI

Many use or plan Microsoft Power BI for reporting and dashboard solutions with the assumption that Power BI always loads the selected data to form an analytical model and saves it with the file, just like using Power Pivot and Power View with Excel. This leads to a question: if a business user wants to create a report for analysis, can she/he directly connect with the source and create it without loading data into the client application?

With Power BI Desktop, it is possible, but only with an Analysis Services Tabular Mode database as the source. Users can create their own reports connecting with an Analysis Services Tabular instance and save the report, just like creating a report using Reporting Services Report Builder.

Here are the steps for connecting and creating a report;

Open Power BI Desktop and Connect with Analysis Services Tabular Mode instance. 



And make sure that the Explore the Tabular model by using a live connection radio button is selected.


As you see with the below screen, you cannot select entities and attributes, only the model.


Once connected, all elements and attributes are shown just like with a usual connection, allowing you to create reports as you want. Once done, you can save it, and you will see that the saved file is very small compared to general Power BI files.



Generally, business users do not upload created reports if there is no requirement for sharing. But if it is required, can we upload the created report to the Power BI service just like other reports?

If you try to publish this report, you will get the following issue;

An error occurred while attempting to publish Test.pbix.


This can be handled by installing and configuring the Analysis Services Connector. Once it is installed, it can be used for connecting and creating reports, and published reports will connect to your source via it.

For more info on that and how to install, read this: Configure a Power BI Analysis Services Connector

For more info on How to create reports using Power BI Desktop, read this: Getting started with Power BI Desktop

For more info on How to create reporting using Analysis Services Tabular data, read this: Analysis Services Tabular data in Power BI Desktop

Monday, September 14, 2015

SQL Server Brain Basher of the Week #029 - SSAS

Analysis Services can be installed in one of three modes: Multi-dimensional, Power Pivot for SharePoint, and Tabular. One instance can run in only one mode, and the mode has to be set during the installation.

Once installed with a mode selected, can we change or switch the mode?

There is no defined method for doing it with a specific menu, but there is a way, though it is not a solution provided by Microsoft. Microsoft does not recommend changing the mode after the installation, but it can be done by changing the msmdsrv.ini file. It works best when no objects (databases) have been added; in some situations, removing added objects (databases) and then changing the mode has worked fine, but especially with an instance set for Power Pivot it may produce unexpected results.

The msmdsrv.ini file is located in the C:\Program Files\Microsoft SQL Server\MSAS12.SQL2014\OLAP\Config path (the instance folder name depends on your instance), and the property called DeploymentMode can be changed to 0 (Multi-dimensional), 1 (Power Pivot for SharePoint), or 2 (Tabular).
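Inside the file, which is XML, the relevant element looks like the fragment below (surrounding elements omitted);

<ConfigurationSettings>
  ...
  <DeploymentMode>2</DeploymentMode>
  ...
</ConfigurationSettings>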


Note that, though this works in many cases, it is NOT recommended.

And if you want to know the mode you have installed, DeploymentMode can be used for determining it, but the easiest way is by looking at the icon of the instance. These show Multi-dimensional, Power Pivot for SharePoint, and Tabular, in order.



Sunday, September 13, 2015

Do you know that Power BI is available now and you can use it FREE?

Microsoft Power BI, bringing your data to life, yes it is available and can be used FREE :).


Power BI is a cloud-based business analytics service that helps you convert your data into meaningful information with rich, interactive, and well-formed visuals, aligning with the concept of any data, anywhere, any time. It offers Power BI Mobile, which can be used for viewing dashboards and reports and analyzing as you are used to, making sure your most important data travels with you. And it offers Power BI Desktop, which can be used for creating stunning reports and interactive visualizations.

It is FREE, but don't forget that it has a paid version too. Power BI Pro is the paid version, and it is USD 9.99 per user/month. There are not many differences; the key ones are;
  • Data capacity limit is 1 GB/user for FREE Power BI and 10 GB/user for Power BI Pro.
  • Content refresh is Daily for FREE Power BI and Hourly for Power BI Pro.
  • Streaming data in dashboards is limited to 10K rows/hour for FREE Power BI and 1M rows/hour for Power BI Pro.
  • And features related to collaboration are limited to Power BI Pro.

Saturday, September 12, 2015

The permissions granted to user are insufficient for performing this operation - Cannot deploy reports from SQL Server Data Tools to Reporting Services

Are you experiencing this error when trying to deploy reports from SQL Server Data Tools to Reporting Services site?



As it says, it clearly indicates that you have no permission to deploy the created report. Of course, you need to check the permissions granted to you, but before that, try opening SQL Server Data Tools with administrator privileges and see;


After that, you should be able to deploy reports.

Friday, September 11, 2015

How to enable SQL Server Audit for Windows Security Log

I have written a few posts on SQL Server Auditing, and they were written based on either Windows Event Log or a file as the target of the Audit;


There is another option that can be set for the target: Windows Security Event Log, which is the most secure place for placing audit info. But it cannot be set as simply as the other targets; it needs some extra steps.

If I have created an Audit like below;
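For reference, the equivalent T-SQL for such an Audit would be something like below (the Audit name is illustrative);

USE master;
GO

-- Creating an Audit that writes to the Windows Security log
CREATE SERVER AUDIT Audit_ToSecurityLog
TO SECURITY_LOG;
GO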


and try to enable;
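which in T-SQL would be;

-- Enabling the Audit
ALTER SERVER AUDIT Audit_ToSecurityLog WITH (STATE = ON);
GO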


I get this error. The reason for this is that, for setting the target to Windows Security Log, the SQL Server service account has to be granted the Generate security audits policy. For that, open Local Security Policy and add the SQL Server service account to the Generate security audits policy (under Local Policies, User Rights Assignment).


You may need to restart SQL Server before trying to enable the Audit created. Once done, you will be able to successfully enable the Audit.


Thursday, September 10, 2015

SQL Server Audit for application event - User-Defined Audit Actions

SQL Server Audit is the main tool for capturing events related to the server and databases, and it can record event information to files, the Windows Application Event Log, and the Windows Security Event Log. All editions of SQL Server support server-level auditing, but only Enterprise, Developer, and Evaluation editions support database-level auditing.

When defining auditing, you need to create an Audit (1) and then either a Server Audit Specification or a Database Audit Specification (2), specifying what Actions/Action Groups need to be included. Actions/Action Groups are pre-defined, comprehensive events that cover almost all events related to the server and databases. Once configured, events are recorded to the given target. What if we want to raise our own events and get them captured with the same Audit?

A good example is tracking changes in a Product table, specifically on the price column. If an Audit is already defined for other events and the same target needs to be used for recording price changes too, all we have to do is include the USER_DEFINED_AUDIT_GROUP Action Group in either the Server or Database specification and use the sp_audit_write stored procedure for recording our custom event. Here are the steps for doing it.

Let's create an Audit first. If you have already created an Audit, you do not need to create another. Go to the Security node in Object Explorer, then the Audits node, and create a new Audit.



Next, let's create a Server Audit Specification. Note that I have included two Action Groups: FAILED_LOGIN_GROUP and USER_DEFINED_AUDIT_GROUP. The second one is for custom events.



Capturing ListPrice changes can be easily implemented with a trigger, and the sp_audit_write procedure can be called within it. Here is sample code for implementing it.

USE AdventureWorks2014;
GO

CREATE TRIGGER Production.CheckPriceChanged ON Production.Product
AFTER UPDATE
AS
BEGIN
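 -- Note: for simplicity, this sketch assumes single-row updates;
 -- a multi-row UPDATE would need set-based handling of inserted/deleted.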

 DECLARE @Message nvarchar(4000)
  , @OldPrice money
  , @NewPrice money
  , @ProductNumber nvarchar(25)

 IF UPDATE(ListPrice)
 BEGIN
 
  SELECT @OldPrice = ListPrice FROM deleted;
  SELECT @NewPrice = ListPrice, @ProductNumber = ProductNumber FROM inserted;

  SET @Message = N'Price changed: Product Number: ' +  @ProductNumber + N' Old price: ' + CONVERT(nvarchar(20), @OldPrice) + N' New price: ' 
      + CONVERT(nvarchar(20), @NewPrice) + N' User: ' +  CONVERT(nvarchar(20), SYSTEM_USER)

  -- Calling auditing stored procedure
  EXEC sp_audit_write 1, 1, @Message;
 END 
END;
GO

Let's make some changes and see how they are recorded.

-- Updating records
UPDATE Production.Product
 SET ListPrice = 100
WHERE ProductID = 1;

UPDATE Production.Product
 SET ListPrice = 0
WHERE ProductID = 1;

-- Checking the audit output with fn_get_audit_file
SELECT event_time, object_name, user_defined_event_id, statement, user_defined_information
FROM sys.fn_get_audit_file ('c:\Audit\*',default,default);


Wednesday, September 9, 2015

Reporting Services Optional Multi-Valued Parameters with Analysis Services as the source

Creating multi-valued parameters for reports is a common requirement, and it can be easily done with Reporting Services by enabling the Allow multiple values property of the parameter. However, if you need to make it optional, allowing the user to continue with the report without selecting any item in the parameter, you will face an issue, because Reporting Services does not allow the user to continue without selecting an item when the parameter is multi-valued. Not only that, you cannot enable the Allow null value property either.

If the source for the parameter is a SQL Server database, the workaround is adding a NULL value to the dataset manually (or as part of the query) and using it for handling the optional requirement. If you google, you can find this workaround implemented in many forums and sites. But if the source for the parameter is Analysis Services, this workaround does not work as it is; it needs some extra work.

Here is the way I implemented it; it may not be the perfect method, but it worked fine for me.

Assume that I need to load Products for a parameter and make it a multi-valued parameter; my MDX for that will be something like below;

WITH 
MEMBER [Measures].[Product Name]
 AS [Product].[Product].CurrentMember.Name
MEMBER [Measures].[Product Unique Name]
 AS [Product].[Product].CurrentMember.UniqueName
SELECT
 { [Measures].[Product Name], [Measures].[Product Unique Name] } ON 0
 , {[Product].[Product].[Product].Members } ON 1
FROM [Adventure Works];


However, I cannot make the parameter optional because of this limitation; hence I need a member that represents NULL. For that, I introduce a query-scoped calculated member and add it to the resultset;

WITH 
MEMBER [Measures].[Product Name]
 AS [Product].[Product].CurrentMember.Name
MEMBER [Measures].[Product Unique Name]
 AS [Product].[Product].CurrentMember.UniqueName
MEMBER [Product].[Product].[All Products].[(NULL)]
 AS Aggregate([Product].[Product].&[(NULL)])
SELECT
 { [Measures].[Product Name], [Measures].[Product Unique Name] } ON 0
 , {[Product].[Product].[Product].AllMembers } ON 1
FROM [Adventure Works];


Now, just like with parameters loaded from SQL Server databases, I have a (NULL) item in the parameter for handling the situation. If you get an issue with sorting the result that the MDX produces, an alternative is executing the MDX through a TSQL stored procedure that connects to Analysis Services via a Linked Server object.

CREATE PROC dbo.GetProductsFromSSAS
AS
BEGIN

 DECLARE @Query nvarchar(max)
 
 SET @Query = N'
  SELECT 
   "[Measures].[Product Name]"
   , "[Measures].[Product Unique Name]"
  FROM 
   OpenQuery(OLAPDatabase, ''
     WITH 
     MEMBER [Measures].[Product Name]
      AS [Product].[Product].CurrentMember.Name
     MEMBER [Measures].[Product Unique Name]
      AS [Product].[Product].CurrentMember.UniqueName
     MEMBER [Product].[Product].[All Products].[(NULL)]
      AS Aggregate([Product].[Product].&[(NULL)])
     SELECT
      { [Measures].[Product Name], [Measures].[Product Unique Name] } ON 0
      , {[Product].[Product].[Product].AllMembers } ON 1
     FROM [Adventure Works];
    '')
  ORDER BY 1;'
 
 EXEC sp_executesql @Query;
 
END
GO
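The procedure can then be tested and used as the dataset query for the report parameter;

-- Testing the procedure (OLAPDatabase is the Linked Server created for the SSAS instance)
EXEC dbo.GetProductsFromSSAS;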




Tuesday, September 8, 2015

Is your MDX query slow? Here are four guidelines for improving performance



The key reason for implementing OLAP databases is performance related to data retrieval, so what if the written query does not offer the expected performance? In a way, troubleshooting an MDX query is not as easy as troubleshooting TSQL if you are not familiar with the concepts related to MDX statements. However, you can keep the following guidelines in mind and do the needful if you experience slow performance with a written MDX statement.

The guidelines are based on the time spent in the Query Processor and the Storage Engine. The duration can be determined using the following events with Profiler;

  • Query Subcube event - Duration column - for time spent on extracting data from the storage.
  • Serialize Results End event - Duration column - for time spent on manipulating data after it is taken from the storage.
Here are the four guidelines;
  1. If you see more time on the Serialize Results End event (processing) than on the Query Subcube event (storage), you need to optimize the MDX statement, reducing added calculations and complexity.
  2. If you see more time on the Query Subcube event (storage) than on the Serialize Results End event (processing), it indicates that partitioning, user-defined hierarchies, and attribute relationships among the used attributes need to be defined.
  3. If you see more time on the Query Subcube event (storage) than on the Serialize Results End event (processing) and data is rarely retrieved from aggregations, aggregations need to be added based on usage.
  4. If you see more time on the Query Subcube event (storage) than on the Serialize Results End event (processing) and data is retrieved from the cache, you need to check the resource usage of the server and, if required, scale up by adding memory, CPU, etc.
Watch for the next post on how to use Profiler for capturing these events for troubleshooting.

Monday, September 7, 2015

SQL Server Brain Basher of the Week #028 - Authentication

Microsoft SQL Server supports two types of authentication modes: Windows authentication and Mixed authentication. Windows authentication is always recommended unless some applications require SQL Server logins for connecting with the SQL Server instance. Here is the Brain Basher of the week based on it.

If Windows authentication was selected during the installation, and the authentication mode is later changed to Mixed authentication, what step(s) do you have to take for using the sa login?

  1. No additional steps required. sa login can be used immediately with a blank password.
  2. Enable sa login either using ALTER LOGIN statement or Login Properties GUI for sa.
  3. Reset the system assigned password of sa either using ALTER LOGIN statement or Login Properties GUI for sa.
  4. Assign sa login to all databases either using CREATE USER statement or Login Properties GUI for sa.


If Windows authentication is selected during the installation, the sa login is still created with a system-generated password and kept as a disabled SQL login. If you need to use the sa login after enabling Mixed authentication, you need to perform steps (2) and (3).
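In T-SQL, steps (2) and (3) look like below (the password shown is just a placeholder);

-- Enabling the sa login
ALTER LOGIN sa ENABLE;
GO

-- Resetting the password of sa
ALTER LOGIN sa WITH PASSWORD = 'Pa$$w0rd';
GO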


Saturday, September 5, 2015

Analysis Services: Cannot see applied color for calculated measures

The Color Expressions section of the Calculations tab of the Cube Designer allows us to set the fore color and back color for Calculated Measures/Members based on an expression. Once it is set properly, we should be able to see colored values for the measure created, based on the expression evaluation. What if you do not see it with your MDX statement?

For example, see the below Calculated Measure called Profit, which has a Color Expression set.


And when you browse it with Browser tab, you do not see the color;


And when you get it using MDX, it still does not show the color;


But when it is opened with Microsoft Excel, it CAN BE SEEN.


What is the reason? Since it can be seen with Excel, the expression set is correct and it should work.

The reason is that the MDX statement needs CELL PROPERTIES added in order to see colors with Management Studio; not only with Management Studio, but with other applications too. Here is the way;
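A sketch of such a statement (the Profit measure is the one created above; the other names assume the Adventure Works cube);

SELECT
 { [Measures].[Internet Sales Amount], [Measures].[Profit] } ON 0
 , { [Product].[Category].Members } ON 1
FROM [Adventure Works]
CELL PROPERTIES VALUE, FORMATTED_VALUE, FORE_COLOR, BACK_COLOR;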


Thursday, September 3, 2015

Granting permission on tables - individual tables or schema?

When there is a requirement for granting permission on tables to database users, the common or known way of doing it is granting explicit permission on individual objects to users. But if database objects are schema-based objects and properly organized, then the most appropriate way of managing permissions is granting permission on the schema rather than on the tables (or any other objects). The main reason for this is that any permission granted on a schema is implicitly applied to all objects defined under the schema.

For example, if there are two tables called Product and ProductCategory created under the Production schema, granting SELECT permission on the Production schema is enough to grant SELECT permission on both the Product and ProductCategory tables.

Here is a test code;

/*
Creating a login and user called Jack
*/

USE master;
GO

CREATE LOGIN Jack WITH PASSWORD = 'Pa$$w0rd'
 , CHECK_POLICY = OFF;
GO

USE AdventureWorks2014;
GO

CREATE USER Jack FOR LOGIN Jack;
GO

-- Check and see whether Jack has permission on HumanResources schema
-- Login as Jack and see permissions
EXECUTE AS USER = 'Jack';
GO

SELECT * FROM HumanResources.Department; -- This fails as no permission has been granted yet
GO

REVERT;
GO

-- Granting permission to the schema
GRANT SELECT ON SCHEMA::HumanResources TO Jack;
GO

-- Denying permission on one table for testing
DENY SELECT ON HumanResources.Employee TO Jack;
GO

-- Login as Jack and see permissions
EXECUTE AS USER = 'Jack';
GO

-- Jack can execute this
SELECT * FROM HumanResources.Department;
GO

-- And this too
SELECT * FROM HumanResources.JobCandidate;
GO

-- But not this, as we have explicitly denied permission
SELECT * FROM HumanResources.Employee;
GO

REVERT;
GO

-- Cleaning up
DROP USER Jack;
GO
DROP LOGIN Jack;
GO


Wednesday, September 2, 2015

Finding server level permission (principals) of a login

We create SQL Server login principals to allow users to connect to an instance of SQL Server. Standard logins are the most common, but some logins have special permissions, making them members of server roles. How can we quickly find the assigned principals, or server-level permissions, whether granted or denied, for a login?

The simplest way is using sys.login_token, which shows one row for each server principal that is part of the login's token.


Here is a sample code that shows the usage of it;


-- Create a SQL Server login
CREATE LOGIN James WITH PASSWORD = 'Pa$$w0rd';
GO

-- Adding James to serveradmin fixed server role
ALTER SERVER ROLE serveradmin ADD MEMBER James;
GO

-- Checking the principals assigned to James
EXECUTE AS LOGIN = 'James';
GO

SELECT * FROM sys.login_token;

REVERT;
GO

-- Cleaning up
DROP LOGIN James;
GO



As you see, the login token of James includes the SQL login itself, the public server role, and the serveradmin server role.