Wednesday, November 24, 2010

Analysis Services Security: Dimension Data – Select all members or Deselect all members?

While configuring Analysis Services security on Dimension Data, I had not paid much attention to two options available on the screen: Select all members and Deselect all members. Their importance came up when my colleague Asanka Padmakumara pointed out a subtlety with them. Which one should we select? See the screen below;

I have selected the first option, Select all members, and granted permission on Accessories and Bikes. I could have done the same with the other option too. Let’s move into the Advanced section and see how it has been set;

Note that the other two categories (Clothing and Components) have been added under the Denied member set. What have we set? Yes, we granted permission on Accessories and Bikes, but what we have done indirectly is deny permission on Clothing and Components. The main thing we need to consider is permission on future categories. Since only the items we did not select have gone to the Denied member set, all future items will be available to this role. If you need to let this role access future categories without explicitly setting them, use Select all members when setting permissions.

Now let’s use the other option; Deselect all members;

I have given the same permission set, but the option selected is Deselect all members. Have a look at the Advanced section now;

Now you can see that the items we selected have been added to the Allowed member set. This means all other items we did not select are denied to this role. What will happen to future categories? This is the difference. Since we have explicitly added Accessories and Bikes to the Allowed member set, future items will NOT be available under the Allowed member set. If you need to disallow this role from accessing future categories without explicitly setting them, use Deselect all members when setting permissions.
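As a rough sketch of what the Advanced tab ends up holding in each case, written as MDX member sets (the Adventure Works category member keys shown here are assumptions for illustration):

```mdx
-- Select all members: the unselected categories go to the Denied member set;
-- anything not listed, including future categories, remains allowed
{[Product].[Category].&[2], [Product].[Category].&[3]}  -- Components, Clothing

-- Deselect all members: the selected categories go to the Allowed member set;
-- anything not listed, including future categories, is denied
{[Product].[Category].&[1], [Product].[Category].&[4]}  -- Bikes, Accessories
```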

Monday, November 22, 2010

Analysis Services Security: DENY does not override GRANT

Is the famous rule of thumb broken? I am sure you are all aware of the rule: DENY always wins over GRANT. If you want to see an exception (or a particular situation), read one of my previous posts, “Column GRANT overrides DENY TABLE, Avoiding with Common Criteria Compliance”.

This post discusses another situation; it is not with the SQL Server relational engine but with SQL Server Analysis Services.

Analysis Services behaves a bit differently. As you know, if an object of a SQL Server database is denied explicitly, it is always denied, even with an explicit grant. For example, if you are in a role that grants SELECT on Table1 and in another role that denies SELECT on Table1, you will not be able to execute SELECT on Table1. Analysis Services authorizes items differently: it unions all grants and then authorizes. If you are in a role that grants Item1 and Item2 (note that in Analysis Services, once items are granted, all other items are denied; there is no need to deny them explicitly as with SQL Server database engine objects) and in another role that grants Item2 and Item3, you are granted Item1, Item2, and Item3.

Let me show you a sample of this. The image below shows a role in the Analysis Services Adventure Works DW 2008R2 database. This is how it is created;

  • Name: Business Users
  • Membership: Dinesh
  • Cubes: Adventure Works – Read
  • Dimension Data:
    • Product Dimension, Category – granted for Accessories and Bikes
    • Other Dimensions – granted for all


The second image shows another role in the same database. Its properties are as follows;

  • Name: Coordinators
  • Membership: Dinesh
  • Cubes: Adventure Works – Read
  • Dimension Data:
    • Product Dimension, Category – granted for Bikes and Clothing
    • Other Dimensions – granted for all


The Business Users role denies Clothing and Components. The Coordinators role denies Accessories and Components. When Dinesh sees the data, he sees the union of both: Accessories, Bikes, and Clothing. The reason is that Analysis Services takes all GRANTs from all roles of the connected user and then authorizes. That is why he sees Accessories and Clothing even though each is denied by one of the roles. This is how it looks when Categories are loaded with Excel.
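The effective permission Dinesh gets can be sketched as the union of the allowed sets of his roles (category names abbreviated):

```
Business Users allowed : {Accessories, Bikes}
Coordinators allowed   : {Bikes, Clothing}
Effective (union)      : {Accessories, Bikes, Clothing}
```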

Note: This type of security can be tested with Management Studio too. Start the Cube Browser and click the Change User icon. The window that opens allows you to set the context as Current User, Other User, or Roles.


Tuesday, November 16, 2010

Way to Visualize: Project Crescent; What is it?

The era in which the business user had to carry an armload of tools for analyzing and visualizing data has come to an end! Yes, Microsoft never stops making the business user’s life easier. Microsoft never stops supporting the business user in managing and visualizing data in a rich way. Microsoft has come out with an exciting facility: Project Crescent.

What is Project Crescent?

Project Crescent is the Microsoft Business Intelligence team’s latest innovation, which helps manage and visualize large amounts of data quickly and in a very rich format. It has been built entirely on Silverlight and will be available as a BI client. It allows showing business insights through reports with highly interactive visualizations and animations.

The release date is unknown. Most probably it will be released with Denali, at the end of 2011 or the beginning of 2012. A long time to wait…

Here is a sneak peek of Project Crescent, in Amir Netz’s demo at PASS Summit 2010:

Here is a post from SQL Server Reporting Services team blog:

Friday, November 12, 2010

Reporting Services: Difference between RDL and RDLC files

Microsoft Reporting Services is an enterprise-capable reporting solution that can be used to create all types of reports, such as production, analytical, and dashboard reports. Reports are built through Business Intelligence Development Studio (BIDS), which is an extension of Visual Studio, and are created as RDL (Report Definition Language) files. Once the reports are created with BIDS, they (the RDLs) are hosted in Reporting Services, and users can access them in many ways.

What are RDLC files?
RDLC (Report Definition Language, Client-side) files are generated either with the ReportViewer control that comes with the Visual Studio 2008 Windows or ASP.NET project templates, or with the Reporting –> Report Application project template. An RDLC is an XML file, just like an RDL. It uses the same XML schema as RDL files, but an RDLC does not require values for all elements, such as the query text. If an RDLC file needs to be hosted in Reporting Services, it must be renamed to RDL and all empty elements must be filled in manually.

Reporting with ReportViewer Control
The ReportViewer control allows us to embed a report in an ASP.NET web page or a Windows form. It supports two modes: Remote Processing mode and Local Processing mode. Remote Processing mode connects to reports that have already been deployed to an SSRS instance; in this case, all processing takes place on the SSRS server. Local Processing mode allows creating reports within Visual Studio itself; these report files have the RDLC extension. This mode does not require an instance of Reporting Services, hence processing is handled by the client application.

Here are some important points to remember about these two modes:

  • Local Processing mode does not require an instance of Reporting Services, hence a license for Reporting Services is not required.
  • Local Processing mode is not recommended for large reports; it is intended for small, OLTP-type reports that run infrequently.
  • Remote Processing mode can be scaled out, but Local Processing mode cannot.
  • Local Processing mode supports the following export formats only:
    • Visual Studio 2008: PDF and Excel
    • Visual Studio 2010: PDF, Excel, and Word
  • Local Processing mode with Visual Studio 2008 does not support the Tablix data region; Visual Studio 2010 does.
  • Mapping report parameters to query parameters has to be done manually in Local Processing mode.
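The points above can be seen in a minimal Local Processing sketch for a WinForms ReportViewer. This is an illustration only: the report file name "Report1.rdlc", the dataset name "DataSet1", the parameter "Country", and the `customersTable` DataTable are assumptions, not part of the posts above.

```csharp
// Assumes a Form with a ReportViewer control named reportViewer1 and a
// populated System.Data.DataTable named customersTable.
using Microsoft.Reporting.WinForms;

reportViewer1.ProcessingMode = ProcessingMode.Local;
reportViewer1.LocalReport.ReportPath = "Report1.rdlc";  // hypothetical RDLC

// In local mode the RDLC holds no live query; the host app supplies the data.
reportViewer1.LocalReport.DataSources.Clear();
reportViewer1.LocalReport.DataSources.Add(
    new ReportDataSource("DataSet1", customersTable));

// Parameter mapping is done manually in Local Processing mode.
reportViewer1.LocalReport.SetParameters(
    new ReportParameter[] { new ReportParameter("Country", "Sri Lanka") });

reportViewer1.RefreshReport();
```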

Thursday, November 11, 2010

Future version of SQL Server: “Denali”, CTP available for downloading

Microsoft has announced that a Community Technology Preview of SQL Server code-named “Denali” is available for download. This CTP mainly contains enhancements to SQL Server high availability and the engine, not to BI components such as Analysis Services and Reporting Services.

To see details:

To download:

To see the enhancements:

Thursday, November 4, 2010

Ready-made Virtual Machine for SharePoint 2010

Are you looking for a pre-built virtual machine that includes SharePoint 2010? If yes, here it is… it is available for download.

It contains not only SharePoint 2010 but also:

  • Microsoft Office Communication Server 2007 R2
  • Microsoft Visual Studio 2010
  • Microsoft SharePoint Server 2010 Enterprise Edition
  • Microsoft Office Web Applications
  • Microsoft FAST Search for SharePoint 2010
  • Microsoft Project Server 2010
  • Microsoft Office Professional Plus 2010
  • Microsoft Visio 2010
  • Microsoft Project 2010
  • Microsoft Office Communicator 2007 R2

For more details and downloading: 2010 Information Worker Demonstration and Evaluation Virtual Machine.

Tuesday, November 2, 2010

SharePoint Videos: SharePoint 2010 Developer Training Course

Looking for resources to learn SharePoint 2010 development? Overwhelmed by the many books, articles, and blog posts? Here is an easy way to learn it;

Site: SharePoint 2010 Developer Training Course
Download link: SharePoint 2010 Developer Training Kit

This training kit provides videos and hands-on labs that are very useful for understanding how SharePoint 2010 works and how it can be used.

We must thank Chris Mayo, Donovan Follette, Girish Raja, Paul Stubbs, and Steve Fox for preparing something like this for developers.

Monday, November 1, 2010

SSIS: Replacing Slowly Changing Dimension (SCD) wizard with the MERGE statement

Slowly Changing Dimensions (SCDs) are business entities in DW/BI systems whose attributes do not change very often. To handle loading of SCDs, we use the SQL Server Integration Services Data Flow component Slowly Changing Dimension, a wizard that helps us handle changing attributes.

If you are an SSIS developer and have used the SCD wizard, you have already seen its poor performance. It works fine with a dataset of fewer than roughly 5,000 records (this is what I have seen; please do not treat it as a benchmark) but becomes very slow beyond that. Recently, I noticed that one project had used the SCD wizard for almost all data loading, and the time it takes to complete the load has gone from 5 minutes to 5–6 hours. The load time increases significantly as the data volume grows.

How to improve the performance of loading of SCDs?
I had to give a solution for this… Two things came to mind: one is the Kimball Method SSIS Slowly Changing Dimension Component (I will make a separate post on this), and the other is the T-SQL MERGE statement. Although the MERGE implementation is a bit complex, I decided to recommend MERGE because many have found its performance to be excellent. The table below shows a comparison of three different approaches to SCD loading against some criteria (quoted from Matt Masson’s presentation named Performance Design Pattern).


You can easily see that MERGE gives the best performance, though it has drawbacks.

Type of Slowly Changing Dimensions
Slowly Changing Dimensions are categorized into three types: Type 1, Type 2, and Type 3. A Type 1 SCD does not maintain the history of changing attributes; it overwrites the attribute values. Type 2 maintains historical values for changing attributes. Type 3, which we do not use much (I have not used it at all), maintains separate columns for changed attributes. The SSIS SCD wizard supports both Type 1 and Type 2.

Using MERGE instead of SCD wizard
Replacing the SCD wizard with MERGE is not a straightforward technique. If the SCD has both Type 1 and Type 2 attributes, they need to be handled separately. Let’s create a simple table with the following set of data in order to see how to handle them with MERGE.

The following attributes in this table are Type 1:

  • FirstName
  • LastName

Following attributes are Type 2:

  • MaritalStatus
  • Country

The following script creates the dimension table and inserts the data;

CREATE TABLE dbo.DimCustomer
(
    CustomerKey int IDENTITY(1,1) PRIMARY KEY
    , SourceCode char(5) NOT NULL
    , FirstName varchar(50) NOT NULL -- Type 1
    , LastName varchar(50) NOT NULL -- Type 1
    , MaritalStatus char(1) NOT NULL -- Type 2
    , Country varchar(50) NOT NULL -- Type 2
    , StartDate date NOT NULL
    , EndDate date NOT NULL
    , CONSTRAINT IX_DimCustomer_SourceCode UNIQUE (SourceCode)
);

INSERT INTO dbo.DimCustomer
    (SourceCode, FirstName, LastName, MaritalStatus, Country
    , StartDate, EndDate)
VALUES
    ('AAAA1', 'Dinesh', 'Priyankara', 'M', 'Sri Lanka'
        , '01/01/2009', '12/31/9999')
    , ('AAAA2', 'Yeshan', 'Senthush', 'S', 'Sri Lanka'
        , '01/01/2009', '12/31/9999')
    , ('AAAA3', 'Jane', 'Alvarez', 'S', 'France'
        , '01/01/2009', '12/31/9999')
    , ('AAAA4', 'Shannon', 'Carlson', 'M', 'India'
        , '01/01/2009', '12/31/9999');

When data loading of SCDs is handled with SSIS, what we usually do is, once the data set is captured, send it through the SCD component. The SCD component inserts new records, updates Type 1 attributes, and finally inserts new records for Type 2 changes. If we are to use MERGE, we need to send the captured data to a temporary table and then use an Execute SQL Task to execute the MERGE statement. Assume that the following table is the temporary table and it is loaded with the captured data.

This code creates the table and populates data;

CREATE TABLE dbo.TempDimCustomer
(
    SourceCode char(5) NOT NULL
    , FirstName varchar(50) NOT NULL
    , LastName varchar(50) NOT NULL
    , MaritalStatus char(1) NOT NULL
    , Country varchar(50) NOT NULL
);

INSERT INTO dbo.TempDimCustomer
    (SourceCode, FirstName, LastName, MaritalStatus, Country)
VALUES
    ('AAAA1', 'Dinesh', 'Priyankara', 'M', 'Sri Lanka')
    , ('AAAA2', 'Yeshan', 'Santhush', 'S', 'Sri Lanka')
    , ('AAAA3', 'Jane', 'Alvarez', 'M', 'France')
    , ('AAAA4', 'Shannon', 'Carlson', 'M', 'Germany')
    , ('AAAA5', 'Jon', 'Yang', 'M', 'Japan');

Handling Type 1 attributes
The code below shows the MERGE statement that updates the Type 1 attributes. Note that it goes through all records, including historical ones, and updates them.

MERGE INTO dbo.DimCustomer c
USING dbo.TempDimCustomer tc
    ON (c.SourceCode = tc.SourceCode)
WHEN MATCHED
    AND (c.FirstName != tc.FirstName
        OR c.LastName != tc.LastName)
    THEN UPDATE
        SET c.FirstName = tc.FirstName
            , c.LastName = tc.LastName;

Handling Type 2 attributes
Handling Type 2 is tricky. If a record has changed, we need to update the old record, setting its EndDate to the current date, and then insert a new record for it with StartDate as the current date. See the code below;

INSERT INTO dbo.DimCustomer
    (SourceCode, FirstName, LastName, MaritalStatus, Country
    , StartDate, EndDate)
SELECT Modified.SourceCode, Modified.FirstName
    , Modified.LastName, Modified.MaritalStatus
    , Modified.Country, Modified.StartDate
    , Modified.EndDate
FROM
    (MERGE INTO dbo.DimCustomer c
    USING dbo.TempDimCustomer tc
        ON (c.SourceCode = tc.SourceCode)
    WHEN NOT MATCHED THEN
        INSERT
            (SourceCode, FirstName, LastName, MaritalStatus, Country
            , StartDate, EndDate)
        VALUES
            (tc.SourceCode, tc.FirstName, tc.LastName, tc.MaritalStatus, tc.Country
            , GETDATE(), '12/31/9999')
    WHEN MATCHED
        AND c.EndDate = '12/31/9999'
        AND (c.MaritalStatus != tc.MaritalStatus
            OR c.Country != tc.Country)
        THEN UPDATE
            SET c.EndDate = GETDATE()
    OUTPUT
        $Action Action, tc.SourceCode, tc.FirstName, tc.LastName
        , tc.MaritalStatus, tc.Country
        , GETDATE() StartDate, '12/31/9999' EndDate) AS Modified
WHERE Modified.Action = 'UPDATE';

Look at the inner MERGE first. With the WHEN NOT MATCHED section, it inserts new records. The WHEN MATCHED section finds Type 2 changes; note that the check is done only on the latest records (EndDate = ‘12/31/9999’). If a change is found, EndDate is updated and the affected rows are returned through the OUTPUT clause of the MERGE. This output is captured by the outer INSERT statement, which inserts them as new records. Done!

Please do test both approaches before deciding on a technique. If the data set is large, it would be better to use MERGE instead of the SSIS SCD component.