Monday, May 11, 2020

Coronavirus (Covid-19) Dashboard solution with Power BI

If you are interested in #covid-19 statistics, check out this #dashboard. It has been developed using #powerbi, #azuresqldatabase and #azuredatafactory.

http://coronavirusexplorer.com/



Sunday, April 19, 2020

Azure Databricks - Accessing Data Lake using Azure Active Directory Credentials

This video discusses how to access Azure Data Lake Storage Gen2 from Azure Databricks using Azure Active Directory credentials. The following are discussed:
  • Ways of Accessing ADLS Gen 2
  • Enabling Azure Data Lake Credential Passthrough
  • Demo: Accessing ADLS Gen 2 in Azure Databricks
    • Creating an Azure Databricks workspace
    • Configuring Standard Cluster for AAD Credentials
    • Configuring High Concurrency Cluster for AAD Credentials
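
For reference, here is a minimal sketch (in Python, with made-up storage and container names) of what credential passthrough looks like from the cluster and notebook side; the only cluster setting assumed here is the documented `spark.databricks.passthrough.enabled` flag.

```python
# Sketch only: account/container names are placeholders, and "spark" is the
# SparkSession that Databricks provides inside a notebook.

# Spark configuration typically set on the cluster to enable AAD passthrough.
cluster_spark_conf = {
    "spark.databricks.passthrough.enabled": "true",
}

# With passthrough enabled, no account keys or secrets appear in the notebook;
# the request to the abfss path is made with the signed-in user's AAD identity.
adls_path = "abfss://container01@datalake01.dfs.core.windows.net/raw/sales.csv"

def read_with_passthrough(spark, path):
    # The user's AAD token is attached to the storage request automatically.
    return spark.read.option("header", "true").csv(path)
```

On a Standard cluster, passthrough is limited to a single named user, while a High Concurrency cluster can serve multiple users, which matches the two cluster configurations shown in the video.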

Azure Databricks - Accessing Data Lake - Using a Service Principal

This video discusses the ways of accessing Azure Data Lake Storage Gen2 in Azure Databricks, specifically using a Service Principal. The following are discussed in detail:
  • Ways of accessing Azure Data Lake Gen 2
  • Creating a Service Principal for accessing Azure Data Lake
  • Access using Role-based access control
  • Access using ACL-based access control
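
To give a feel for the Service Principal option, here is a hedged Python sketch of the OAuth settings Databricks expects for ADLS Gen2; the tenant and client IDs below are placeholders, and in practice the secret would come from a secret scope rather than being hard-coded.

```python
# Placeholders only; a real notebook would read the secret from a secret scope.
tenant_id = "00000000-0000-0000-0000-000000000000"
client_id = "11111111-1111-1111-1111-111111111111"

def service_principal_conf(client_id, client_secret, tenant_id):
    # Returns the spark.conf settings for OAuth 2.0 client-credential access
    # to an ADLS Gen2 account via the abfss driver.
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": client_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/" + tenant_id + "/oauth2/token",
    }
```

Whether the Service Principal is granted a role on the storage account (RBAC) or an ACL entry on the file system decides what it can read, which is the distinction the last two bullets above draw.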

Wednesday, April 8, 2020

Sending Emails from Azure SQL Database - YouTube Video

This video focuses on sending emails for operations related to Azure SQL databases. The following are discussed and demonstrated:

  • Possibilities of tracking changes in Azure SQL Database for sending emails
  • Setting up an Azure SendGrid account
  • Setting up an Azure Logic App for sending emails
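
To illustrate the email step, here is a hedged Python sketch of the request body a Logic App (or any HTTP caller) would post to SendGrid's v3 mail-send endpoint; the addresses are made up.

```python
# Sketch of a SendGrid v3 mail/send request body; addresses are placeholders.
def build_alert_email(to_address, subject, body_text):
    return {
        "personalizations": [{"to": [{"email": to_address}]}],
        "from": {"email": "alerts@example.com"},
        "subject": subject,
        "content": [{"type": "text/plain", "value": body_text}],
    }

# A Logic App triggered by a change in the database would POST this payload to
# https://api.sendgrid.com/v3/mail/send with a bearer token (the API key).
payload = build_alert_email("dba@example.com",
                            "Azure SQL change detected",
                            "A tracked table was modified.")
```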

Data Science with SQL Server Machine Learning Services - YouTube Video

This video discusses performing data science experiments in SQL Server using Python. Setting up the environment, training, storing models and consuming trained models stored in the database are discussed with a demonstration.
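
As a rough illustration of the store-and-consume idea (not the exact script from the video), the Python below trains a trivial model and serializes it with pickle, which is the same mechanism used to keep a trained model in a VARBINARY(MAX) column and reload it for scoring inside sp_execute_external_script.

```python
import pickle
import statistics

class SimpleLinearModel:
    """Tiny stand-in for a trained model: y = slope * x + intercept."""
    def __init__(self, xs, ys):
        mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
        self.slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
                      / sum((x - mean_x) ** 2 for x in xs))
        self.intercept = mean_y - self.slope * mean_x

    def predict(self, x):
        return self.slope * x + self.intercept

model = SimpleLinearModel(xs=[1, 2, 3, 4], ys=[2, 4, 6, 8])
model_bytes = pickle.dumps(model)     # bytes you would INSERT into a models table
restored = pickle.loads(model_bytes)  # what a scoring procedure would do later
print(restored.predict(5))            # -> 10.0
```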


Automating Resume and Pause of Azure Analysis Services - YouTube Video

This video discusses how to automate starting and stopping Azure Analysis Services using an Azure Automation account. The following are demonstrated:
  • Create Azure Automation Account
  • Import libraries required
  • Create a PowerShell Runbook
  • Schedule the Runbook

Saturday, March 21, 2020

Power BI App for seeing Coronavirus - COVID-19 updates in Sri Lanka

Here is an Azure site built for showing updates related to Coronavirus - COVID-19, specifically for Sri Lanka.

This has been designed using Azure Data Factory, Azure SQL Database, Power BI and Azure App Services.

 Coronavirus COVID-19 - Sri Lanka - Update

Tuesday, April 2, 2019

Azure Databricks - What, Why & How - YouTube Video

Here is the first video on Azure Databricks. It talks about:

  • Why we need Azure Databricks for processing data
  • What is Azure Databricks
  • Demo: How to provision an Azure Databricks workspace, a cluster, a database and a table.



Monday, February 18, 2019

Creating Paginated Reports with Power BI

Here is a video on Power BI paginated reports. This video discusses/demonstrates:

  • What is a paginated report?
  • Power BI support on paginated reports
  • Power BI configuration for paginated reports
  • Creating a paginated report using Report Builder
  • Publishing a paginated report sourced from a local database to the Power BI portal.
Here is the video:

Tuesday, July 3, 2018

Azure Data Lake Analytics - Accessing External Data Sources with USQL

This video discusses how to access external data sources, such as Azure SQL Database, from U-SQL as external data sets. It explains why we need it, what objects should be created for accessing external sources and how external data can be accessed in U-SQL code.

Watch and see!


Sunday, June 24, 2018

Azure Data Factory V2 - Lookup and ForEach Activities

Here is my third video on Azure Data Factory. This video discusses and demonstrates;

  • Why we need iteration in ETLs
  • Lookup Activity
  • ForEach Activity
  • Demo: Iterating over a data collection using ForEach Activity, with data loaded from Lookup Activity
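
For orientation, here is a hedged sketch (as a Python dict, with made-up names) of the pipeline JSON behind such a demo: a Lookup activity returns a row set and the ForEach activity iterates over it through the @activity(...).output.value expression.

```python
# Hypothetical pipeline definition; dataset/activity names are invented.
pipeline = {
    "name": "PL_IterateTables",
    "properties": {
        "activities": [
            {
                "name": "LookupTableList",
                "type": "Lookup",
                "typeProperties": {
                    "source": {"type": "AzureSqlSource",
                               "sqlReaderQuery": "SELECT TableName FROM dbo.TableList"},
                    "dataset": {"referenceName": "DS_TableList",
                                "type": "DatasetReference"},
                    "firstRowOnly": False,  # return the full row set, not one row
                },
            },
            {
                "name": "ForEachTable",
                "type": "ForEach",
                "dependsOn": [{"activity": "LookupTableList",
                               "dependencyConditions": ["Succeeded"]}],
                "typeProperties": {
                    # Binds the loop to the Lookup output collection.
                    "items": {"value": "@activity('LookupTableList').output.value",
                              "type": "Expression"},
                    "activities": [],  # inner activities (e.g. a Copy) go here
                },
            },
        ]
    },
}
```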

Thursday, February 1, 2018

Azure Data Factory V2 - Creating a Pipeline using Author-Tab

Here is another video on Azure Data Factory V2. This video explains how to create a Pipeline manually, using the Author tab. It discusses and demonstrates:
  • How to create Linked Services manually
  • How to create Datasets manually
  • How to create a Pipeline and add Activities
  • How to schedule the Pipeline






Thursday, January 25, 2018

Azure Data Factory V2 - Copying On-Premise SQL Server data to Azure Data Lake Store

Azure Data Factory has been enhanced significantly with V2, and its support for cloud ETL and ELT is now excellent. The first version was not very user-friendly, but with V2 it is not that difficult to implement ETL solutions.

This video discusses;
  • Why we need Azure Data Factory
  • What is Azure Data Factory
  • How it works and how it should be used
  • Integration Runtime - installing for connecting with sources
  • Copying On-Premise SQL Server Data to Azure Data Lake using Data Factory
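
The copy step above can be sketched as the following activity definition (a Python dict standing in for the pipeline JSON; the dataset names are invented). The on-premises dataset is the one bound to the self-hosted Integration Runtime mentioned earlier.

```python
# Hypothetical Copy activity: SQL Server source (read through the self-hosted
# Integration Runtime) to an Azure Data Lake Store sink.
copy_activity = {
    "name": "CopySalesToLake",
    "type": "Copy",
    "inputs": [{"referenceName": "DS_OnPremSales", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "DS_LakeSales", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "SqlSource"},
        "sink": {"type": "AzureDataLakeStoreSink"},
    },
}
```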

Here is the video;


Monday, January 15, 2018

Introduction to Azure Data Lake Analytics - and basics of U-SQL

I made another video on Azure Data Lake, specifically on Azure Data Lake Analytics. This is a 40-minute video that discusses the following items along with demonstrations:

  • What is Azure Data Lake Analytics
  • Data Lake Architecture and how it works
  • Comparison between Azure Data Lake Analytics, HDInsight and Hadoop for processing Big Data.
  • What is U-SQL and basics of it.
  • Demo on How to create an Azure Data Lake Analytics account
  • Demo on How to execute a simple U-SQL using Azure Portal
  • Demo on how to extract multiple files, transform them using C# methods and a referenced assembly, and produce multiple results with somewhat complex transformations using Visual Studio.
Here is the video.



Thursday, January 4, 2018

Introduction to Azure Data Lake Store - Does it replace the Data Warehouse?

I have made a video on Azure Data Lake and Azure Data Lake Store and published it on YouTube. This video explains what Azure Data Lake is, its characteristics, its components, how it works, its usage, a comparison between Data Lake Store and Blob Storage, modern data warehousing and how we can use Azure Data Lake with modern data warehousing.



Have a look at it and let me know your thoughts or any questions you have.

Saturday, October 21, 2017

Microsoft Access Database Engine 2016 Redistributable Setup - You cannot install the 32-bit version, You cannot install the 64-bit version

In order to access an Excel workbook using Integration Services, one component we need to install is the Microsoft Access Database Engine 2016 Redistributable. If the 32-bit version of Office is installed, then the 32-bit version of Microsoft Access Database Engine 2016 Redistributable should be installed. If the 64-bit version of Office is installed, then the 64-bit version of it has to be installed.

If it is not installed, this is what we see when we try to access an Excel workbook:

The requested OLE DB provider Microsoft.ACE.OLEDB.12.0 is not registered. If the 32-bit drivers is no.....


Generally, once it is installed, we can connect to Excel and work with it without any issue. However, I started experiencing the below error when I tried to install the 32-bit version of Microsoft Access Database Engine 2016 Redistributable.

You cannot install the 32-bit version of Microsoft Access Database Engine 2016 because you currently have 64-bit Office products installed. If you want to.....


I was confused by this message because I had installed the 32-bit version of Office. However, just for checking, I tried to install the 64-bit version of Microsoft Access Database Engine 2016 Redistributable and this is what I got when I tried........

You cannot install the 64-bit version of Microsoft Access Database Engine 2016 because you currently have 32-bit Office products installed. If you want to....

I am not sure of the reason, but it stopped me from continuing, leaving uninstalling and reinstalling Office as the only option. Since that is not the best way, I thought to try an older version of the Microsoft Access Database Engine Redistributable, hence I downloaded the 32-bit version of Microsoft Access Database Engine 2010 Redistributable from https://www.microsoft.com/en-us/download/details.aspx?id=13255. The good news is that I was able to install it, and the best news is that SSIS could connect to Excel without any issue. This may not be the best solution for the above error, but in an urgent situation you can try this and see.

Friday, October 20, 2017

Azure Marketplace - SQL Server 2016 Virtual Machine - Data Tools cannot be installed

As you know, the easiest way of getting SQL Server set up in Azure is purchasing a Virtual Machine in the Marketplace. There are multiple VMs for SQL Server based on the Standard, Enterprise and Developer editions; you can pick one as per your requirements.

Remember, it does not come with all tools. It has Management Studio installed, but it does not have Data Tools, hence Data Tools needs to be installed separately.

I had configured a SQL Server 2016 SP1 VM and had been using it for some time. There was no requirement for Data Tools, but I experienced an issue when I tried to install Data Tools using the standard Setup.exe.



I noticed that Setup fails when it tries to download required components from Microsoft site.



......Error 0x80072f08: Failed to send request to URL: https://go.microsoft.com/fwlink/?LinkId=832089&clcid=0x409...............

I did not immediately identify the issue, but with my fellow MVPs' help, I realized that it can happen because of a security-related block.

If you face the same, make sure that no firewall settings block the required site and that the Administrator (or whoever runs the setup) has no restrictions. You can verify that the Administrator has no restrictions by checking Server Manager - Local Server - IE Enhanced Security Configuration.


Wednesday, October 18, 2017

tempdb becomes full when updating Clustered Columnstore index

I had a table with 3.5 billion records and wanted to update one column in all records. This table was not partitioned, though the design says it should be. With a spec of 2 processors (12 cores) and 64 GB RAM, it could never complete the update, as it filled tempdb to almost 800 GB, consuming all space in the drive.

I am still not sure about the exact reason for this, but here are possible reasons that could help you too.
  1. The table is not partitioned. I should have applied partitioning before loading billions of records.
  2. The clustered columnstore index needs to be rebuilt. It may take a long time because the index is fragmented (I had no way of checking, as everything was urgent).
  3. Maybe it is not something to be surprised about: an update means a delete and an insert, which adds records to the delta store before they are moved to rowgroups, hence it takes time and needs extra support from tempdb.
  4. There was not enough memory for completing the UPDATE process, hence tempdb was used.
However, I managed to execute the UPDATE as a batch process. Although it took a significant time, it did not fill tempdb (it might have used it slightly, but I did not notice).

Here is the code I finally used;

-- Update one million rows per iteration until no NULLs remain
WHILE EXISTS (SELECT * FROM dbo.Table01 WHERE Column1 IS NULL)
BEGIN
    UPDATE TOP (1000000) dbo.Table01
        SET Column1 = RIGHT(Column2, 4)
    WHERE Column1 IS NULL;
END

Monday, October 16, 2017

SQL Server Always Encrypted - Serving applications using Azure Key Vault and Certificate Store

Here is my second video related to Always Encrypted. This video discusses how to configure the CMK for serving multiple applications hosted in multiple places, using a local key store (Certificate Store) and a centralized key store (Azure Key Vault).


Here is the code related to the video. First, I create a table in my Azure SQL Database:

CREATE TABLE dbo.Message
(
 MessageId int identity(1,1) primary key
 , MessageCode char(5) COLLATE Latin1_General_BIN2 not null 
 , Message varchar(4000)  COLLATE Latin1_General_BIN2 not null
);
GO

INSERT INTO dbo.Message (MessageCode, Message)
 VALUES ('AA56B', 'This is a test message');
GO

CREATE OR ALTER PROCEDURE dbo.AddMessage @MessageCode char(5)
  , @Message varchar(4000)
AS
BEGIN

 INSERT INTO dbo.Message
  (MessageCode, Message)
 VALUES
  (@MessageCode, @Message);
END
GO


CREATE OR ALTER PROCEDURE dbo.GetMessage @MessageCode char(5)
   , @Message varchar(4000) OUTPUT
AS
BEGIN

 SELECT @Message = Message 
 FROM dbo.Message
 WHERE @MessageCode = MessageCode;
END
GO

And then, I have a .NET application that accesses Azure Key Vault for obtaining the CMK, and inserts and reads records. Here is its code.

private static ClientCredential _clientCredential;

static void InitializeAzureKeyVaultProvider()
{
    _clientCredential = new ClientCredential("9e67ee1f-50ef-49d1-8ee0-0c48eaf4457b", "HjRqkx7BKLP7Lu+UYgTa5D/zCKAdxx3YITQ6fRrsQTQ=");

    SqlColumnEncryptionAzureKeyVaultProvider azureKeyVaultProvider =
        new SqlColumnEncryptionAzureKeyVaultProvider(GetToken);

    Dictionary<string, SqlColumnEncryptionKeyStoreProvider> providers =
        new Dictionary<string, SqlColumnEncryptionKeyStoreProvider>();

    providers.Add(SqlColumnEncryptionAzureKeyVaultProvider.ProviderName, azureKeyVaultProvider);
    SqlConnection.RegisterColumnEncryptionKeyStoreProviders(providers);
}

public async static Task<string> GetToken(string authority, string resource, string scope)
{
    var authContext = new AuthenticationContext(authority);
    AuthenticationResult result = await authContext.AcquireTokenAsync(resource, _clientCredential);

    if (result == null)
        throw new InvalidOperationException("Failed to obtain the access token");
    return result.AccessToken;
}

public Form1()
{
    InitializeComponent();
    InitializeAzureKeyVaultProvider();

}

private void buttonSubmit_Click(object sender, EventArgs e)
{
    SqlConnection connection = new SqlConnection("Password=Pa$$w0rd;Persist Security Info=True;User ID=Dinesh;Initial Catalog=Marketing;Data Source=dinesqlserver.database.windows.net;Column Encryption Setting = Enabled");
    SqlCommand command = new SqlCommand("AddMessage", connection);
    command.CommandType = CommandType.StoredProcedure;

    SqlParameter parameterMessageCode = new SqlParameter("MessageCode", SqlDbType.Char, 5);
    parameterMessageCode.Value = textBoxMessageCode.Text;

    SqlParameter parameterMessage = new SqlParameter("Message", SqlDbType.VarChar, 4000);
    parameterMessage.Value = textBoxMessage.Text;

    command.Parameters.Add(parameterMessageCode);
    command.Parameters.Add(parameterMessage);

    connection.Open();
    command.ExecuteNonQuery();
    connection.Close();

}

private void buttonGet_Click(object sender, EventArgs e)
{
    SqlConnection connection = new SqlConnection("Password=Pa$$w0rd;Persist Security Info=True;User ID=Dinesh;Initial Catalog=Marketing;Data Source=dinesqlserver.database.windows.net; Column Encryption Setting = Enabled");
    SqlCommand command = new SqlCommand("GetMessage", connection);
    command.CommandType = CommandType.StoredProcedure;

    SqlParameter parameterMessageCode = new SqlParameter("MessageCode", SqlDbType.Char, 5);
    parameterMessageCode.Value = textBoxMessageCode.Text;

    SqlParameter parameterMessage = new SqlParameter("Message", SqlDbType.VarChar, 4000);
    parameterMessage.Direction = ParameterDirection.Output;

    command.Parameters.Add(parameterMessageCode);
    command.Parameters.Add(parameterMessage);

    connection.Open();
    command.ExecuteNonQuery();
    connection.Close();

    MessageBox.Show(parameterMessage.Value.ToString());
}

You can find more information and standard codes related to Azure Key Vault usage with Always Encrypted at: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-always-encrypted-azure-key-vault


Thursday, October 12, 2017

Introduction to Always Encrypted

Here is my first video that speaks about the Always Encrypted feature. This video shows how Always Encrypted works, how it should be configured and how we can access and update data using a .NET application.


Monday, October 2, 2017

Introduction to Azure HDInsight

I have written a few posts on HDInsight and thought to make a series of videos on it as well. Here is the first one. In this video, I discuss what Hadoop is, what HDInsight is, HDInsight configuration, important settings related to HDInsight, and how to access the cluster using the Azure Portal and PuTTY.

Watch and see!


Monday, September 18, 2017

Self-Service Business Intelligence with Power BI


Understanding Business Intelligence

There was a time when Business Intelligence (BI) was marked as a luxury facility limited to the higher management of an organization. It was used for making strategic business decisions and did not involve or support operational-level decision making. However, along with significant improvements in data accessibility and the demand for big data, data science and predictive analytics, BI has become very much part of the business vocabulary. Modern technology now makes previously-inaccessible, ever-increasing and previously-unknown data available, and businesses can apply BI to every corner of the business to react fast to customers' demands, respond to changes in the market and compete with competitors.

The usage of Business Intelligence may vary from one organization to another. Organizations that work with large numbers of transactions need to analyze continuous, never-ending streams of transactions, almost in near-real-time, to understand trends, patterns and habits for better competitiveness. Seeing what we like to buy, being enticed to buy something, or bundling high-demand items at a lower price in a supermarket or commercial web site are some examples of the usage of BI. Beyond that, the demand for data mashups, dashboards and analytical reports from smart business users, over traditional production and formal reports, is another scenario where we see the usage of BI. This indicates the increasing adoption of BI and how it helps run operations and survive in a competitive market.

The implementation of traditional BI (or Corporate BI) is an art. It involves multiple steps and stages and requires multiple skills and expertise. Although a BI project is considered and treated as a business solution rather than a technical/software solution, it is completely an IT-driven solution that requires major implementations such as ETL, a relational data warehouse and a multi-dimensional data warehouse, along with many other smaller components. This means an enormous cost. The complexity of the solution, the time it takes and the financial support it needs make the implementation tough and lead to failures, but it is still required and demand is high. This need pushed the industry toward another type of implementation called Self-Service Business Intelligence.

Self-Service Business Intelligence

It is all about empowering the business user with rich, fully-fledged tooling for satisfying their own data and analytical requirements. It is not something new, though the term was not used. Ever since the spreadsheet was introduced and business users began manipulating data with it for various analyses, Self-Service BI had effectively appeared. Self-Service BI supports data extraction from sources, transformations based on business rules, creating presentations and performing analytics. Old-fashioned Self-Service BI was limited and required some technical knowledge, but the availability of modern out-of-the-box solutions has enriched the facilities and functionality, increasing the trend toward Self-Service BI.

BI needs data models. A data model can be considered a consistent view (or visual representation) of data elements along with their relationships to the business. However, data models created by developers are not exactly the models required for BI. The model created by a developer is a relational model that breaks entities into multiple parts, but the model created for BI has real-world entities that give meaning to data. This is called a Semantic Model. Once it is created, it can be used easily, as it is a self-descriptive model, for performing data analytics, creating reports using multiple visualizations and creating dashboards that represent the key information of the organization. Modern Self-Service BI tools support creating models, hence smart business users can create models as required without help from the IT department.

Microsoft Power BI


Microsoft offers a few products supporting Self-Service BI. Microsoft Office Excel is the most widely-used spreadsheet software, and with the recent addition of four power tools (Power Pivot, Power Query, Power View and Power Map) it has become one of the key tools for performing Self-Service BI. In addition, Microsoft Reporting Services (specifically Report Builder) and SharePoint Services support Self-Service BI, allowing users to perform some Self-Service BI operations.

Microsoft Power BI is the latest from Microsoft: a suite of business analytics tools. It addresses almost all Self-Service BI needs, from gathering data and preparing it for analysis to presenting and analyzing data as required.

Power BI comes in two flavors: Power BI Desktop and Power BI Service. Power BI Desktop is a stand-alone tool that allows us to connect to hundreds of data sources (internal or external, structured or unstructured), model the loaded data as per requirements, apply transformations to adjust data and create reports with stunning visuals. Power BI Service can hold reports created by individuals as a global repository, allowing us to share reports with others, create reports using shared or uploaded datasets and create personalized dashboards.

Business users who love Excel might see this as another tool that offers the same Excel functionality, but it offers much more than Excel in terms of BI. Of course, you might see some functionality that is available in Excel but not in Power BI. The biggest issue we see here is entering data manually: Excel supports data entry, providing many facilities for it, but Power BI's support for it is limited. Considering these facts, there may be certain scenarios where Power BI cannot be used over Excel, but in most Self-Service BI cases it can. The transition is not difficult; it is straightforward, as Power BI integrates with Excel in many ways. This article gives a great introduction to using Power BI within Excel – Introduction to PowerPivot and Power BI.

In order to perform data analysis or create reports, an appropriate data model with the necessary data should be created. With traditional BI, relational and OLAP data warehouses fulfil this requirement, but creating the same with Self-Service BI is a challenge. Power BI makes this possible by allowing us to connect to multiple data sources using queries and bring them together in the data model. Regardless of the type of data source, Power BI supports creating relationships between sources with their original structures. Beyond that, transformations on loaded data and creating calculated tables, columns and hierarchies are possible with Power BI data models.

What about ETL? How is it handled with Self-Service BI or Power BI? This is the most complex and time-consuming task in traditional BI, and its major component is transformation. Although some transformations can be done in the Power BI data model, advanced transformations have to be done in the Query Editor, which comprises more sophisticated transformation tools.

Selecting the best visual for presenting data and configuring it appropriately is the key to visualization. Power BI has plenty of visuals that can be used for creating reports. In addition to the visuals provided, there are many other visuals available free, created and published by community supporters. If a specific visual is required, it can be downloaded and added easily.

Visualization is done with Power BI Reports. Once they are created, they can be published to the Power BI Service. Shared reports can be viewed via the Power BI Service portal and native mobile apps. Beyond that, the Power BI Service portal allows users to create dashboards, pinning visuals from multiple reports to a single dashboard, extending Self-Service BI.

On some occasions, a model has already been created by the IT department, either as a relational model or an OLAP model. When the model is available, users do not need to create a model for their reports; they can directly connect to the model and consume its data. This reduces the space consumed, as the user does not need to import data, and offers real-time BI. Power BI supports making direct connections to certain relational data sources using the DirectQuery method and to OLAP data sources using the Live Connection method.

Power BI is becoming the best Self-Service BI tool in the market. Microsoft is working hard on it, as can be clearly witnessed in the monthly releases of Power BI Desktop and the weekly upgrades of the Power BI Service. If you have not started working with Power BI yet, register at http://www.powerbi.com and start using it. You will see Business Intelligence like never before.


Sunday, September 17, 2017

SQL Server 2016 - CREATE/ALTER PROCEDURE must be the first statement in a query batch

Everyone knows that SQL Server started supporting the CREATE OR ALTER statement with SQL Server 2016, and it can be used with a few object types. However, with my new installation of SQL Server 2016, I noticed this error, which I was not supposed to see with 2016.


What could be the reason? I could not immediately identify the issue and kept wondering what went wrong. It should be working, as this is SQL Server 2016.

After a few minutes, I realized the issue:


It is all about the version I used. This feature was introduced with SQL Server 2016 Service Pack 1, and I had not installed SP1 on my new instance. You may experience the same, hence I made this post to note it :).

Wednesday, September 13, 2017

SQL Server TRY_CONVERT returns an error instead of NULL

Can this happen? The functions TRY_CONVERT and TRY_CAST are designed for handling errors, and they should return NULL when there is an error.


But, it is not always true.


What is the reason? Although it returns NULL when an error occurs during the conversion, that does not apply when the conversion is not permitted at all. The code above tries to convert an integer into a uniqueidentifier, which is an explicitly disallowed conversion, hence it fails with an error.

Tuesday, September 12, 2017

Run Hive Queries using Visual Studio

Once an HDInsight cluster is configured, we generally use either the portal dashboard (powered by Ambari) or a tool like PuTTY for executing queries against the loaded data. Although these are not exactly developer tools, or in other words, not an IDE, we had to use them because we did not have many options. However, now we can use the IDE we have been using for years for connecting with HDInsight and executing various types of queries such as Hive, Pig and U-SQL: Visual Studio.

Let's see how we can use Visual Studio for accessing HDInsight.

Making Visual Studio ready for HDInsight

In order to work with HDInsight using Visual Studio, you need to install a few tools in Visual Studio. Here are the supported versions:
  • Visual Studio 2013 Community/Professional/Premium/Ultimate with Update 4
  • Visual Studio 2015 any edition
  • Visual Studio 2017 any edition
You need to make sure that you have installed the Azure SDK in your Visual Studio. Click here to download the Web Platform Installer and make sure the following are installed;


This installs Microsoft Azure Data Lake Tools for Visual Studio as well, make sure it is installed.


Now your Visual Studio is ready for accessing HDInsight. 

Connecting with HDInsight

The good thing is, you can connect with your cluster even without creating a project. However, once the SDK is installed, you will see new templates under Azure Data Lake: HIVE (HDInsight), Pig (HDInsight), Storm (HDInsight) and USQL (ADLA); the HIVE template can be used for creating a project.


The project creates one .hql file for you, which you can use for executing your Hive queries. In addition, you can open Server Explorer (View menu -> Server Explorer) and expand Azure (or connect to your Azure account and then expand it) to see all components related to Azure.


As you can see, it shows all databases, internal and external tables, views and columns. Not only that, by right-clicking the cluster, you can open a window for writing a query or viewing jobs. Here is the screen when I use the first option, Write a Hive Query.


Did you notice IntelliSense? Yes, it supports almost all metadata, hence it is really easy to write a query.

Executing Queries

If you need to see records in tables without limiting data with predicates or constructing a query with additional functions, you can simply right-click the table in Server Explorer and select View top 100 Rows.


If you need to construct a query, then use the above method to open a window and write the query. There are two ways of executing the code: Batch and Interactive. Batch mode does not give you the result immediately, but you will be able to see or download it once the submitted job has completed. If you use Interactive mode, it is similar to an SSMS result.


If you use Batch mode, you can see how the job is executed. Once the job is completed, you can click on Job Output to see or download the output.


As you see, there is no graphical interface showing the job execution here. Visual Studio shows the job execution graphically only when the job is executed by the Tez engine. Remember, HDInsight does not always use the Tez engine for Hive queries; simpler queries may be executed using the Map Reduce engine.

See this query that has some computation;


Can we create tables with this IDE?
Yes, it is possible. You can right-click on your database in Azure Server Explorer and select the Create Table menu item.


Let's talk about this more in later posts.