A fully-fledged Business Intelligence system never ignores unstructured data. You can never get true insight without considering, consuming, and processing all the types of data available in an organization. If you are designing a BI solution and you have both structured and unstructured data, how do you plan to process them?
Generally, processing unstructured data still belongs to the Hadoop ecosystem; it was designed for that, and it is usually better to hand that responsibility over to Hadoop. We do BI using the Microsoft SQL Server product suite, and SQL Server Integration Services (SSIS) is the component we use for ETL. If there is an unstructured data set that needs to be processed as part of the ETL process, how can you get support from Hadoop via SSIS for processing that data and getting it back in structured form? The solution is the Integration Services Feature Pack for Azure.
The Integration Services Feature Pack for Azure (download from here) provides functionality for connecting to Azure Storage and HDInsight, transferring data between Azure Storage and on-premises data sources, and processing data using HDInsight. Once it is installed, you will see newly added tasks in both Control Flow and Data Flow, as below:
Assume that you have an unstructured file that needs to be processed, and once it is processed, you need to get the result back into your data warehouse. To implement this with SSIS, you need to do the following:
- Install Integration Services Feature Pack for Azure (download from here)
- Install Microsoft Hive ODBC Driver (download from here)
- Generate a self-signed certificate and upload it to your Azure subscription.
Why do we need the Microsoft Hive ODBC Driver? We need it for connecting to Hive using ODBC. There are two drivers: one for 32-bit applications and one for 64-bit applications. In order to use the driver with Visual Studio, you need the 32-bit version. It is always better to have both installed and, when creating the DSN, to create the same DSN in both the 32-bit and 64-bit System DSN tabs. This is how you configure the DSN for Hive in Azure HDInsight.
Open the ODBC Data Source Administrator (32-bit) application and click the Add button in the System DSN tab.
Select Microsoft Hive ODBC Driver as the driver and configure it as below:
You can click the Test button to test the connection. If you have not configured the cluster yet (in my case I have not; we will be creating it using SSIS), you will get an error, but you can still save the DSN and keep it. Note that it is always better to create another DSN with the same name using the 64-bit ODBC Data Source Administrator.
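Once the cluster is up, you can also verify the DSN outside of SSIS. Below is a minimal Python sketch using the pyodbc package; the DSN name and credentials are hypothetical placeholders for your own.

```python
# A minimal sketch, assuming a System DSN named "HiveDSN" (hypothetical)
# and pyodbc installed (pip install pyodbc). UID/PWD are the HDInsight
# cluster credentials, shown here as placeholders.
import pyodbc

conn = pyodbc.connect("DSN=HiveDSN;UID=admin;PWD=YourPassword", autocommit=True)
cursor = conn.cursor()

# Run a trivial HiveQL statement to confirm the driver and cluster respond.
cursor.execute("SHOW TABLES")
for row in cursor.fetchall():
    print(row)

conn.close()
```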
The next step is generating the certificate and adding it to the Azure subscription. This is required for connecting to Azure using Visual Studio. The easiest way of doing this is creating the certificate using Internet Information Services (IIS), exporting it using certmgr, and uploading it using the Azure Classic Portal. Let's do it.
Open IIS Manager and go to Server Certificates. Click on Create Self-Signed Certificate to generate a certificate.
Once it is created, open Manage Computer Certificates and start the Export Wizard.
The wizard starts with a welcome page and then opens the Export Private Key page. Select the No, do not export the private key option and continue.
Select the Base-64 encoded X.509 (.CER) option on the Export File Format page and continue.
The next page asks for a file name for the certificate. Enter a name along with the location where you want to save it, and complete the wizard.
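If you prefer scripting this step, the same kind of certificate can be produced without IIS. Here is a minimal Python sketch using the cryptography package; the common name and file names are hypothetical, and the IIS + certmgr route described above is what this post actually follows.

```python
# A minimal sketch of generating a Base-64 encoded X.509 (.cer) certificate
# with Python's cryptography package (pip install cryptography), as an
# alternative to the IIS + certmgr route. Names here are hypothetical.
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "SSISAzureCert")])

cert = (
    x509.CertificateBuilder()
    .subject_name(name)            # self-signed, so subject == issuer
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(key, hashes.SHA256())
)

# PEM is the Base-64 encoded X.509 format expected for the .cer upload.
with open("SSISAzureCert.cer", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.PEM))

# Keep the private key too; Visual Studio needs it to authenticate.
with open("SSISAzureCert.key", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.NoEncryption(),
    ))
```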
The certificate is now exported in the required format, and you need to upload it to the Azure subscription. Open the new Azure Portal and then open the Azure Classic Portal. Scroll down the items to find Settings, and click on it.
The Settings page has a Management Certificates section. Open it and upload the certificate you created.
Everything SSIS needs for working with Azure Storage and HDInsight is now in place. Now let's build the SSIS package to upload an unstructured data file, create an HDInsight cluster on demand, process the uploaded file using Hive, download the result, and finally remove the cluster, because you do not want to pay Microsoft for idle resources.
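Before moving on, here is a rough Python sketch of what the upload and download steps amount to, using the azure-storage-blob package; container, file, and blob names are hypothetical, and the actual implementation in this series uses the Feature Pack's SSIS tasks, not Python.

```python
# A rough sketch of the Azure Blob upload/download steps the SSIS package
# performs, using azure-storage-blob (pip install azure-storage-blob).
# All names below are hypothetical placeholders.
from azure.storage.blob import BlobServiceClient

conn_str = "<your storage account connection string>"
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("ssis-data")

# Upload the unstructured source file for Hive to process.
with open("weblog.txt", "rb") as f:
    container.upload_blob(name="input/weblog.txt", data=f, overwrite=True)

# ... the HDInsight cluster is created and the Hive script runs here ...

# Download the structured result produced by Hive.
result = container.download_blob("output/000000_0")
with open("hive_result.csv", "wb") as f:
    f.write(result.readall())
```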
Since this post is getting a bit lengthy, let me cover that in a new post. Read it via the link below:
Creating HDInsight Hadoop Cluster using SSIS and processing unstructured data using Hive Task - Azure Feature Pack - Part II