Showing posts with label Error. Show all posts

Saturday, October 21, 2017

Microsoft Access Database Engine 2016 Redistributable Setup - You cannot install the 32-bit version, You cannot install the 64-bit version

To access an Excel workbook using Integration Services, one component we need to install is the Microsoft Access Database Engine 2016 Redistributable. If the 32-bit version of Office is installed, then the 32-bit version of the redistributable should be installed; if the 64-bit version of Office is installed, then the 64-bit version has to be installed.

If it is not installed, this is what we see when we try to access an Excel workbook;

The requested OLE DB provider Microsoft.ACE.OLEDB.12.0 is not registered. If the 32-bit drivers is no.....


Generally, once it is installed, we can connect to Excel and work with it without any issue. However, I started experiencing the error below when trying to install the 32-bit version of the Microsoft Access Database Engine 2016 Redistributable.

You cannot install the 32-bit version of Microsoft Access Database Engine 2016 because you currently have 64-bit Office products installed. If you want to.....


This message confused me because I had installed the 32-bit version of Office. However, just to check, I tried to install the 64-bit version of the Microsoft Access Database Engine 2016 Redistributable, and this is what I got when I tried;

You cannot install the 64-bit version of Microsoft Access Database Engine 2016 because you currently have 32-bit Office products installed. If you want to....

I am not sure of the reason, but it stopped me from continuing, leaving uninstalling and reinstalling Office as the only apparent option. Since that is not the best way, I thought to try an older version of the Microsoft Access Database Engine Redistributable, so I downloaded the 32-bit version of the Microsoft Access Database Engine 2010 Redistributable from https://www.microsoft.com/en-us/download/details.aspx?id=13255. The good news is that I was able to install it, and better still, SSIS could connect to Excel without any issue. This may not be the best solution for the above error, but in an urgent situation you can try it and see.

Monday, July 31, 2017

SQL Server bcp error - String data, right truncation

We generally expect a proper row terminator with files loaded using bcp, and with all switches accurately defined, data should load without any issue. But sometimes you may see errors like the one below even when you try switches such as -r "\r\n" or -r "\n".


In most cases, the reason is the row terminator. If you open the file to be loaded with an editor such as Notepad++ and enable Show All Characters (the icon is on the toolbar), you can see the exact row terminator.


If you see something like the above, you need to specify the row terminator as a hexadecimal value. In this case, if I use -r "0x0a", the data load works without any issue.
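As a sketch, a bcp call with a hexadecimal row terminator looks like this (the table, file, and server names here are placeholders; -F 2 simply skips a header row):

```shell
REM Hypothetical example: load dbo.MyTable from a file whose rows end with LF (0x0a) only
REM -T = trusted connection, -c = character mode, -t"," = comma field terminator
bcp dbo.MyTable in "C:\Data\MyFile.txt" -S MyServer -d MyDatabase -T -c -t"," -r "0x0a" -F 2
```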

Friday, June 2, 2017

Power BI does not show all columns in SQL Server View

I experienced a strange issue with Power BI today; I might have missed something, or there is something unknown to me. This is what I experienced: Power BI does not show all columns when importing records from a SQL Server view.

Just to simulate the issue, I created a view in AdventureWorks2014 using three tables, Product, ProductSubcategory, and ProductCategory, and named it vProduct.

USE [AdventureWorks2014]
GO

CREATE OR ALTER VIEW dbo.vProduct
AS
 SELECT p.ProductID, p.Name Product, s.Name SubCategory, c.Name Category
 FROM Production.Product p
  INNER JOIN Production.ProductSubcategory s
   ON p.ProductSubcategoryID = s.ProductSubcategoryID
  INNER JOIN Production.ProductCategory c
   ON s.ProductCategoryID = c.ProductCategoryID;
GO

Then I granted SELECT permission to the user Jack.

GRANT SELECT ON dbo.vProduct TO Jack;
GO

If Jack executes a SELECT against the view, he sees records with all columns.


Now, if Jack connects to the database with Power BI, this is what he sees.


Has anyone experienced this? I searched but could not find a solution. I might have made a mistake or missed something, but I could not figure it out. I would appreciate it if you could share your thoughts, or a solution if you have already seen this and sorted it out.

I used the latest Power BI version at the time of writing, which is May 2017.

Note that I see the same result for both Import and Direct Query.

Monday, May 15, 2017

Reporting Services - An error occurred within the report server database....Database 'ReportServerTempDB' cannot be opened

Yesterday, I posted about an error related to Reporting Services. Here is another, encountered at one of my client sites. When the Reporting Services portal is visited, it says;


When the Web Service is visited, it clearly says that something is wrong with the Reporting Services temp database.

An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database. (rsReportServerDatabaseError) Get Online Help
Database 'ReportServerTempDB' cannot be opened due to inaccessible files or insufficient memory or disk space. See the SQL Server errorlog for details.

This says that we need to check the database immediately. When it was viewed via Management Studio, we noticed that ReportServerTempDB is missing. It can happen for various reasons. However, if you are experiencing the same, you may be able to fix it without doing much. We fixed it with little effort; the steps used are listed below.

To simulate the same, I deleted the mdf file related to ReportServerTempDB, and this is how the database is shown on my server now. It is similar to the issue my client faced.


What should we do? Here are the steps;

1. Back up the Report Server database if you have no backups (in my case, it is ReportServer).

2. Open Reporting Services Configuration Manager and go to Database. Click on Change Database.


3. The wizard starts. Select Create a new report server database and click Next to continue.


4. Enter a new name for the database. Note that the original database (ReportServer) is not deleted; we are not replacing it. This creates two new databases, ReportServer1 and ReportServer1Temp. Complete the wizard.


5. Once completed, you should see the two new databases. Note the old database too.


6. Since the portal now points to a new database, it does not show the reports we had. We need to bring the old database back.


7. Go to Management Studio and rename the current report server temp database to the old name. If the old report server temp database still exists, it has to be deleted first. Once renamed, my databases look like this;
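As a sketch, this rename can also be done with T-SQL, assuming the new temp database was named ReportServer1Temp as in this example (adjust the names to match yours):

```sql
-- Remove the old, broken temp database if it is still listed
DROP DATABASE IF EXISTS ReportServerTempDB;
GO

-- Rename the newly created temp database to the old name
ALTER DATABASE ReportServer1Temp MODIFY NAME = ReportServerTempDB;
GO
```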


8. Now go back to Reporting Services Configuration Manager and start Change Database again. This time, select Choose an existing report server database.


9. Select the old report server database and complete the wizard.


10. Once done, you should be able to see your old content. You may delete the report server database created for this fix (in my case, the ReportServer1 database).


Sunday, May 14, 2017

Reporting Services - HTTP Error 404. The requested resource is not found.

What if you suddenly experience this when you open the Reporting Services Portal?

HTTP Error 404. The requested resource is not found.


There can be many reasons for this error, and you may find the exact reason by reading the Windows Application Log, the Report Server service trace log (default location: C:\Program Files\Microsoft SQL Server\MSRS13.MSSQLSERVER\Reporting Services\LogFiles), or the SQL Server Error Log. When I went through the trace, I found that it was related to HttpListener and permissions.

What shall we do? It is always better to find the exact root cause and fix it. But if you need a quick fix, there is a simple thing to try.

It is all about changing the virtual directories of Reporting Services. This method may not fix the underlying issue, but you can try it and see;

1. Open the Reporting Services Configuration Manager.

2. Go to Web Service URL and change the name of the virtual directory. In my case, it was ReportServer and I changed it to ReportServer1.


3. Do the same for Web Portal URL too.


4. Once this is done, you should be able to open the portal. As you can see, nothing has happened to its content; it shows all the items it had.


If you need the same virtual directories you had before, you can rename them back the same way.

** Note that you need to keep regular backups of your report server database and all report server content. If you cannot get your errors fixed, the only option is reinstalling and setting it up again.

Friday, February 10, 2017

SQL Server Data Tools (Visual Studio) - The length of the full path for the solution, project or item you are creating exceeds the maximum path length allowed

While trying to create a new SSIS project with default values for Project Name, Location, and Solution Name, I ran into the following limitation;

The length of the full path for the solution, project or item you are creating exceeds the maximum path length allowed by the system. You must reduce the length of the name or the location.


This is not something new, but I had not experienced it for a long time. What is the reason for this?

This is because of a limitation in the OS. The limit is 260 characters for the full path; when it is exceeded, Visual Studio shows this message without letting us create the project. There are two ways I tried to overcome this issue;
  1. Of course, the first way is to select a path that can be referred to with fewer characters.
  2. The second is to create the project without creating the solution. For that, we need to un-check the Create directory for solution checkbox and then create the project. Once created, we can open it and add it to a solution.
This check is based on the following calculation;

Path length + 1 (separator) + 
Solution name length + 1 (separator) + 
Project name length + 1 (separator) + 
Project name length + 
80 (Reserved space)
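As a worked example of this calculation (the folder and project names below are made up, and the function is only a quick approximation of the check described above):

```python
def ssis_path_check(location, solution, project, max_path=260, reserved=80):
    """Approximate Visual Studio's new-project path-length check:
    location + separator + solution + separator + project + separator
    + project + reserved space must stay within max_path."""
    total = (len(location) + 1   # location + separator
             + len(solution) + 1 # solution name + separator
             + len(project) + 1  # project name + separator
             + len(project)      # project name again (folder and file)
             + reserved)         # reserved space
    return total, total <= max_path

# Hypothetical default-style names
location = r"C:\Users\Dinesh\Documents\Visual Studio 2015\Projects"
total, ok = ssis_path_check(location,
                            "Integration Services Project1",
                            "Integration Services Project1")
print(total, ok)
```

With a deeply nested location the same call returns False, which is when Visual Studio refuses to create the project.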



Friday, December 30, 2016

Inserting records, ignoring duplicates, recording them

A column like email address in an entity like Customer or Employee is usually set with a UNIQUE constraint to make sure that no duplicate values are inserted. This is enforced through a unique index on the column. This does not annoy us when we accept only one record at a time for inserting or updating, but it surely annoys us when we perform a bulk operation.

There are instances where we load a large number of records as a single batch. For example, if we get 1 million records to be inserted into a table like Customer that has an Email column, and one duplicate is detected within the batch, the whole batch gets rolled back. Sometimes we need to continue with all the other records, ignoring duplicates, but the standard unique index does not allow us to do that.

Here is an example.

USE tempdb;
GO

-- Creating a table for Customers
IF OBJECT_ID('dbo.Customers') IS NOT NULL
 DROP TABLE dbo.Customers;
GO
CREATE TABLE Customers
(
 Title char(5) NOT NULL
 , FirstName varchar(50) NOT NULL
 , MiddleName varchar(50)  NULL
 , LastName varchar(50) NOT NULL
 , EmailAddress varchar(100) NOT NULL
 
)

-- Adding the unique index on the EmailAddress column
CREATE UNIQUE INDEX IX_Customers ON dbo.Customers (EmailAddress)

-- Inserting one customer for testing
INSERT INTO dbo.Customers
 (Title, FirstName, MiddleName, LastName, EmailAddress)
VALUES
 ('Mr.', 'Dinesh', NULL, 'Priyankara', 'dinesh@dinesql.com')

If we try to insert records as a batch that has duplicates, none of them will be inserted.

-- This batch contains 4 records; note the 2nd and 3rd records
-- They have the same email address
INSERT INTO dbo.Customers
 (Title, FirstName, MiddleName, LastName, EmailAddress)
SELECT 'Mr.', 'Yeshan', NULL, 'Santhush', 'yeshan@dinesql.com'
UNION ALL
SELECT 'Ms.', 'Jane', NULL, 'Western', 'jane@dinesql.com'
UNION ALL
SELECT 'Ms.', 'Jane', NULL, 'Black', 'jane@dinesql.com'
UNION ALL
SELECT 'Mr.', 'Jack', NULL, 'Knight', 'jack@dinesql.com'

The above operation results in this;

Msg 2601, Level 14, State 1, Line 31
Cannot insert duplicate key row in object 'dbo.Customers' with unique index 'IX_Customers'. 
The duplicate key value is (jane@dinesql.com).
The statement has been terminated.

If we need the operation to ignore duplicates and continue, all we have to do is add the IGNORE_DUP_KEY option to the index.

-- dropping and recreating the index with IGNORE_DUP_KEY option
DROP INDEX IX_Customers ON dbo.Customers 
GO

CREATE UNIQUE INDEX IX_Customers ON dbo.Customers (EmailAddress) WITH (IGNORE_DUP_KEY = ON)

-- checking the insert again. It inserts all except the duplicate
INSERT INTO dbo.Customers
 (Title, FirstName, MiddleName, LastName, EmailAddress)
SELECT 'Mr.', 'Yeshan', NULL, 'Santhush', 'yeshan@dinesql.com'
UNION ALL
SELECT 'Ms.', 'Jane', NULL, 'Western', 'jane@dinesql.com'
UNION ALL
SELECT 'Ms.', 'Jane', NULL, 'Black', 'jane@dinesql.com'
UNION ALL
SELECT 'Mr.', 'Jack', NULL, 'Knight', 'jack@dinesql.com'

Though this works fine, an obvious question comes up: how do we know which records have been ignored? There is no direct way to capture ignored records with this solution; however, the following approaches can be used with some additional coding;
  1. Load the incoming records into a temporary table or table variable before performing the main operation. After the main insert, do a comparison using a LEFT OUTER JOIN to find the records that were not inserted.
  2. Use Integration Services for the operation. It allows you to capture problematic records using error handlers, or you can duplicate the flow, record all email addresses in another destination, and do the comparison at the end of the task.
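A minimal sketch of the first approach, using the Customers table from the example above (the staging table name is mine, and the join uses more than the email address so that intra-batch duplicates are caught regardless of which copy was inserted):

```sql
-- Stage the full incoming batch before the main insert
SELECT Title, FirstName, MiddleName, LastName, EmailAddress
INTO #IncomingCustomers
FROM ( VALUES ('Mr.', 'Yeshan', CAST(NULL AS varchar(50)), 'Santhush', 'yeshan@dinesql.com')
            , ('Ms.', 'Jane',   NULL, 'Western',  'jane@dinesql.com')
            , ('Ms.', 'Jane',   NULL, 'Black',    'jane@dinesql.com')
            , ('Mr.', 'Jack',   NULL, 'Knight',   'jack@dinesql.com')
     ) AS v (Title, FirstName, MiddleName, LastName, EmailAddress);

-- Main insert; duplicates are silently skipped by IGNORE_DUP_KEY
INSERT INTO dbo.Customers
 (Title, FirstName, MiddleName, LastName, EmailAddress)
SELECT Title, FirstName, MiddleName, LastName, EmailAddress
FROM #IncomingCustomers;

-- Staged rows with no matching inserted row are the ignored ones
SELECT i.*
FROM #IncomingCustomers i
 LEFT OUTER JOIN dbo.Customers c
  ON i.EmailAddress = c.EmailAddress AND i.LastName = c.LastName
WHERE c.EmailAddress IS NULL;
```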

Wednesday, December 28, 2016

Reporting Services (SSRS) 2016 - Could not load folder contents Error - Even with SP1

I have been continuously hitting this when the Reporting Services portal is opened for the first time;

Could not load folders contents
Something went wrong, Please try again later.


But if I refresh the page or reload the portal, everything starts working fine, and the error does not appear again until I restart the service.

I am not sure of the reason; I searched and read many posts and contacted some consultants, but so far no luck. I tried recreating the database, changing the credentials, and changing the time zone to UTC 0, but none of them worked for me.

As per the log, the first few attempts to connect to the database fail, but I am not sure how it connects afterwards (because my second attempt loads the portal).

I am still expecting answers from experts and will update this post once I have one. I would appreciate it if you could share your solution if you have already faced and solved this.

Wednesday, December 14, 2016

Management Studio Error: The Visual Studio component cache is out of date. Please restart Visual Studio.

I experienced this error when opening Management Studio just after installing a Windows update.


I am not sure about the reason, but I got a similar error when opening Visual Studio as well. I could not find the exact cause, and closing and re-opening did not help either. I tried killing all Visual Studio related tasks using Task Manager and then tried again, and yes, it worked.

I am not sure whether you have faced this with Management Studio before. If someone has more details on this, specifically related to SQL Server, please share.

Friday, November 25, 2016

SSIS OLE DB Destination sends all records to Error Output

The SSIS OLE DB Destination is one of the destinations we use heavily for loading data into databases. Generally we use Data access mode - fast load because it is the most efficient and fastest way of loading data. However, with the default settings, you may see some issues if you try to capture error records while loading the destination, unless you change a few settings.

Let's make a simple data set and try to load to a table. I have following table created in tempdb.

USE tempdb

CREATE TABLE dbo.Product
(
 ProductId int IDENTITY(1,1) NOT NULL,
 ProductName varchar(200) NOT NULL,
 Class char(1) NOT NULL,
 Cost money NOT NULL,
 PRIMARY KEY CLUSTERED (ProductId ASC)
)
GO

ALTER TABLE dbo.Product  
ADD  CONSTRAINT CH_Product_Class 
CHECK  ((Class='A' OR Class='a' OR Class='B' OR Class='b'))
GO

ALTER TABLE dbo.Product  
ADD  CONSTRAINT CH_Product_Cost CHECK  ((Cost>(0)))
GO

I have a text file named Products.txt that contains 15 records as below;

ProductName,Class,Cost
P01,A,100
P02,B,150
P03,A,344
P04,A,555
P05,A,100
P06,B,150
P07,X,344
P08,A,555
P09,A,555
P10,A,100
P11,B,150
P12,A,344
P13,A,555
P14,A,344
P15,A,555

As you can see, the P07 record is not valid because it has X as the Class. Let's set up a package with a Data Flow Task and configure it to read data from this file and load it into the Product table as below.

I have used OLE DB Destination with its default settings.


And I have configured the Error Output to Redirect row on error.


Then I added a Recordset Destination just for capturing the error output. If you prefer, you can use a DataReader Destination as well. This is what the Data Flow looks like;


Now if I run the package, it retrieves all records but sends all of them to the Recordset Destination, even though only one record violates a constraint.


What is the reason for this? The reason is the value we have set (or the default value) for the Maximum insert commit size property. This property indicates the batch size that the OLE DB Destination tries to commit during the operation. Since we have only 15 records and the batch size is set to 2147483647, it tries to commit them all at once. Since there is an invalid record, the entire batch fails and none of the records get inserted (or committed).

This is fine with a small dataset like this, but it is not acceptable for a large dataset. Hence we need to make sure that it does not send all records to the error flow, but only an acceptable range.

To see this, let's set it to 5.


Here is the result of execution after changes;


This makes sure that at least the valid batches get inserted, without sending all records to the error flow.


However, you need to choose the batch size without degrading performance. If you set it to a small value, you may not get the expected performance from the fast-load option.


Thursday, November 24, 2016

SSIS Exception deserializing the package The process cannot access the file .ispac because it is being used by another process

For some reason, SQL Server Data Tools was crashing on a specific package, and I could not get it up and running again until I restarted the machine. However, I started experiencing the same again (not sure about the reason), and this is what I got when I tried to open and execute the package after the crash.

Exception deserializing the package "The process cannot access the file 'C:\Users\DineshPriyankara\Documents\Visual Studio 2015\Projects\TestDataFlowErrors\TestDataFlowErrors\bin\Development\TestDataFlowErrors.ispac' because it is being used by another process.".


When I checked Task Manager, I noticed that a few SSIS Debug Host processes were running;


Since I was not running any SSIS project, I decided to kill all SSIS related tasks, and of course that solved the problem. I was able to open and run the package without any issue.

Now, I am not sure whether this is the best way of solving the issue; I have asked some experts and am waiting for an explanation.

Thursday, September 22, 2016

Cannot execute as the database principal because the principal does not exist - Mismatched Security IDs

Are you experiencing this error with a newly restored database?

Cannot execute as the database principal because the principal "abc" does not exist, this type of principal cannot be impersonated, or you do not have permission.

SQL Server throws this error when a user tries to execute code but SQL Server cannot find the login account. This can happen when a database from one instance (let's say Server-01) is restored to a different instance (Server-02) of SQL Server, because doing so breaks the link between user accounts and login accounts. For example, assume that Server-01 has a database called Sales, a login called Jane, and a user called Jane in the Sales database linked to the Jane login. If the database is restored to Server-02, you will see the above-mentioned error because;
  1. A Jane login exists on Server-02, but it has a different SID that does not match the SID of the user Jane in the Sales database; or
  2. The Jane login does not exist.
If the reason is #1, you can re-map it; if the reason is #2, the login needs to be created and re-mapped.

Here is an example. It shows one way of producing the error and how to fix it. The following code;
  • Creates a database named Sales.
  • Creates a login called Jane and adds Jane as a user to the Sales database.
  • Configures appropriate permissions for Jane to read data.
  • Takes a backup of the Sales database.
  • Deletes the Sales database.
  • Deletes the Jane login.
  • Restores the Sales database.
  • Checks whether Jane can access the database.
  • Creates Jane's login again and tries.
  • Finds orphaned users.
  • Maps Jane's user to the newly created login.
USE master;
GO

-- create the database
CREATE DATABASE Sales
GO

-- create the login
CREATE LOGIN Jane WITH PASSWORD = 'Pa$$w0rd';

-- connect with Sales and add Jane as a user
USE Sales;
GO

-- create user
CREATE USER Jane FOR LOGIN Jane;
-- give permission for reading.
EXEC sp_addrolemember 'db_datareader', 'Jane';

-- change the security context and execute a code
-- Jane can read records
EXECUTE AS USER = 'Jane';
SELECT USER_NAME()
SELECT * FROM sys.objects;
REVERT;

-- Connect to master database
USE master;
GO

-- take a backup and delete the database and login
BACKUP DATABASE Sales TO  DISK = N'D:\MSSQL13.MSSQLSERVER\MSSQL\Backup\Sales.bak' 
WITH NOFORMAT, NOINIT

DROP DATABASE Sales;

DROP LOGIN Jane;

-- restore the database again. Note that
-- this has the user but the instance does not have the login
RESTORE DATABASE Sales FROM DISK = N'D:\MSSQL13.MSSQLSERVER\MSSQL\Backup\Sales.bak';

-- Connect with Sales
USE Sales;
GO

-- This throws the error
EXECUTE AS USER = 'Jane';
SELECT USER_NAME()
SELECT * FROM sys.objects;
REVERT;

-- Connect with master and re-create the login
USE master;
GO
CREATE LOGIN Jane WITH PASSWORD = 'Pa$$w0rd';

-- Connect again and check
USE Sales;
GO

-- This throws the error again
EXECUTE AS USER = 'Jane';

-- Checking orphaned users
-- This will show that Jane's user account is orphaned.
SELECT d.* 
FROM sys.database_principals d
 LEFT OUTER JOIN sys.server_principals s
  ON d.sid = s.sid
WHERE s.sid IS NULL and d.type = 'S';

-- Map the Jane's user account with her login
ALTER USER Jane WITH LOGIN = Jane;

Saturday, August 6, 2016

The shared dataset definition is not valid. Details: The required attribute 'Name' is missing

Are you experiencing this error when trying to deploy a shared dataset?


This has been mentioned in several forums and is considered a bug. I searched for a permanent solution but could not find one; therefore, this is what you have to do if you experience the same;

    1. Open the project folder and locate the Shared DataSet.


    2. Open the shared dataset using a Text Editor and add the Name Attribute as below.


    3. Now deploy and see. You should be able to do it.

Wednesday, August 3, 2016

Arithmetic overflow error converting expression to data type int - It works for larger number but not for a smaller number

One of my colleagues sent me a simple piece of code with two lines. The first line works without any issue, but the second throws an error. This is the code;

SELECT 9*10000000000000000
SELECT 9*1000000000

This is the output;


What is the issue? The first calculation is much larger than the second one, yet it works. The reason is how SQL Server evaluates the expression and decides the data types. If we consider the second line, the largest value in the expression is 1,000,000,000, which is lower than the highest value that int can hold. The maximum number int can hold is 2,147,483,647. Since both 9 and 1,000,000,000 fall into the int range, SQL Server sets the data type of the return value to int as well. However, when 1,000,000,000 gets multiplied by 9, the result exceeds the range. That is the reason for the error.

With the first line, 10,000,000,000,000,000 is treated as a bigint number, so the result is typed as bigint as well. Therefore it works without any issue.

If we write the query as below, it will work without any issue.

DECLARE @a bigint
SET @A = 10000000000000000
SELECT  9*@A

SET @A = 1000000000
SELECT  9*@A
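Alternatively, a one-line variant of the same idea is to cast either operand to bigint, which widens the whole expression:

```sql
-- The literal alone would be int, but the CAST makes the expression bigint
SELECT 9 * CAST(1000000000 AS bigint);
```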

Saturday, July 2, 2016

Connecting to the Integration Services - "Access is denied."

If the required permissions are not configured for Integration Services for your account, you will experience the following error when trying to connect to Integration Services;

Connecting to the Integration Services service on the computer "" failed with the following error: "Access is denied."


This clearly indicates that you need permission. How can we add the required permissions for an account?

Here are the steps for granting the required permissions;
1. Search for Component Services and open it.
2. Expand Component Services -> Computers -> My Computer -> DCOM Config.


3. Find Microsoft SQL Server Integration Services, right-click on it and select Properties. Go to the Security tab and add the accounts you need with the relevant permissions under Launch and Activation Permissions, Access Permissions, and Configuration Permissions by clicking the Edit button.



Click OK to save all. You need to restart the Integration Services service for the added permissions to apply. Once restarted, you will be able to connect without any issue.