Saturday, April 26, 2025

Case Sensitivity in XML Extraction

I recently encountered an interesting issue where XML elements had similar names that differed only in capitalization.

Let's demonstrate it with an example.

First, we'll declare an XML variable and assign the simplest XML code to it:

DECLARE @xml XML = '
<ROOT>
    <ELEMENT NAME="ElementName" VALUE="PascalCase" />
    <ELEMENT NAME="ELEMENTNAME" VALUE="UPPERCASE" />
    <ELEMENT NAME="elementname" VALUE="LOWERCASE" />
</ROOT>';

SELECT @xml;

Now we'll try to extract the element named "ElementName" from that XML:
DECLARE @RequestElement SYSNAME = 'ElementName';

SELECT [Case] = 'Not case sensitive'
    , ElementName = C.value('@NAME', 'VARCHAR(100)')
    , ElementValue = C.value('@VALUE', 'VARCHAR(100)')
FROM @xml.nodes('//ELEMENT') AS X(C)
WHERE C.value('@NAME', 'VARCHAR(100)') = @RequestElement;
That query returned all three elements, while I expected only a single one:

That happened because SQL Server is not case sensitive by default.
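As a side note, a quick way to check the current database collation (not part of the original example) is:

SELECT [DatabaseCollation] = DATABASEPROPERTYEX(DB_NAME(), 'Collation');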
To return only the record I need, I could either switch the whole database I'm working in to a case-sensitive collation, or simply add a case-sensitive collation to my WHERE clause, like this:
DECLARE @RequestElement SYSNAME = 'ElementName';

SELECT [Case] = 'Case sensitive using Collation'
    , ElementName = C.value('@NAME', 'VARCHAR(100)')
    , ElementValue = C.value('@VALUE', 'VARCHAR(100)')
FROM @xml.nodes('//ELEMENT') AS X(C)
WHERE C.value('@NAME', 'VARCHAR(100)') = @RequestElement COLLATE Latin1_General_CS_AS;

That code returned only the single row I wanted:


Do you think the case is closed?
Not at all — there’s an alternative solution that may be even more elegant:
DECLARE @RequestElement SYSNAME = 'ElementName';

SELECT [Case] = 'Case sensitive via parameter'
    , ElementName = C.value('@NAME', 'VARCHAR(100)')
    , ElementValue = C.value('@VALUE', 'VARCHAR(100)')
FROM @xml.nodes('//ELEMENT[@NAME=sql:variable("@RequestElement")]') AS X(C)
It also returns a single row, but without the need to worry about collation:


In this final example I've used my SQL variable "@RequestElement" as a parameter inside the XML query, which clearly demonstrates the difference between filtering in the "WHERE" clause and filtering in the XML node parameter:

- Filtering in the "WHERE" clause depends on the SQL Server database collation, while an XML query is always case sensitive.
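For completeness, here is one more case-sensitive variant, my own sketch rather than part of the original post, using the exist() method instead of a path predicate:

DECLARE @RequestElement SYSNAME = 'ElementName';

SELECT [Case] = 'Case sensitive via exist()'
    , ElementName = C.value('@NAME', 'VARCHAR(100)')
    , ElementValue = C.value('@VALUE', 'VARCHAR(100)')
FROM @xml.nodes('//ELEMENT') AS X(C)
WHERE C.exist('@NAME[. = sql:variable("@RequestElement")]') = 1;

Like the sql:variable() predicate, exist() compares values inside XQuery, so it is case sensitive regardless of the database collation.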



Friday, February 7, 2025

SSIS: REPLACENULL does not support (DT_DBTIMESTAMP2,7)

 Using the "REPLACENULL" functionality frequently in the "Derived Column" component, the "Conditional Split" component, and other places in SSIS where formulas can be applied is common.

However, I recently encountered an issue with the "DT_DBTIMESTAMP2" data type.

The following formula produced an error:

REPLACENULL(TestDt, (DT_DBTIMESTAMP2,7)"1900-01-01 00:00:00.0000000")

Error: 0xC020902A at Test Transformation, Derived Column [2]: The "Derived Column" failed because truncation occurred, and the truncation row disposition on "Derived Column.Outputs[Derived Column Output].Columns[TestDt]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.

This error occurs because "REPLACENULL" returns a "DT_WSTR" data type. To make it work, we need to convert "DT_DBTIMESTAMP2" to "DT_WSTR" and then convert it back to "DT_DBTIMESTAMP2", like this:

(DT_DBTIMESTAMP2,7)REPLACENULL((DT_WSTR,30)TestDt,"1900-01-01 00:00:00")


Alternatively, a more elegant solution is to replace "REPLACENULL" with a conditional (? :) expression, where both branches are already of type "DT_DBTIMESTAMP2", so no string conversion is involved:

ISNULL(TestDt)?(DT_DBTIMESTAMP2,7)"1900-01-01 00:00:00":TestDt




Thursday, January 30, 2025

SSIS warning message about SSAS Cube processing: "Operation completed with XXX problems logged"

If you process SSAS cubes via SSIS packages, you might notice a weird message like "Full Processing:Warning: Server: Operation completed with XXX problems logged."

How you can see that message (if you have the problem):

1. You can open an execution report for your package and get something like this:

SSIS processing SSAS Cube Warning message


2. You can run a T-SQL script against your SSIS server:

SELECT TOP 100 message_time, message
FROM [SSISDB].[internal].[operation_messages]
WHERE message_type = 110 AND message_source_type = 40
   AND message LIKE '%Warning: Server: Operation completed with%problems logged.'
ORDER BY message_time DESC;

If you have the problem, you will see something like this:


The Problem.

1. The SSIS server does not provide any details about that warning or the associated problems.
2. The SSAS server also does not report any problems associated with that cube processing.

Solution.

You can use "Extended Events" to capture these problems:

1. For that, you have to create an Extended Events session by running the following script on your SSAS server:

<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Trace>
      <ID>SSAS_CubeProcessing_Stream</ID>
      <Name>SSAS_CubeProcessing_Stream</Name>
      <XEvent xmlns="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
        <event_session name="SSAS_CubeProcessing_Stream" dispatchLatency="0" maxEventSize="0" maxMemory="4096" memoryPartition="none" eventRetentionMode="AllowSingleEventLoss" trackCausality="true" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <event package="AS" name="ProgressReportError" />
          <target package="package0" name="event_stream" />
        </event_session>
      </XEvent>
    </Trace>
  </ObjectDefinition>
</Create>

2. Then run "Watch Live Data" for that session.

3. Run your SSIS cube processing package and monitor the events.

As a result, you'll get something like this:


The problem description can be found in the event details under the "TextData" field:


At the end, do not forget to delete your Extended Events monitoring session.
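For reference, a minimal cleanup sketch (assuming the session ID used in step 1) would look like this:

<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <TraceID>SSAS_CubeProcessing_Stream</TraceID>
  </Object>
</Delete>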

Monday, May 13, 2024

Caveats of using an Expression for SQL Script in SSRS.

Why: Most of the time, when you need flexibility in a SQL query, parameterization is enough. However, there are situations where you need to build a dynamic query. In my case, I used a SQL query inside an expression so I could feed the exact same query to multiple data sources targeting different servers.

DataSet creation: Create a simple dataset.

I've created a sample dataset with a sample query:

-- That is a sample query
select top 10 * 
from sys.messages


After creating the dataset, we can add a Tablix to our report and test it:


Creation of an Expression

1. Create a new parameter called "SQLExpression"


2. Go to the "Default Values" tab, select "Specify values", and click the "fx" button

3. Specify Expression value:

Assign the Expression to the DataSet: Return to the sample dataset and specify the newly created parameter as the source for the dataset expression.

Within the expression, replace the query with the parameter:
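Assuming the parameter name from step 1, the dataset query expression should end up being simply:

=Parameters!SQLExpression.Value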
Save the changes and try your report.
If you've done exactly as I did, you'll get an empty report.

Troubleshooting: If you made your parameter "Visible", you might notice that the report processor collapsed all three lines of our query into a single line. Because the query starts with a double-dash comment, the whole flattened text became one comment, which is why the report came back empty.


Other cases: My case was very simple, but in most cases you'll get a weird error message for your query, even though the query itself runs fine in SSMS.

Lessons Learned: When we use SQL query expressions, we have to follow these rules:
1. Use only block comments (/* ... */), or do not use comments at all.
2. Put at least one space or tab before the very first symbol on each line.
3. Use a semicolon to separate multiple SQL statements.
4. Do not use the "GO" command.

The Fix: Change the default value of the "SQLExpression" parameter to the following query:

/* That is the Fixed SQL Query
 refurbished for SSRS expression usage*/

 DECLARE @MessageID INT = 101;
 SELECT *
 FROM sys.messages
 WHERE message_id = @MessageID;

Save the parameter and re-run the report.

Enjoy the result of your SQL Query Expression:

Conclusion: Using SQL query expressions is very easy if you follow those four rules.

Please let me know if you hit any other situation, and I'll add it here for others to avoid.



Sunday, February 7, 2021

Monitoring Analysis Server (SSAS) using Spotlight (Quest Software)

This post answers two simple questions:

1. Can Quest Software's Spotlight successfully monitor SQL Server Analysis Server?

2. If it can, which SSAS parameters and which database and cube details does it monitor and provide information about?


To answer these questions, I had to download, install, and configure the Spotlight software.

I've downloaded the trial version from the official Quest Software web site:

https://www.quest.com/products/spotlight-on-sql-server-enterprise/


So, the answer to the first question is "Yes".

The newest 13.3 version of the application monitors not only the Windows box and SQL Server, but Analysis Services as well!


Fully configured on my local machine, it looks like this:


I've configured it to monitor my local SQL Server, the Windows box, and the Analysis Services server.
I'm not really interested in Windows or SQL Server, so I click "Analysis Services".

It follows the common Quest Software design, so I don't have to learn how to use it.
The screen provides the following metric categories:
  • Connections
  • Memory
  • Storage
  • CPU
  • Server
Clicking on a metric opens a menu with the details.
For instance, when I click on "Connections" I get the following drop-down menu:

When I click on "Eye" sign I can see the list of current connections with all their parameters:

From there I can easily switch to the current "Sessions" tab:

Or just to the list of the most recent commands. If I select a command, I can see the most recent query for that connection:

The Memory section allows me to see general memory usage, caching metrics, and thread stats:

The Storage section provides information about my SSAS databases.
In this case I have only one, the AdventureWorks DB:

And from that screen I can easily switch to "Cubes"

The CPU section reports only general box-level processor metrics, such as CPU utilization, the number of "Interrupts", "Kilobytes Transferred", and "Server Work Queues":

When I switch to the "Processing" tab, I can see other box metrics, such as queue length, thread and process counts, and context switching:

The "Server" section is least informative. It provides me only my server version and edition and status of Disk and Memory:


Besides all this semi-static data, Spotlight is also supposed to provide live data from my SSAS instance:
  • Queries Answered/s
  • Queries Requested/s
  • Queries From Cache Direct/s
  • Queries From File/s
  • Temp File Rows Written/s
  • Rows Read/s
  • Rows Written/s
  • Direct Hit Ratio
  • Current Latch waits
  • Current Lock waits
  • Query Pool Job Queue Length
  • Etc.

Unfortunately, I have only one database and couldn't generate enough activity to raise those numbers above zero.


I hope my info helped you answer questions similar to mine.

Stay tuned: I plan to try other SSAS monitoring tools and go beyond them.

Monday, July 27, 2020

SSRS. Use of Lookup type functions.

There are a lot of blog posts on the Internet about how to use Lookup and other functions of this type in SSRS.
They are very straightforward and easy to use.
However, I managed to make a mistake, and I assume you have the same problem if you've landed on this page.

The Error

First, here is the error I got when I tried to use the Lookup function in SSRS:
The expression used for the calculated field 'ProductName' includes an aggregate, RowNumber, RunningValue, Previous or lookup function. Aggregate, RowNumber, RunningValue, Previous and lookup functions cannot be used in calculated field expressions.

How did I get there?

First, I created two datasets, "WorkOrders" and "Products".

Then I wanted to include the "Product Name" in my "Work Order Report".
Here is how I did it:

The WRONG WAY

(that is how it should NOT be done)
Obviously, since I was using a function, I thought I had to use it in a calculated field.

But That is Wrong

You should do it the right way:

The RIGHT WAY

That is not obvious, but you have to use "Query Field" to use the Lookup function:

Then you just have to specify the new field's name and insert a formula. Lookup takes four arguments: the value to match from the current dataset, the field to match in the lookup dataset, the field to return, and the name of the lookup dataset. In my case I used the following:

=Lookup(Fields!ProductID.Value, Fields!ProductID.Value, Fields!Name.Value, "Products")


That worked very well for me, and I hope this post will make you struggle less.
Good Luck


Wednesday, February 12, 2020

Use of Uniqueidentifier in Persisted calculated column.

This post is for you if you decide to use a Uniqueidentifier column in your table and then think about including it in a persisted calculated column.
You might also find it useful if you like weird or funny SQL Server behavior.

First, here is the SQL Server version I observed the problem on (Microsoft might fix the bug in a later build):
Microsoft SQL Server 2016 (SP2-CU11) (KB4527378) - 13.0.5598.27 (X64)

To demonstrate the problem in the simplest way, you can run the following script:
DROP TABLE IF EXISTS #tbl_Error_Test;
GO
CREATE TABLE #tbl_Error_Test(
       GuidOne UNIQUEIDENTIFIER,
       -- Persisted calculated column: the GUID cast to CHAR(36)
       GuidTwo as CAST(GuidOne as CHAR(36)) PERSISTED
  );
GO
INSERT INTO #tbl_Error_Test(GuidOne) VALUES (NewID()), (NewID());
GO
SELECT * FROM #tbl_Error_Test;
GO

It will return something like this:

As you can see, columns GuidOne and GuidTwo are different.
Moreover, if you run the following command, you will get a very unpleasant error:
DBCC CHECKTABLE('#tbl_Error_Test') WITH NO_INFOMSGS, EXTENDED_LOGICAL_CHECKS;

Msg 2537, Level 16, State 106, Line 152
Table error: object ID -1485216446, index ID 0, partition ID 6052840780930088960, alloc unit ID 2738196559872000000 (type In-row data), page (4:1079768), row 0. The record check (valid computed column) failed. The values are 2 and 0.
Msg 2537, Level 16, State 106, Line 152
Table error: object ID -1485216446, index ID 0, partition ID 6052840780930088960, alloc unit ID 2738196559872000000 (type In-row data), page (4:1079768), row 1. The record check (valid computed column) failed. The values are 2 and 0.
CHECKTABLE found 0 allocation errors and 2 consistency errors in table '#tbl_Error_Test_____________________________________________________________________________________________________000000001926' (object ID -1485216446).

repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKTABLE (tempdb.dbo.#tbl_Error_Test_____________________________________________________________________________________________________000000001926).

If you try to check what the correct value should be, it won't do you any good:
SELECT GuidOne, GuidThree = CAST(GuidOne as CHAR(36))
FROM #tbl_Error_Test;


SQL Server is smart enough to recognize the same expression, so instead of recalculating the value it returns the value already stored in the table.

The only trick that works is to change the expression and convert the GUID to 37 characters instead of 36:
SELECT GuidOne, GuidFour = CAST(GuidOne as CHAR(37))
FROM #tbl_Error_Test;

Now you get the right result, but how do you fix the data in the table?

It is not easy, but it is possible. You just have to replace the GUID column with its own value via a self-join:
UPDATE G1 SET GuidOne = G2.GuidOne
FROM #tbl_Error_Test as G1
INNER JOIN #tbl_Error_Test as G2
       ON G1.GuidOne = G2.GuidOne;
GO
SELECT * FROM #tbl_Error_Test;
GO


The data is fixed now, and the DBCC CHECKTABLE error goes away as well.
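To verify, you can re-run the same check from above; it should now complete without reporting consistency errors:

DBCC CHECKTABLE('#tbl_Error_Test') WITH NO_INFOMSGS, EXTENDED_LOGICAL_CHECKS;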

If you see the same behavior on your SQL Server you can vote for that bug to be fixed at Microsoft site:
https://feedback.azure.com/forums/908035-sql-server/suggestions/39694663-use-of-guid-column-in-persisted-calculated-column