Loading Complex XML Using SSIS
In my previous article I showed how the XML Source component can be used to load XML files into a SQL Server database, using fairly simple XML structures. In this follow-up article I will demonstrate how to tackle more complex XML.
The Complex XML Example
You probably know that SSRS reports (RDL files) are actually XML files. And they're not the easiest types of XML files around: still readable to humans, but the structure can be quite complex. So there we've got our example: an RDL. More specifically, I'll be using the RDL that's available for download in one of my earlier articles.
Every good example has got a goal. Our goal today is to retrieve a list of datasets and fields as defined in the RDL. Shouldn’t be too difficult, right?
Using The XML Source Component
Let’s try to get this done through the XML Source component with which we’re very familiar by now. You know the drill: drag an XML Source into your Data Flow, open it up and configure the XML and XSD locations.
Note: to be able to do this I cheated a bit by manually manipulating the RDL a little. More precisely, I removed all the namespace references from the <Report> tag and stripped the "rd:" prefixes further down the XML.
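If you'd rather not strip those namespaces by hand, a small script can do it for you. This is just a convenience sketch using Python's standard library ElementTree, not part of the SSIS solution itself; the sample RDL snippet and the "rd" designer namespace URI shown here are illustrative assumptions:

```python
import re
import xml.etree.ElementTree as ET

# A tiny stand-in for an RDL: a default namespace plus a designer ("rd:") namespace.
rdl = """<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition"
        xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner">
  <DataSets>
    <DataSet Name="dsProductList" rd:SomeAttr="x"/>
  </DataSets>
</Report>"""

root = ET.fromstring(rdl)
for elem in root.iter():
    # ElementTree stores namespaces as a "{uri}" prefix on each tag; drop it.
    elem.tag = re.sub(r"\{.*?\}", "", elem.tag)
    # Remove attributes that live in a namespace (the rd: ones).
    for key in [k for k in elem.attrib if k.startswith("{")]:
        del elem.attrib[key]

cleaned = ET.tostring(root, encoding="unicode")
```

After this, `cleaned` contains the same structure with no xmlns declarations and no rd: attributes, which is what the XML Source (and the XSLT later in this article) expects.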
With both files configured, let’s have a look at the Columns page:
Look at that massive list of output flows! In total I got 45 of them, all for free! Even if you're up to the task of creating 45 output tables, do you really want to figure out how to get them all joined together? To avoid creating that bunch of tables you might consider the Merge Join component… used 45 times in your data flow. Didn't think so!
Sure, it would run fine if you manage to get it all constructed. But in my opinion this is just too silly to try out because there’s an interesting alternative.
And that alternative is XSLT – eXtensible Stylesheet Language Transformations.
With XSLT you describe what you want to retrieve from the XML document and what it should look like. In this example we’ll be retrieving the list of datasets and their fields, in CSV format. CSV stands for Comma-Separated Values, although I prefer the term “Character-Separated Values” as the separator is not always a comma.
To be able to write correct XSLT, you need to know what the XML structure looks like. Here are the first 31 lines of the sample RDL file mentioned earlier.
<?xml version="1.0" encoding="utf-8"?>
<Report>
  <AutoRefresh>0</AutoRefresh>
  <InitialPageName>A Very Unique Name</InitialPageName>
  <DataSources>
    <DataSource Name="srcContosoDW">
      <DataSourceReference>ContosoDW</DataSourceReference>
      <SecurityType>None</SecurityType>
      <DataSourceID>b7a3d32c-e95d-4acf-bb99-9d60755303ea</DataSourceID>
    </DataSource>
  </DataSources>
  <DataSets>
    <DataSet Name="dsProductList">
      <Query>
        <DataSourceName>srcContosoDW</DataSourceName>
        <CommandText>select DPC.ProductCategoryName, DPS.ProductSubcategoryName, DP.ProductName
          from dbo.DimProduct DP
          inner join dbo.DimProductSubcategory DPS on DPS.ProductSubcategoryKey = DP.ProductSubcategoryKey
          inner join dbo.DimProductCategory DPC on DPC.ProductCategoryKey = DPS.ProductCategoryKey;</CommandText>
      </Query>
      <Fields>
        <Field Name="ProductCategoryName">
          <DataField>ProductCategoryName</DataField>
          <TypeName>System.String</TypeName>
        </Field>
        <Field Name="ProductSubcategoryName">
          <DataField>ProductSubcategoryName</DataField>
          <TypeName>System.String</TypeName>
        </Field>
As you can see, the main node is called Report. Nested under Report we’ve got DataSets, which can have several DataSet elements. Each DataSet has a set of Fields with one or more Field elements. Using that information we come to the following XSLT.
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="2.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:fn="http://www.w3.org/2005/xpath-functions">
<xsl:output method="text" version="1.0" encoding="UTF-8" indent="no"/>
<xsl:template match="/">
<xsl:text>DataSource;DataSet;Field</xsl:text>
<xsl:text>&#13;&#10;</xsl:text>
<xsl:for-each select="Report/DataSets/DataSet/Fields/Field">
  <xsl:text>"</xsl:text>
  <xsl:value-of select="../../Query/DataSourceName"/>
  <xsl:text>";"</xsl:text>
  <xsl:value-of select="../../@Name"/>
  <xsl:text>";"</xsl:text>
  <xsl:value-of select="@Name"/>
  <xsl:text>"</xsl:text>
  <xsl:text>&#13;&#10;</xsl:text>
</xsl:for-each>
</xsl:template>
</xsl:stylesheet>
So, what does the XSLT describe? On line three we say that the output should be text in UTF-8 encoding. The "template match" on the fourth line takes the whole XML document into consideration, hence the forward slash. Then on line five we start writing output through the xsl:text tag: this is our header line. As you can see, we're using the semicolon as column separator in the CSV output. Line six adds a CRLF (carriage return + line feed) to the output.
Then the fun part starts. If you have experience with XPath, the way XSLT walks through the XML document should look familiar to you.
The xsl:for-each tag loops over all the Fields in all the DataSets in the document.
Using the xsl:value-of tag, we can fetch values out of the XML. The first value retrieved is the name of the data source that the dataset is using. (I've added the data source retrieval to demonstrate how element values are fetched.) The full path to that element is Report/DataSets/DataSet/Query/DataSourceName. Because the for-each positions us on a Field element, we use the double-dot syntax to navigate two levels up in the XML tree (to the DataSet) and then descend into Query/DataSourceName; the element's value is retrieved simply by selecting it, as demonstrated in the XSLT above.
The next value-of tag retrieves the Name attribute of the DataSet, hence the two levels up, and the final value-of fetches the Name attribute of the Field element.
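If you want to sanity-check the XSLT logic outside SSIS, the same walk over the document can be expressed with Python's standard library ElementTree. This is only an illustrative sketch of the extraction logic (the inline RDL fragment is a trimmed-down assumption), not part of the SSIS package:

```python
import xml.etree.ElementTree as ET

# Trimmed-down RDL fragment mirroring the structure described above.
rdl = """<Report>
  <DataSets>
    <DataSet Name="dsProductList">
      <Query><DataSourceName>srcContosoDW</DataSourceName></Query>
      <Fields>
        <Field Name="ProductCategoryName"/>
        <Field Name="ProductSubcategoryName"/>
      </Fields>
    </DataSet>
  </DataSets>
</Report>"""

root = ET.fromstring(rdl)
lines = ["DataSource;DataSet;Field"]  # header line, like the xsl:text on line five
for dataset in root.findall("DataSets/DataSet"):
    source = dataset.findtext("Query/DataSourceName")
    for field in dataset.findall("Fields/Field"):
        # Same three values the xsl:value-of tags pick out, semicolon-separated.
        lines.append(f'"{source}";"{dataset.get("Name")}";"{field.get("Name")}"')
csv_output = "\n".join(lines)
```

Note how the nested for loops correspond to the single xsl:for-each: the XPath in the select attribute flattens both levels of iteration into one expression.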
Now that the XSLT is clear for everyone, how do we apply it to our XML document? Here comes the time for SSIS once more!
Open up BIDS with the Control Flow of an SSIS package active and throw in an XML Task component.
Double-click the component to open up the XML Task Editor. This is what it looks like by default:
As this is an all-round XML task that can handle several XML-related tasks, the first setting that we need to modify is called OperationType. That’s not too complicated because it comes with a dropdown and XSLT is one of the possible values.
With XSLT selected, the editor transforms into the following:
Now we need to configure where the task can find our XML file, through the Source property. Click the Source textbox to make the dropdown appear and select <New File connection…>.
In the File Connection Manager Editor, leave the Usage type at Existing file and select the RDL.
Next up we’re going to specify where the task can find the XSLT that needs to be applied to the XML. That can be done through the Second Operand settings. As SecondOperandType, select File Connection. Use the dropdown of the SecondOperand property to create a second new file connection that points to your XSLT file.
With that set up as well, only one step remains. The task still doesn’t know where the output should be saved. Or that it actually should get saved. So first switch the SaveOperationResult property to True. As you can see, DestinationType is already set to File Connection, that’s what we need. Use the dropdown of the Destination property to create a third new file connection. This time however, Usage Type should be set to Create File. Specify path and filename for the output file and click OK to close the File Connection Manager Editor.
This is what our XML Task now looks like in the editor:
As shown above, I’ve called the output file DatasetInfo.csv.
One more property that can be interesting is the OverwriteDestination property. Setting it to True can ease the testing of your package if you need to execute it multiple times. Which you’ll probably want when your XSLT is not giving the expected output. Don’t forget to set it to False afterwards (depending on what behavior you actually expect from your package).
Okay, now close the XML Task Editor and execute the package. If you haven't made any mistakes, the task should turn green and you should have an extra file somewhere on your hard drive. Here's what the content of my DatasetInfo.csv looks like:
Look at that, a list of fields, all part of the dsProductList dataset.
“Hang on, wasn’t this article going to demonstrate how to get complex XML files imported into our database? And now you’re writing the data to a file?!”
Well yeah, you're right. Unfortunately the XML Task does not offer the possibility to write to a table in a database. So to get the data into your database you'll need to set up a Data Flow that imports the CSV file. But that shouldn't be too difficult to achieve, right?
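To illustrate that final import step outside SSIS, here's a hedged sketch that loads the semicolon-separated output into a table. It uses an in-memory SQLite database purely as a stand-in for SQL Server, and the table name DatasetInfo is my own choice, not anything the XML Task dictates:

```python
import csv
import io
import sqlite3

# The kind of output the XML Task produced (semicolon-separated, quoted values).
csv_text = '''DataSource;DataSet;Field
"srcContosoDW";"dsProductList";"ProductCategoryName"
"srcContosoDW";"dsProductList";"ProductSubcategoryName"
'''

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DatasetInfo (DataSource TEXT, DataSet TEXT, Field TEXT)")

# DictReader handles the quoting and the semicolon separator for us.
reader = csv.DictReader(io.StringIO(csv_text), delimiter=";")
conn.executemany(
    "INSERT INTO DatasetInfo VALUES (:DataSource, :DataSet, :Field)",
    reader,
)
conn.commit()
count = conn.execute("SELECT COUNT(*) FROM DatasetInfo").fetchone()[0]
```

In a real package the equivalent is a Flat File Source (delimiter set to semicolon, text qualifier set to the double quote) feeding an OLE DB Destination.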
With this article I have shown how Integration Services can be used to retrieve data out of complex XML files without actually using the XML Source component. I hope you've enjoyed reading it as much as I enjoyed writing it. Or maybe you know another interesting method to get complex XML imported? Feel free to post comments!
If, after reading and applying the above technique, you are struggling to get special characters such as é, è, ö or even ô imported, make sure to read my follow-up article on SSIS, Flat Files and Accents. It also gives some more insight into what the above method actually produces (code page UTF-8 is a hint).