SQL Server 2012 Reporting Services Blueprints

As if preparing for SQL Server Days wasn’t sufficient to keep me fully occupied, I was once again involved in the process of book publication…

And no, I didn’t write it – though I was asked to many months ago – no, that not-to-be-underestimated task fell to Marlon Ribunal (b|t) and Mickey Stuewe (b|t).

My role was Technical Reviewer, so I read and commented on all chapters regarding their technical content.  But that’s not all: I also pointed out typos; my drive for perfection can’t just ignore those :)

The book which I reviewed is called SQL Server 2012 Reporting Services Blueprints and is published by Packt Publishing.

So if you’re (relatively) new to SQL Server Reporting Services this should be a good read for you.

The book is mainly aimed at report developers though I’m sure it wouldn’t hurt if administrators had a read through it too.  In fact, a couple of the chapters contain material that should be known by every SQL admin who needs to support BI deployments.

The book starts with the basics and gradually builds up to designing more complex reports.  This gives everyone a good chance to understand the basics before delving into more complex material.  By the end of chapter 6 you’ll have used all components needed for SSRS report development, which is a good thing!

So go out, get a copy and start reading about John Kirkland’s tale!

Happy reading!

Valentino.


Regular readers of this blog know that I was one of the speakers at our Belgian SQL Server Days event last Thursday and Friday.

I have to admit that I underestimated the amount of work a little: working full time for a customer, preparing a one-hour session, preparing the SQL Server Days (luckily we’ve got a great team at the SQLUG for this part!) and still having some kind of personal life as well.

On day two we had the pleasure of welcoming 279 attendees all ready to attend sessions in our five different tracks.  And about 70 of those people were in my room!!  I had practically no stress before or during the start of my session, except for the moment when I had to get the microphone wire attached to my head somehow (luckily the technician was still around!)…  In fact, I was more stressed for my dry run session three days earlier.

Would I do it again?  Ow yeah, absolutely!  Even though it requires quite some time in preparation, the applause at the end of the session makes up for that!

And now the moment you’ve all been waiting for: my slides and demos are available for download from my SkyDrive.

References

For those seeking more information, here are some references related to my session content:

The official Data Quality Services blog (a lot of useful content on DQS!!)

SSIS DQS Matching Transformation

SSIS DQS Domain Value Import

Creating a Synchronous Transformation with the Script Component

Fuzzy Lookup Transformation

DQS Cleansing Transformation

Finding Similar Strings With Fuzzy Logic Functions Built Into MDS

Have fun, and see you next time!

Valentino.


SQL Server Days 2013

For the first time ever, I’m not only participating in the organization of the SQL Server Days, I’ll be speaking as well!  Exciting times!

In my session, Cleaning up the mess with SSIS and DQS, you’ll learn a couple of tricks on dealing with dirty data.

Here’s the abstract for my session:

Are you loading data from exotic sources such as Excel and flat files?

Or are you dealing with manually-entered data through an application that allows, well, practically anything?  And do you sometimes run into trouble because the incoming data is not as expected?

Then you should really join us in this session in which I’ll demonstrate (yes, demos!) several different techniques that can be used to cleanse your dirty data!

Check out the agenda for more details!

We’ve got speakers from all over the world: South Africa, Italy, Slovenia, the UK, the US and of course… Belgium!

Not registered yet?  Do it now!

Ow, and did you know we’ve introduced a special feature?  It’s called Bring a colleague for free.  When you register, you can register an additional person for free, as long as fewer than 50 of those free registrations have been claimed – so be fast!

Bring your colleague

In the meantime: have fun and see you there!

Valentino.


A Record Set

As you may already know, it is possible to use the Execute SQL Task to populate a package variable with a result set.

In case you’re not that familiar with this technique yet, here’s a quick overview of setting it up.  You just give it a query, set the ResultSet property to Full result set, and configure a package variable in the Result Set property window.  The package variable’s type is System.Object.

But what exactly is this mysterious System.Object and how can we explore it?  Well, that depends.  More precisely, it depends on the Connection Type which you’ve chosen in the Execute SQL Task properties.

Let’s explore two possibilities: ADO.NET and OLE DB.  Our end goal is straightforward: retrieve the number of records in the result set.

The query which I’m using in the Execute SQL task is this one:

select ProductAlternateKey
from dbo.DimProduct
where Color = 'blue'

On my AdventureWorksDW2012 database it returns 28 records.

Exploring the ADO.NET result set

The first step is finding out what type exactly this result set object is.  Hook up a Script Task to your Execute SQL task and put a breakpoint on it.  Now run your package and examine the Locals window:

Debugging the Control Flow to find the object type

Well look at that, it’s a System.Data.DataSet!  Using this knowledge it’s fairly simple to produce code that fetches the record count:

// Cast the package variable back to the DataSet it actually holds,
// then show the row count of the first (and only) result table.
DataSet ds = (DataSet)Dts.Variables["MyResultset"].Value;
MessageBox.Show(ds.Tables[0].Rows.Count.ToString());

Note: don’t forget to add the package variable to the ReadOnlyVariables before opening the code editor.

The System.Data namespace is included by default in the using statements, so no worries there.  We can just cast the variable to a DataSet.  The DataSet object contains a DataTableCollection called Tables.  As there’s only one result set, it is located at index zero.  We travel down the object tree to finally find the Count property of the Rows DataRowCollection.

And here’s the result:

The message box shows 28 items
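Beyond just counting, the same cast gives you access to the individual rows.  Here’s a minimal sketch, assuming the same MyResultset variable inside a Script Task (the FireInformation logging call is just one way to consume each value – this only runs inside the SSIS host, not as standalone code):

```csharp
// Sketch: iterate the rows of the result set inside the Script Task.
// Assumes the "MyResultset" variable from above; depends on the Dts host object.
DataSet ds = (DataSet)Dts.Variables["MyResultset"].Value;
bool fireAgain = true;

foreach (DataRow row in ds.Tables[0].Rows)
{
    // Each DataRow exposes the query's columns by name.
    string key = row["ProductAlternateKey"].ToString();

    // Log each value to the SSIS progress/log output.
    Dts.Events.FireInformation(0, "RowLoop", key, string.Empty, 0, ref fireAgain);
}
```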

That’s all there is to it, easy huh?  Let’s move on to our second option, OLE DB.

Exploring the OLE DB result set

Once again we start at the beginning: with the debugging of the Control Flow to find out what object type our mysterious System.Object is:

The OLE DB result set gives us a System.__ComObject, hmm...

Hmm, System.__ComObject, that’s … special.  Ow right, the OLE DB provider uses a COM wrapper.  How can we “unwrap” our object and introduce it to the .NET world?  Let’s see if we can find out what’s hidden behind that wrapper, by using the following code:

MessageBox.Show(Microsoft.VisualBasic.Information.TypeName(Dts.Variables["MyResultset"].Value));

TypeName is a VB.NET function and retrieves the data type of the parameter passed into it.

To get this to run in a C# SSIS task you first need to add the Microsoft.VisualBasic reference:

Adding a reference to the VB.NET assembly

Executing the package results in this:

Result type: Recordset

So, our result is Recordset, hmm, well, I think we more or less knew this already.  What kind of Recordset?  Well, an ADO Recordset.  We know this because the following code works:

// Let an OleDbDataAdapter pump the ADO Recordset into a .NET DataTable.
System.Data.OleDb.OleDbDataAdapter da = new System.Data.OleDb.OleDbDataAdapter();
DataTable dt = new DataTable();
da.Fill(dt, Dts.Variables["MyResultset"].Value);
MessageBox.Show(dt.Rows.Count.ToString());

Basically, we use the Fill method of the OleDbDataAdapter to fill a System.Data.DataTable with the data from the ADO Recordset.  The version of the method in our example (there are several overloads) accepts two parameters:

public int Fill(
    DataTable dataTable,
    Object ADODBRecordSet
)

With the DataTable filled, we once again have access to a Rows DataRowCollection, exactly the same as in our ADO.NET example in fact.  Executing the package now results in exactly the same message box as shown earlier: 28 records!

Beware of pitfalls

If you mix the two methods up you’ll get funky errors such as:

System.InvalidCastException: Unable to cast COM object of type ‘System.__ComObject’ to class type ‘System.Data.DataSet’. Instances of types that represent COM components cannot be cast to types that do not represent COM components; however they can be cast to interfaces as long as the underlying COM component supports QueryInterface calls for the IID of the interface.

and also

System.ArgumentException: Object is not an ADODB.RecordSet or an ADODB.Record.

So be careful, use the right object types for your particular System.Object.

Conclusion

In this article I’ve demonstrated a couple of methods which can be used to retrieve information from the mysterious System.Object result set in an SSIS package.

Have fun!

Valentino.

Additional References

Execute SQL Task

Result Sets in the Execute SQL Task

OleDbDataAdapter Class


T-SQL Tuesday #46: Rube Goldberg Machine

Here’s my participation in this month’s T-SQL Tuesday.  T-what, you say?  T-SQL Tuesday, the monthly blog party started by Adam Machanic!  Head over to this month’s invitation by Rick Krueger, T-SQL Tuesday #46: Rube Goldberg Machine, for the longer story.

So, a story about SQL Server-related contraptions, let’s see…  Yeah, I can do that!

Some years ago I was asked to help out in finding a solution to a certain problem, the life of a consultant, right?  The problem was performance related.  No, this is not a story about a database without any indexes.  Instead, it involves a certain .NET web application.  The end user would fill out a page with some metrics and the application would then perform certain calculations.  And that’s where we get to the issue: the calculations would take way too many minutes (yes, not seconds, minutes!) to finish.  That was the challenge.

The first step was to examine the .NET code that was performing those calculations.  But there were so many lines of code, complicated code, that the final conclusion was: hmm, rewriting this contraption is not an option right now…

After some discussion with the project lead it became clear that the end user would not necessarily need to wait for the calculation to finish.  It didn’t need to happen online.  Aha!  So if we could schedule these calculations and run them as a batch process, that would mean problem solved, right?  Now, how would we schedule this?  How about our favorite scheduler, the SQL Server Agent?  Yes, that’s it!

So I ended up building an assembly that was to be called from SQL Server: a CLR-enabled assembly!  The web application used web services so the new assembly would also call a couple of (reworked) "calculate" methods.  This meant we had to install the assembly with increased permissions: EXTERNAL_ACCESS to be precise.

Furthermore, I didn’t want to hardcode the path to the web service.  There were different environments, you know: DEV, UAT, PRD.  So it had to be configurable.  Now, as the assembly is called from SQL Server, I needed to provide the configuration through sqlservr.exe.config, which gets loaded when SQL Server starts.
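As an illustration, such a configuration could look something like the fragment below.  This is only a sketch: the file sits next to sqlservr.exe in the instance’s Binn folder, and the key name CalculationServiceUrl and the URL are hypothetical, not taken from the actual project:

```xml
<!-- Hypothetical sqlservr.exe.config fragment; key name and URL are made up for illustration. -->
<configuration>
  <appSettings>
    <add key="CalculationServiceUrl"
         value="http://uat-server/CalculationService.asmx" />
  </appSettings>
</configuration>
```

The CLR assembly can then read the value through ConfigurationManager.AppSettings, so only the config file differs between environments.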

As you can probably guess, this little project required a lot of investigation, trial and error, and so on.  But I learned a lot from it and that’s what I like!  And it also made me write the following article: Calling A Web Service From SQL Server 2005.

This article turned out to be quite popular, even today, almost five years later, it’s still in my top 10 most viewed pages of last month!

Most popular pages over one month

Was this a clean solution?  Well no, otherwise I wouldn’t have selected it for this month’s T-SQL Tuesday party, right?

Did it solve the issue?  It sure did!

‘till next time and in the meantime: have fun!

Valentino.


© 2008-2017 BI: Beer Intelligence? All Rights Reserved