February 2009


Since I participated in the SQL Server 2008 MCTS beta exams last year, I have wondered how many people would actually do this.  Now I have found part of the answer.

Microsoft keeps a count of all MCPs, grouped by certificate.  This page was called Number of Microsoft Certified Professionals Worldwide and is located on the Microsoft Learning site. Update: with a layout update to the Learning site it has been relocated to the By the Numbers tab on the Certification Overview.

Obviously, the certificates that I achieved are mentioned on the page, and I must say that I’m quite surprised.  These numbers do not reflect how many people took part in the beta exams, but everyone that has passed them since they became available (either beta or regular) is counted here.  I thought that the counts would be in the thousands range, but apparently not.

At this moment (numbers are current as of January 2009) there are 2,381,907 MCPs in the whole wide world.  Out of these, 464 people hold the SQL Server 2008, Implementation and Maintenance (70-432) certificate and only 188 passed the SQL Server 2008, Business Intelligence Development and Maintenance (70-448) exam!

Here are some more SQL Server-related certificate counts:

Microsoft credential | Certification | Worldwide achievers
MCTS | SQL Server 2005: Business Intelligence Development | 3,119
MCTS | SQL Server 2005 | 47,245
MCTS | SQL Server 2008, Database Development | 286
MCITP | Business Intelligence Developer | 1,194
MCITP | Business Intelligence Developer 2008 | 80
MCITP | Database Administrator | 7,200
MCITP | Database Administrator 2008 | 161
MCITP | Database Developer | 3,121
MCITP | Database Developer 2008 | 143
MCM | Microsoft Certified Master: Microsoft SQL Server 2008 | 2
MCA | Database | 18

To be allowed to participate in the MCITP (Microsoft Certified IT Professional) exams you need to hold the MCTS certification on the same subject.

When you have succeeded in both MCITP Database Developer AND MCITP Database Administrator the next step is MCM (Microsoft Certified Master).

And finally, if you still have not satisfied your hunger for certification, you can go for MCA (Microsoft Certified Architect).

Have a look at the Microsoft SQL Server 2008 Certification site if you want to invest in your future!


Earlier this week a colleague had an interesting question.  He was working on an application that uses a database containing more than 200 tables and wanted to find all columns in the database that are null for all records, in other words all empty columns.  This would give him an idea of possibly unused (i.e. obsolete) columns.

Obviously he did not want to do this manually.

So I wrote him this script:

/*
DESCRIPTION: Returns a list of all columns in current database
             where the column's value is null for all records.
WRITTEN BY:  Valentino Vranken
CREATED:     2009-02-15
VERSION:     1.0

COPIED FROM: http://blog.hoegaerden.be
*/
declare @tempTable table
(
    TableSchema nvarchar(256),
    TableName nvarchar(256),
    ColumnName sysname,
    NotNullCnt bigint
);

declare @sql nvarchar(4000);
declare @tableSchema nvarchar(256);
declare @tableName nvarchar(256);
declare @columnName sysname;
declare @cnt bigint;

declare columnCursor cursor local fast_forward for
    select TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME from INFORMATION_SCHEMA.COLUMNS
    where IS_NULLABLE = 'YES';

open columnCursor;

fetch next from columnCursor into @tableSchema, @tableName, @columnName;

while @@FETCH_STATUS = 0
begin
    -- use dynamic sql to get count of records where column is not null
    -- (QUOTENAME safely brackets names, even ones containing special characters)
    set @sql = N'select @cnt = COUNT(*) from ' + QUOTENAME(@tableSchema) + N'.' + QUOTENAME(@tableName) +
        N' where ' + QUOTENAME(@columnName) + N' is not null';
    -- print @sql; --uncomment for debugging
    exec sp_executesql @sql, N'@cnt bigint output', @cnt = @cnt output;

    insert into @tempTable select @tableSchema, @tableName, @columnName, @cnt;

    fetch next from columnCursor into @tableSchema, @tableName, @columnName;
end;

close columnCursor;
deallocate columnCursor;

select * from @tempTable where NotNullCnt = 0;

Yes, I know, it uses a cursor and dynamic SQL, but hey, it’s not something that will run every minute in a production environment.  And I couldn’t really think of an easy way to do this without those features.

For more info on dynamic SQL I would like to point you to the excellent website by Erland Sommarskog.  I’ve been reading his articles for several years now and keep coming back to them to refresh my memory.  The particular article here is called The Curse and Blessings of Dynamic SQL.

And finally, here’s a Microsoft Support article that explains how to capture the output of a dynamic query executed by sp_executesql.  According to the article this feature is not documented in the BOL.  I don’t think that statement is still valid: there’s even an example on the sp_executesql page in the BOL.
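The trick boils down to marking the parameter as output in both the parameter definition string and the call itself.  A minimal sketch (the @cnt name and the query against sys.objects are just illustrations):

```sql
declare @result bigint;
declare @stmt nvarchar(200);
set @stmt = N'select @cnt = COUNT(*) from sys.objects';

-- @cnt must carry the output keyword in the definition AND in the call
exec sp_executesql @stmt, N'@cnt bigint output', @cnt = @result output;

select @result;  -- now holds the count computed inside the dynamic statement
```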


BOL JAN 2009 update

The SQL Server 2008 Books Online have gotten a new update.  This time they are dated January 2009 with version number 10.00.1601.22.  So if you like to have the latest information available under that F1 button in SSMS, you have some downloading to do.

The download is available here: http://www.microsoft.com/downloads/details.aspx?FamilyId=765433F7-0983-4D7A-B628-0A98145BCB97&displaylang=en

In case you’ve missed the small issue related to the BOL that shipped with the RTM version of SQL Server 2008, have a look at this article on Tibor Karaszi’s blog.


Having trouble connecting to your 2005 Integration Services using Management Studio 2008?  Even when you only have the SQL Server Integration Services service (the 2005 version) running, and not version 10.0?


To make it even better, have you noticed that SQL Server Integration Services 10.0 will start up when you connect to it from SSMS 2008?

Well, that’s by design.  SSMS 2008 can only connect to SSIS 2008.  And SSMS 2005 can only connect to SSIS 2005.  So don’t go uninstalling the 2005 Management Studio to save some space if you use Integration Services 2005!

There’s some (limited) info on Microsoft Connect and on SQL Server Central.

Update: when trying to use SSMS 2008 to connect to SSRS 2005, an error coming from the Report Server WMI provider will show up complaining about a couple of invalid namespaces (Microsoft.SqlServer.Management.UI.RSClient and System.Management).  It has been reported as a bug on Microsoft Connect.  Microsoft’s feedback is that the issue is caused by a breaking change in the WMI namespace and that it won’t be fixed.

So there you go, neither Reporting Services nor Integration Services 2005 can be connected to using Management Studio 2008.


Sometimes you’re in a situation where your stored procedure needs to insert several new records and some fields are based on calculations, or one of the fields contains a kind of batch number that needs to increase for every record in which another field contains the same value.

In this situation you may have a habit of building a loop to insert the records, such as the following:

(I’ve used the Person.Address table from the AdventureWorks DB for demonstration purposes – it doesn’t contain too many fields :-) )

declare @counter int;
set @counter = 1;
while @counter <= 10000
begin
    insert into Person.Address (AddressLine1, AddressLine2, City,
        StateProvinceID, PostalCode, ModifiedDate)
    values ('the first address line', 'second add line', 'some city', 1,
        'PC-0000' + cast(@counter as varchar(10)), getdate());
    set @counter = @counter + 1;
end

Executing the loop above takes 3.5 seconds on my laptop.  This is not very long, but it can be better.

If you think about it, what is going on in those 3.5 seconds?  Not only are we inserting 10000 records, every index on that table is getting 10000 updates as well.  Besides the clustered index there are 3 additional indexes, so in total 40000 changes are being processed on the table.  And that’s not all: every insert happens inside its own transaction, so that’s 10000 transactions, and SQL Server needs to keep track of everything that has been modified, so writes to the transaction log are going on as well.
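By the way, one quick way to attack just the transaction overhead is to wrap the whole loop in a single explicit transaction, so all 10000 inserts get committed in one go.  A sketch of that variant (same Person.Address example, not benchmarked here):

```sql
declare @counter int;
set @counter = 1;

begin transaction;  -- one commit for all rows instead of 10000 implicit transactions
while @counter <= 10000
begin
    insert into Person.Address (AddressLine1, AddressLine2, City,
        StateProvinceID, PostalCode, ModifiedDate)
    values ('the first address line', 'second add line', 'some city', 1,
        'PC-0000' + cast(@counter as varchar(10)), getdate());
    set @counter = @counter + 1;
end
commit transaction;
```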

Now, how can we avoid this?  The answer lies in the table variable!  Create a table variable that will hold your new records, loop to create the records in the table variable and then, after the loop, insert all records from the table variable into your actual table.

-- declaration of table var
declare @table table
(
    [AddressLine1] [nvarchar](60) NOT NULL,
    [AddressLine2] [nvarchar](60) NULL,
    [City] [nvarchar](30) NOT NULL,
    [StateProvinceID] [int] NOT NULL,
    [PostalCode] [nvarchar](15) NOT NULL,
    [ModifiedDate] [datetime] NOT NULL
);
declare @counter int;
set @counter = 1;

while @counter <= 10000
begin
    -- insert in table var instead of actual table
    insert into @table (AddressLine1, AddressLine2, City,
        StateProvinceID, PostalCode, ModifiedDate)
    values ('the first address line', 'second add line', 'some city', 1,
        'PC-0000' + cast(@counter as varchar(10)), getdate());
    set @counter = @counter + 1;
end

-- insert in actual table
insert into Person.Address (AddressLine1, AddressLine2, City,
    StateProvinceID, PostalCode, ModifiedDate)
select * from @table;

This now executes in 453 milliseconds, isn’t that great?  That makes it almost eight times faster!

The main reason is that we’ve reduced the number of transactions on the actual table to just one, because the data is inserted in one go.  And the changes to the indexes all happen at once, which should be better as well.

In my opinion, in specific situations this can be an interesting concept to optimize your code performance.

See Books Online for more info on the table variable itself: http://msdn.microsoft.com/en-us/library/ms175010.aspx
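For completeness: on SQL Server 2005 and up you can even avoid the loop altogether by generating the rows in a single set-based statement, for instance with a recursive CTE.  A sketch of that approach (same illustrative values as above, not benchmarked here):

```sql
-- generate the numbers 1..10000 with a recursive CTE and insert in one statement
with numbers (n) as
(
    select 1
    union all
    select n + 1 from numbers where n < 10000
)
insert into Person.Address (AddressLine1, AddressLine2, City,
    StateProvinceID, PostalCode, ModifiedDate)
select 'the first address line', 'second add line', 'some city', 1,
    'PC-0000' + cast(n as varchar(10)), getdate()
from numbers
option (maxrecursion 10000);  -- default recursion limit is only 100
```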


© 2008-2017 BI: Beer Intelligence? All Rights Reserved