April 2006 - Posts

Service Pack 1 Released!

Service Pack 1 is here, and it fixes loads of items, some of which were quite embarrassing when training or developing a solution with a client standing behind you. The largest enhancement is the addition of database mirroring, which allows you to add high availability to your database by sending transactions to another server in real time. There are loads of SSIS fixes and enhancements as well. At a quick rolled-up level, here are the SSIS fixes:

Usability of the Import/Export Wizard has been improved in multiple-table scenarios.
The IDtsPipelineEnvironmentService service lets custom data flow components have programmatic access to the parent Data Flow task.
Interoperability with Analysis Services has been improved. The DataReader source in SSIS supports the System.Object data type by converting columns that have this data type to the DT_NTEXT data type. To change the data type to a type that is more appropriate for your data, you can add a Data Conversion transformation.
Performance has been improved for many transformations, such as the Sort transformation.
Designer usability has been improved. For example, you can now right-click the Data Flow task and then click Execute Task to execute only that task, without having to switch to the control flow first.
The Expression Builder dialog box now has a public and documented API. This increased access to expression-related objects will benefit task developers because expressions are important to the way that tasks are interrelated.
We have added a registry-based policy for handling SSIS package signatures as a package is executed. For example, administrators can disable loading of unsigned packages and untrusted packages. Because these settings are in the registry, administrators can propagate them across the domain by using Microsoft Windows Group Policy settings.
In the Advanced Editor dialog box, the Flat File source has the new property UseBinaryFormat. This property supports loading packed decimal data into the pipeline for processing by a script or by a custom transformation.
In the Advanced Editor dialog box, the DataReader source has the new property CommandTimeout. You can use this property to modify the time-out period to allow for long-running operations.
To create or to modify the property expressions of variables, you can now open the Expression Builder dialog box from the Properties window.
You can now add annotations to precedence constraints.


It's certainly a must-download for anyone using SQL Server 2005.

-- Brian Knight

Using a Script Transform in SSIS

In SSIS, the data you receive from the source is rarely in the format your end users would like to see. To correct this, you can use the Script Component to transform data. In this demo, Brian shows you how to use the Script transform to make a column proper case in the data flow (free registration required).
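The proper-casing logic itself is simple. In SSIS 2005 a Script Component is written in VB.NET, but a quick sketch of the transformation logic in Python shows the idea (the function name and sample values are illustrative only):

```python
def to_proper_case(value):
    """Capitalize the first letter of each word and lowercase the rest."""
    if value is None:
        return None  # pass NULLs through untouched, as the pipeline would
    return " ".join(word.capitalize() for word in value.split(" "))

print(to_proper_case("jOHN van DYKE"))  # -> John Van Dyke
```

Inside the actual Script Component, this logic would run once per row against the input column you select on the component's Input Columns page.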


-- Brian Knight

Using Checkpoints in SSIS

SQL Server Integration Services checkpoints enable you to restart a package from the point where it failed, so you can avoid re-running a six-hour package from the beginning. Learn how to use this valuable feature with this quick how-to video (registration required).
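As a quick reference, enabling checkpoints comes down to a few package and task properties, sketched below (the file path is hypothetical; set these in the Properties window in BIDS):

```
-- Package-level properties:
SaveCheckpoints    = True
CheckpointFileName = C:\SSIS\MyPackage.chk   (hypothetical path)
CheckpointUsage    = IfExists

-- On each task that should act as a restart point:
FailPackageOnFailure = True
```

With CheckpointUsage set to IfExists, the package reads the checkpoint file when it exists and skips the tasks that already completed; on a clean run it starts from the beginning.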


-- Brian Knight


Parsing the OnPipelineRowsSent

The OnPipelineRowsSent event fires in the data flow to log how many rows move from step to step in the pipeline (that is, from each output to the next input). Once you have a Data Flow task, right-click in the design pane and select Logging. Configure a logging provider, and you'll notice OnPipelineRowsSent among the event details. This creates a record for each component (source, transform, or destination) the data passes through, showing how many rows were sent through the pipeline. The catch is that it writes everything into a single, colon-delimited message column, so you'll need a view or routine to parse out the data you need. Here's the query, from one of the SSIS report packs, that I use to parse it into a readable format:

SELECT source, sourceid, executionid,
       ssrs.ParsePipeline(message, 1) AS PathID,
       ssrs.ParsePipeline(message, 2) AS PathIDName,
       ssrs.ParsePipeline(message, 3) AS ComponentID,
       ssrs.ParsePipeline(message, 4) AS ComponentIDName,
       ssrs.ParsePipeline(message, 5) AS InputID,
       ssrs.ParsePipeline(message, 6) AS InputIDName,
       CONVERT(int, ssrs.ParsePipeline(message, 7)) AS RowsSent,
       starttime
FROM dbo.sysdtslog90
WHERE event LIKE 'onpipelinerowssent%'

Here's the function needed to do the parsing:

--! CREATE FUNCTION (for OnPipelineRowsSent report)
-- Function to parse an OnPipelineRowsSent log entry from the SSIS sysdtslog90 table.
-- It parses one log entry at a time, so a view or query (such as the one above)
-- is also needed to call the function for each log entry.

CREATE FUNCTION [ssrs].[ParsePipeline] (@message varchar(8000), @which int)
-- Should be created in the database that holds the sysdtslog90 table.
RETURNS varchar(200)
AS
BEGIN
    -- @which selects which value to return:
    -- 1 = PathID
    -- 2 = PathIDName
    -- 3 = ComponentID
    -- 4 = ComponentIDName
    -- 5 = InputID
    -- 6 = InputIDName
    -- 7 = RowsSent
    DECLARE @sourcemessage varchar(600)
    DECLARE @where int
    DECLARE @mycounter int

    IF @which < 1 OR @which > 7 RETURN NULL
    SET @mycounter = 0

    -- Catch older versions of the message that lacked the extra parameters.
    IF PATINDEX('%: :%', @message) = 0 RETURN NULL

    -- Chop the introductory wording off the front of the message.
    SET @sourcemessage = RIGHT(@message, LEN(@message) - PATINDEX('%: :%', @message) - 3)

    -- Loop through the occurrences of : until we reach the desired value.
    SET @where = 99
    WHILE @where <> 0
    BEGIN
        SET @mycounter = @mycounter + 1
        SET @where = PATINDEX('%:%', @sourcemessage)
        IF @mycounter = 7 RETURN @sourcemessage
        IF @mycounter = @which RETURN LEFT(@sourcemessage, @where - 1)
        SET @sourcemessage = RIGHT(@sourcemessage, LEN(@sourcemessage) - @where)
    END -- while

    -- Should never be reached, but a RETURN is required as the last statement.
    RETURN @sourcemessage
END -- function
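If you'd rather prototype the parsing logic outside T-SQL, the same algorithm is easy to sketch in Python. The sample message below is shaped like a real log entry but is illustrative only; actual OnPipelineRowsSent messages carry the path and component names from your own package:

```python
def parse_pipeline(message, which):
    """Mimics ssrs.ParsePipeline: returns the Nth colon-delimited value
    (1 = PathID ... 7 = RowsSent) from an OnPipelineRowsSent message."""
    if which < 1 or which > 7:
        return None
    marker = message.find(": :")
    if marker == -1:
        return None  # older message format without the extra parameters
    fields = message[marker + 3:].split(":")
    if which == 7:
        return ":".join(fields[6:]).strip()  # everything past the 6th colon
    return fields[which - 1].strip()

# Hypothetical example message:
msg = ("Rows were provided to a data flow component as input. : : "
       "Paths[1] : OLE DB Source Output : Components[2] : Sort : "
       "Inputs[3] : Sort Input : 5000")
print(parse_pipeline(msg, 2))  # -> OLE DB Source Output
print(parse_pipeline(msg, 7))  # -> 5000
```

This makes it easy to unit test the field positions before committing to the T-SQL version.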

-- Brian Knight



New Community Website

I'm proud to be launching JumpStart TV with some of my partners. The community will have loads of how-to videos on a variety of topics. We'll be placing two 10-minute videos a week on the site that will teach you visually how to perform a function in a Microsoft technology like SQL Server or Team Foundation Server. Later next month, watch for other types of channels on the same site. The site is free but does require registration. Here are two videos I did to kick off the site. Please keep in mind that the site is in its beta stages and may have bugs.

Date Published: 1/29/2006 4:10:21 PM
Author: Knight, Brian
Description: In this video, Brian shows how to use SSIS to perform a simple transformation of your data. He aggregates and sorts the data before writing it to a flat file. After you watch this video, you will know how to use the Data Flow task inside of SQL Server 2005 SSIS.
Rating: 4.25
Date Published: 1/29/2006 4:08:31 PM
Author: Knight, Brian
Description: So, you’ve just installed SQL Server 2005 but don’t know how to configure it. This video will instruct you how to use the SQL Server 2005 Surface Area Configuration tool to turn on SQL Server features like CLR integration and how to enable ports.
Rating: Unrated
-- Brian Knight