
DTS xChange Download Available

The built-in wizard to migrate DTS packages is approximately 40% successful and does not allow you to enjoy the benefits and best practices of SQL Server 2005 Integration Services (SSIS). DTS xChange is a product to help you migrate from SQL Server 2000 DTS to SQL Server 2005 or 2008 SSIS. It does this by applying a series of rules and best practices to your old DTS packages as it migrates the packages. The product was engineered to migrate thousands of packages with little intervention. DTS xChange is broken into three components: profiling, migration, and enterprise logging.

The program is now released and you can download a trial that will convert up to 10 packages (2 packages each time you open the program, up to 5 times) for free and profile all of your DTS packages.

See more information and download the 32-bit trial here:

File Properties Task

This SQL Server 2005 task reads the properties of a file and writes the values to a series of variables. For example, you can use this task to determine if the file is in use (still being uploaded or written to) and then conditionally run the Data Flow task to load the file only if it's not in use. You can also use it to read when the file was created in order to determine if it must be archived.
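The task does the heavy lifting for you, but if all you need is the in-use check, a plain Script Task can approximate it by attempting an exclusive open. A sketch (the FilePath and FileInUse variable names are hypothetical):

```vb
Public Sub Main()
    'Try to open the file exclusively; if another process (for example,
    'an FTP upload still in progress) has it open, the open will fail.
    Dim fileInUse As Boolean = False
    Try
        Using fs As IO.FileStream = IO.File.Open( _
                Dts.Variables("FilePath").Value.ToString(), _
                IO.FileMode.Open, IO.FileAccess.Read, IO.FileShare.None)
        End Using
    Catch ex As IO.IOException
        fileInUse = True
    End Try
    Dts.Variables("FileInUse").Value = fileInUse
    Dts.TaskResult = Dts.Results.Success
End Sub
```

A precedence constraint with an expression on the FileInUse variable can then gate the Data Flow task.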

More information can be seen here:

Migrating DTS packages to SSIS

I've been working for years on developing patterns for migrating DTS packages to SSIS. Finally, we're in the final few weeks of QA on a new product called DTS xChange that will migrate DTS packages to SSIS and apply SSIS best practices to them. This product also migrates items that the built-in wizard will not, such as Dynamic Properties tasks and connections that use UDL files. It also applies a series of SSIS rules during the conversion, like turning on logging. See more information here:

SSIS Stumper Volume 1

I've started to write a new series called SSIS Stumpers. Each stumper gives you an intermediate- to expert-level business problem to solve with SSIS. Sample data is provided and you're asked to load the data in a challenging way or performance tune a package. Each stumper gives you the requirements first, and you can start there or turn the page to see some hints. If that's not good enough, you can see step-by-step instructions on the next page.

In the first one in the series, you're given an example where you'll need to unpivot data using canned components in SSIS. You can download this SSIS Stumper magazine, with all the data and the complete project, for free here: If you'd like a step-by-step video on how to solve the problem, see my video here:

-- Brian Knight

Favorite BI Features in SQL Server 2008

If you haven't heard yet, CTP 3, a public beta of SQL Server 2008, has now been released and can be downloaded at the Connect website. After a bit of unscientific research, here are my top 5 BI features in SQL Server 2008, listed in priority order. It's important to note that some of these features have not been introduced in CTP 3 yet but will be coming soon.

  1. No more IIS requirement for SSRS! – Finally, there's no longer the overhead of having to install IIS as part of the popular reporting tool.
  2. CDC and Merge – While these are database engine improvements, they will tremendously help your BI effort in SSIS. The ability to do Change Data Capture on a SQL Server table will help warehousing scenarios and can reduce your load time by a factor of 9.
  3. SSRS scalability – The reporting engine has been significantly improved to scale to million-row reports. Also introduced is a best-of-both-worlds table and matrix style report called the tablix.
  4. SSIS thread scheduler improvements – In some early testing, the threading improvements have netted a 20-70% improvement in data flow performance. The biggest gains will be seen in long chains of synchronous transforms.
  5. Authoring Reports in Excel and Word – Thanks to some smart licensing of SoftArtisans' product suites, you'll now be able to author a report in Word or Excel. I would expect this to be a pretty big blow to competing reporting platforms, since users will be able to author reports where they already spend most of their time.
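The Change Data Capture feature from item 2 eventually surfaced as a set of system stored procedures. As a hedged sketch of the released syntax (the database, schema, and table names are placeholders, and pre-release builds may differ):

```sql
USE MyWarehouse;

-- Enable CDC at the database level, then track a single table.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'FactSales',
    @role_name     = NULL;  -- NULL means no gating role is required
```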

-- Brian Knight

Free Hotel or Disney Tickets For the June SSIS and Perf Tuning Class

We're coming up on one year in business, and that's cause for a minor celebration! Most of you have probably heard that a high percentage of startups fail within their first year; we're pleased to have made it over that hurdle. We've learned a lot of lessons and probably still have a bunch more to go, but it's been fun, challenging, and occasionally intimidating. If you ever get the chance to go solo, try it! You'll be out of your comfort zone and it may not succeed, but it's still a great growth opportunity.

Free Hotel Stay or Disney in June

But back to the celebration. Sign up for the June 11-14 Performance Tuning or June 18-21 Integration Services class between May 21 and June 18, 2007, and you can have your choice of a $250 Disney gift card (bring the family!) or four free nights in a hotel. Just enter GIFTCARD or HOTEL in the customer comments field during checkout. This gives you the chance to visit our Orlando training center for about the same cost as attending a local class, since we pay for the hotel for up to four nights! Alumni, don't forget you can receive a 10% discount on any of our classes; just ask for it! June is a great time to visit Florida, and the evenings are the best time to enjoy the local attractions. Contact us at with questions about the offers, limitations, restrictions, etc.

Upcoming Classes

In our last update we announced two new offerings: End to End Business Intelligence and One on One Mentoring. We've had a lot of interest in both and a lot of requests for variations of the One on One Mentoring. We're also working on a free SQL event in Orlando, similar to a Code Camp, that will be targeted at SQL professionals. If it works out, we may try it in other cities. Below is a quick synopsis of our schedule; you can view the full schedule here.

Integration Services Jun 18, Aug 27, Dec 17

Performance Tuning May 21, Jun 11, Jul 16, Aug 20, Sep 10, Oct 15

End to End Business Intelligence Jul 30

One on One Mentoring Jun 4, Jun 25, Jul 9

Analysis Services Aug 13

See the full schedule and registration on

-- Brian

Expert SQL Server Integration Services Released

My latest book just arrived and is on the virtual shelves of Amazon now. Expert SQL Server Integration Services is a book I wrote with Erik Veerman. The book's goal is to take you to the next level of SSIS; it assumes you already know the basics. If you like it, please review it on Amazon :). Soon, we'll be putting a chapter online so you'll be able to get a preview. Half the book is dedicated to data warehousing with SSIS and the other half to building solutions.

Upcoming SSIS and SSAS Classes


I've been busily writing a new SSAS class this month that I'll personally be teaching on 4/23. Below is the entire schedule of upcoming classes from End to End Training, and you can also contact me for personal training or consulting if you prefer that model. The End to End facility is really amazing, and Andy Warren, who manages the facility, does a fantastic job. I hope to see you there!

Course Title Start Date End Date Instructor Order Link
End to End Analysis Services 4/23/2007 4/26/2007 Brian Knight Register Now
End to End Integration Services 5/7/2007 5/10/2007 Brian Knight Register Now
End to End Performance Tuning 5/21/2007 5/24/2007 Andy Warren Register Now
End to End Performance Tuning 6/11/2007 6/14/2007 Andy Warren Register Now
End to End Integration Services 6/18/2007 6/21/2007 Brian Knight Register Now
End to End Performance Tuning 7/16/2007 7/19/2007 Andy Warren Register Now
End to End SQL Server High Availability 7/23/2007 7/26/2007 Brian Knight Register Now
End to End Integration Services 8/6/2007 8/9/2007 Brian Knight Register Now

Expert SQL Server 2005 Integration Services Book

First, I want to thank you for all of your support on my first SSIS book (Professional SQL Server 2005 Integration Services). Thanks to all of you, Wrox has approached a few of us about continuing the book and taking it to the next level. Erik Veerman and I are collaborating on a new SSIS book from Wrox called Expert SQL Server 2005 Integration Services. The book is not a reference book; it's a book about developing SSIS solutions. For example, we have a complete solution on restartability and on building a warehouse ETL solution from the ground up. I hope you enjoy it. We'll finish the writing this month, and it should be on the shelves in the April timeframe. -- Brian Knight

SQL Server 2005 Service Pack 2 CTP Released

Service Pack 2 was released to the public today in CTP format and can be downloaded here: Along with bug fixes, there are tons of new features. What impressed me most was that I was able to install the SP without a reboot!

Vista/Office Fixes:
  • Support for the upcoming Windows Vista.
  • Data Mining Add-Ins for Office 2007, which enables data mining functionality from SSAS to be used directly within Excel 2007 and Visio 2007.
  • SSRS integration with MOSS 2007, which allows integration with the Report Center in SharePoint providing seamless consumption and management of SSRS reports within SharePoint.
  • SSAS improvements for Excel 2007 and Excel Services relating to performance and functionality.
Heterogeneous Environments (Interoperability):
  • Oracle Support in Report Builder. Users will now be able to use Report Builder on top of Oracle data sources.
  • Hyperion support with SSRS. Organizations will now be able to use SSRS to build reports on top of Hyperion Essbase cubes.
  • Data compression (vardecimal), which is important for data warehouse scenarios and specifically for SAP BW scenarios. This requires less disk storage for decimal data, which increases overall performance.
  • Manageability enhancements. Based on customer feedback, SQL Server provides enhanced management capabilities for DBAs such as improvements in database maintenance plans, enhanced management reports and a new copy database wizard.
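The vardecimal compression in the bullets above is opt-in; a sketch using the pair of SP2 stored procedures (database and table names are placeholders):

```sql
-- Enable the vardecimal storage format for the database...
EXEC sp_db_vardecimal_storage_format 'MyWarehouse', 'ON';

-- ...then turn it on table by table; decimal and numeric columns
-- in the table are then stored in the variable-width format.
USE MyWarehouse;
EXEC sp_tableoption 'dbo.FactSales', 'vardecimal storage format', 1;
```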

Transactions in SSIS How To Video

In this SSIS demonstration, you'll see how to wrap a package or container in an MS DTC transaction. This type of transaction ensures that if a problem occurs in a package, your data can roll back to a prior state.

Using Transactions in SSIS (watch the video)

-- Brian Knight

Looping through an ADO Resultset in SSIS How-to Video

A while ago I wrote a quick post on how to loop through an ADO recordset in SSIS. It seems to be a common question on the newsgroups, so I decided to produce a how-to video on it.


Author: Knight, Brian
Description: In this video, you'll learn how to use the Foreach ADO enumerator to loop through a table and reconfigure the package based on the rows in the table. You can use this type of example to store metadata about your clients and reconfigure the package for each client.


End to End SSIS Classes

Here's an updated schedule of the upcoming SSIS classes


Price: The classes run $2,200 for 4 days.


Orlando Classes (Orlando class information at

  Class Dates Cost Instructor Seats Available
  7/24/2006 - 7/27/2006 $2200 Brian Knight Sold Out!!!
  8/21/2006 - 8/24/2006 $2200 Brian Knight Sold Out!!!
Register 10/2/2006 - 10/5/2006 $2200 Brian Knight 1 left!
Register 12/18/2006 - 12/21/2006 $2200 Brian Knight Yes
Register 1/8/2007 - 1/11/2007 $2200 Brian Knight Yes
Register 2/5/2007 - 2/8/2007 $2200 Brian Knight Yes

SSIS 64 Bit Gotchas - Part 1

I'm at a client engagement this week, which has been extremely fun for learning the more subtle issues of 64-bit SSIS. The first gotcha I already knew about, but want to make sure you have a heads-up on, is that Visual Studio is only a 32-bit application, which means you will not see BIDS on a 64-bit box by default. You can see Management Studio, though.

I was doing performance testing of the DB2 driver and noticed that it was working great until I deployed to the SQL Server's IA64 server. Once I deployed and executed the package from the Package Store by using Management Studio, it ran anywhere from 40% to 100% slower. After much turmoil and some great help from some great people, we found out that when you execute a package from the Package Store in Management Studio on the box, it uses the 32-bit version of DTExec.exe. The marshalling back and forth between 64-bit and 32-bit caused the slowdown. The problem was fixed by scheduling the package, or by copying the command out of DTExecUI and executing DTExec.exe from the command prompt, which uses the 64-bit version of the utility. After that, everything worked great!
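In other words, when you need an ad hoc 64-bit run, call the 64-bit copy of the utility directly. A sketch, assuming a default install path and a package deployed to the file system Package Store (your paths and package name will differ):

```
REM The 64-bit copy lives under the 64-bit Program Files tree;
REM the 32-bit copy is under "Program Files (x86)".
"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /DTS "\File System\MyPackage"
```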

We've found loads more gotchas so more to come!

-- Brian Knight

New SSIS, SSAS, and SSRS Classes Available

Wow, I've broken the number one blog rule. I've been so busy writing these three BI classes that I haven't had time to eat, much less blog. My apologies! Well, back from the long lull: the classes are now complete and will be given in Jacksonville (Jax) and other cities around the country soon. Here's the upcoming schedule. Each topic is two days, and the combo makes 4 days. After giving the SSIS class a number of times, we're expanding it to give you more hands-on topics so that when you leave the class, you will truly know how to implement solutions.

Designing and Implementing SSIS Solutions  - July 18th - 21st in Jacksonville
Designing and Implementing SSIS Solutions - July 24th-27th in Orlando
Designing and Implementing SSIS Solutions - Oct 2-5, 2006 in Orlando

--- End Series 1 ----

SSIS / SSAS August 22nd - 25th in Jax
SSIS / SSRS Sept 19th - 22nd in Jax
SSAS / SSRS Oct 24 - 27th in Jax

Email for any of these three above classes.
--- End Series 2 ----


Price: The classes run $2,200 for 4 days.

-- Brian Knight


Service Pack 1 Released!

Service Pack 1 is here and it fixes loads of items, some of which were quite embarrassing when training or developing a solution with a client standing behind you. The largest enhancement is the addition of database mirroring, which allows you to add high availability to your database by sending transactions to another server in real time. There are loads of SSIS fixes and enhancements. At a quick rolled-up level, here are the SSIS fixes:

  • Usability of the Import/Export Wizard has been improved in multiple-table scenarios.
  • The IDtsPipelineEnvironmentService service lets custom data flow components have programmatic access to the parent Data Flow task.
  • Interoperability with Analysis Services has been improved. The DataReader source in SSIS supports the System.Object data type by converting columns that have this data type to the DT_NTEXT data type. To change the data type to a type that is more appropriate for your data, you can add a Data Conversion transformation.
  • Performance has been improved for many transformations, such as the Sort transformation.
  • Designer usability has been improved. For example, you can now right-click a Data Flow task and then click Execute Task to execute only that task. You no longer have to switch to the control flow to do so.
  • The Expression Builder dialog box now has a public and documented API. This increased access to expression-related objects will benefit task developers because expressions are important to the way that tasks are interrelated.
  • A registry-based policy has been added for handling SSIS package signatures as a package is executed. For example, administrators can disable loading of unsigned and untrusted packages. Because these settings are in the registry, administrators can propagate them across the domain by using Windows Group Policy settings.
  • In the Advanced Editor dialog box, the Flat File source has the new property UseBinaryFormat. This property supports loading packed decimal data into the pipeline for processing by a script or a custom transformation.
  • In the Advanced Editor dialog box, the DataReader source has the new property CommandTimeout. You can use this property to modify the time-out period to allow for long-running operations.
  • To create or modify the property expressions of variables, you can now open the Expression Builder dialog box from the Properties window.
  • You can now add annotations to precedence constraints.

It's certainly a must download for anyone using 2005.

-- Brian Knight

Using a Script Transform in SSIS

In SSIS, the data you receive from the source is rarely in the format your end users would like to see. To correct this, you can use the Script Component to transform data. In this demo, Brian shows you how to use the Script transform to make a column proper case in the data flow (free registration required).

-- Brian Knight

Using Checkpoints in SSIS

SQL Server Integration Services checkpoints enable you to restart a package from the point where it failed. This helps you avoid having to re-run a 6-hour package from the beginning. Learn how to use this valuable feature with this quick how-to video (registration required).
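As a quick orientation, checkpoints hinge on three package-level properties plus one property on each task that should participate; a sketch of typical settings (the file path is a placeholder):

```
' Package-level properties
CheckpointFileName = C:\SSIS\Checkpoints\NightlyLoad.chk
SaveCheckpoints    = True
CheckpointUsage    = IfExists   ' restart from the file when one exists

' On each task that should trigger a restartable failure
FailPackageOnFailure = True
```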

-- Brian Knight


Parsing the OnPipelineRowsSent

The OnPipelineRowsSent event logs how many rows go from step to step in the data flow pipeline (or from output to input). Once you have a Data Flow task, you can right-click in the design pane and select Logging. Configure the logging provider, and notice that in the details you have OnPipelineRowsSent. This will create a record for each transform (or source or destination) that the data goes through, showing how many rows were sent through the pipeline. It writes this, though, in a single pipe-delimited message column, so you'll need to write some sort of view or routine to parse out the critical data you need. Here's the query, from one of the SSIS report packs, that I use to parse this into a readable format:

SELECT source, sourceid, executionid,
       ssrs.ParsePipeline(message, 1) AS PathID,
       ssrs.ParsePipeline(message, 2) AS PathIDName,
       ssrs.ParsePipeline(message, 3) AS ComponentID,
       ssrs.ParsePipeline(message, 4) AS ComponentIDName,
       ssrs.ParsePipeline(message, 5) AS InputID,
       ssrs.ParsePipeline(message, 6) AS InputIDName,
       CONVERT(int, ssrs.ParsePipeline(message, 7)) AS RowsSent,
       starttime
FROM dbo.sysdtslog90
WHERE event LIKE 'onpipelinerowssent%'

Here's the function needed to do the parsing:

--! CREATE FUNCTION (for OnPipelineRowsSent report)
-- Parses one OnPipelineRowsSent log entry from the SSIS sysdtslog90 table.
-- The function parses a single log entry at a time, so a view or query
-- (shown above) is needed to call the function for each log entry.

CREATE FUNCTION [ssrs].[ParsePipeline] (@message varchar(8000), @which int)
-- should be created in the database that holds the sysdtslog90 table
RETURNS varchar(200)
AS
BEGIN
   -- @which defines which value is desired:
   --    1 = PathID
   --    2 = PathIDName
   --    3 = ComponentID
   --    4 = ComponentIDName
   --    5 = InputID
   --    6 = InputIDName
   --    7 = RowsSent
   DECLARE @sourcemessage varchar(600)
   DECLARE @where int
   DECLARE @mycounter int

   IF @which < 1 OR @which > 7 RETURN NULL

   SET @mycounter = 0

   -- catch older versions of the message that lacked the extra parameters
   IF PATINDEX('%: :%', @message) = 0 RETURN NULL

   -- chop the initial wordy stuff off the front
   SET @sourcemessage = RIGHT(@message, LEN(@message) - PATINDEX('%: :%', @message) - 3)

   -- loop through occurrences of : until we get to the desired value
   SET @where = 99
   WHILE @where <> 0
   BEGIN
      SET @mycounter = @mycounter + 1
      SET @where = PATINDEX('%:%', @sourcemessage)
      IF @mycounter = 7 RETURN @sourcemessage
      IF @mycounter = @which RETURN LEFT(@sourcemessage, @where - 1)
      SET @sourcemessage = RIGHT(@sourcemessage, LEN(@sourcemessage) - @where)
   END -- while

   -- should never be reached, but a return is required as the last statement
   RETURN @sourcemessage
END -- function

-- Brian Knight



Looping through an ADO Resultset in SSIS

Sometimes there is a special case where you'd like to use the results of a query in SSIS and loop through it record by record to perform a series of tasks. Use this technique carefully, since SSIS wasn't really meant to iterate over thousands of records this way. It is useful, however, if you have a client table and you wish to loop through that table and run an Execute Package task to load each client's data. There are many other uses I've found for this, but the client example is probably the central one.


Let’s try a quick example using the AdventureWorks database to execute a Script task inside the loop for each variable. To configure this, let’s first go ahead and create a connection to the AdventureWorks database. With the connection created, we’re going to pull data from a small table (HumanResources.Shift) to do a quick demonstration.


Next, create two variables with no default value. The objHumanResource variable will have the data type of Object and the strShift variable will be a string. You can create a variable by right-clicking in the design area and selecting Variables. Make sure the variables have the scope of the entire package and not an individual container or task.


Drag an Execute SQL task onto the design pane. Name the task Read Shift Table and point it to the AdventureWorks connection. Use the following directly inputted query:


SELECT * FROM HumanResources.Shift




In the General page of the Execute SQL task, select Full result set for the ResultSet option. This option tells the task that you wish to output the results of the query to some variable. Go to the Result Set page and add a new line in the Result Set grid by clicking Add. For the Result Name column, replace NewResultName with 0 and set the Variable Name column to User::objHumanResource. The final screen should look like the below screenshot. This is telling the task that you wish to output the first result set from the query (0) to the variable.



Drag over a Foreach Loop Container and connect the green arrow from the Execute SQL task onto the container. In the Collection page, select Foreach ADO Enumerator. Select the source variable of User::objHumanResource and the mode should be set to Rows in the first table.




For the Variable column, select User::strShift. For the Index column, type 1. This makes the container output the second column from the rowset into the variable as it iterates through the loop. If you wanted the first column, you'd use an Index of 0.



Now, you can do whatever you want inside the container. For my example, I placed a Script task inside the container and made a little Hello World example by using the following script (and of course passing in the strShift variable in the ReadOnlyVariables option of the Script task):
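A minimal Hello World Script task along those lines (assuming strShift was passed in via ReadOnlyVariables) might look like:

```vb
Public Sub Main()
    'Display the shift name for the current iteration of the loop.
    MsgBox("Hello World: " & Dts.Variables("strShift").Value.ToString())
    Dts.TaskResult = Dts.Results.Success
End Sub
```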




That's all there is to it. Again, you would never want to use this on thousands of records, but dozens of records scale nicely. The main use I have for it is to loop through a list of clients in a multi-client ASP-type environment and perform a series of actions for each.


You can download the package here to get you started.


-- Brian Knight

4 day class on SSIS and SSAS

I've been writing slowly over the past few weeks because I'm preparing an upcoming class on Feb 20-23 on SSIS and SSAS in Jacksonville. The class will be offered on an ongoing basis, but its normal price will be more than twice what we're offering for these 4 days. For this class, you can attend all four days for $984. I'm writing the material and giving the class myself, and I've attached the outline below to give you an idea of what we'll be covering. There are a total of 16 seats in the class, so it should be a great environment for you to get your fill.

Attendees will receive two books (one that I’m quite partial to): 

Applied Microsoft Analysis Services 2005 : And Microsoft Business Intelligence Platform

Professional SQL Server 2005 Integration Services

If you’re interested, please register fairly quickly since it’s about 1/4 full already. To register or for more information, please call or email Amanda Napolitano (see contact info in the below flyer):

-- Brian Knight

Pointing to a File or DestinationTable that Does Not Exist in SSIS

Of course this may seem simple to a lot of you, but I keep seeing this question, so I thought I'd post it for the Google monster to help others. When you open the Bulk Insert Task, for example, the DestinationTable is a drop-down box. This is fine except in the scenario where you must point to a table that is created in an earlier step of the control flow. To do this, just leave the DestinationTable property blank after configuring the rest of the task. Click OK and you'll notice a red icon on the task, where it looks like it has an error due to lack of configuration. Select the task and go to the Properties pane. Now, you can set the DestinationTable property through the pane by typing in the name of the table that will be created in the future. After you set that property in the pane, the red icon disappears and you're ready to roll.

This works for almost any property with almost any task or connection.

-- Brian Knight

Professional SQL Server 2005 Integration Services Finally Ships!

Whew, what a long road, but our book is finally being shipped to bookstores and out from Amazon this week. The book is a tutorial-style book for SSIS, and you can download one of the chapters in PDF format below.

  • Buy the book on Amazon.
  • Download Chapter 5 - Creating an End-to-End Package

-- Brian Knight

Proper Case for Data in a SSIS Script Component Transform

Well, I feel like a college professor who shows the student a five-page math problem and then follows up with the single button on the calculator that does the same thing. I used to hate that, and now I'm that guy :). Anyway, in yesterday's post I showed how to convert strings to proper case (first letter in every word capitalized) by using regular expressions. While that was a pretty elaborate way to do it, you can perform the same function in the SSIS data flow with the built-in StrConv function and the VbStrConv.ProperCase option, which take a string input and convert it to proper case. In the following code, you can see I have the input column called InputName; I massage the data and output the value to OutputName.

The code then shrinks down to the single important line below the comment:

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
Inherits UserComponent

   Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
      'This is the line that performs the magic to Proper Case.
      Row.OutputName = StrConv(Row.InputName, VbStrConv.ProperCase)
   End Sub

End Class

PS: Special thanks to Toby Rogers for pointing out how easy this is with VbStrConv instead of Regular Expressions!

-- Brian Knight

Script Transform Component to Cleanse Data

In yesterday's post, I showed you how to cleanse data using the Script Component acting as a transform. Let's take it a lot further by doing some more advanced expressions. In this very typical scenario, you receive data that's all upper case from a mainframe. You want to cleanse the data so that the first letter of every word is upper case and everything else is lower case. To do this, drag over the Script Component and select Transform. Check the input column that you wish to send into the script. For my example, let's use an input column named InputName. I'm going to send a column called OutputName into the pipeline; I add this column in the Inputs and Outputs page.

Once this is selected, you can select Design Script on the Script page. The script will look something like the one below. In order to make things a bit simpler, you'll want to import the System.Text.RegularExpressions namespace. As you can see, I'm using regular expressions to parse through the string and find where one word ends and another begins. You'll want to make sure you check the input for NULLs before that chunk of the code executes. The MatchHandler function takes the two groups of strings and upper-cases one and lower-cases the other.

All you essentially have to do to use this code is change the InputName and OutputName columns in the code and you're ready to go.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports System.Text.RegularExpressions

Public Class ScriptMain
Inherits UserComponent

   Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)

      Dim myDelegate As New MatchEvaluator(AddressOf MatchHandler)
      Dim strpattern As String = "\b(\w)(\w+)?\b"

      Dim re As New Regex(strpattern, RegexOptions.Multiline Or RegexOptions.IgnoreCase)

      'Check for a NULL value before calling the replace function.
      If Row.InputName_IsNull = False Then
         Row.OutputName = re.Replace(Row.InputName, myDelegate)
      End If

   End Sub

   'Upper-cases the first letter of each word and lower-cases the rest.
   Private Function MatchHandler(ByVal m As Match) As String
      Return m.Groups(1).Value.ToUpper() & m.Groups(2).Value.ToLower()
   End Function

End Class


It's important to note that I'm not a coder, so I'm sure my code may not be the most pristine. :)

For more on how to use the Script Component as a transform, see yesterday's post.

Update: A much cleaner approach to doing proper case was sent to me by Toby Rogers and can be seen in this post.

-- Brian Knight

How to use a Script Transformation

Using a Script transformation is a handy way to extend the transforms available to you. For example, picture social security data that goes into your Script transformation unencrypted and comes out encrypted. You can also use this type of custom script to extend what a Derived Column transform can do. If you have the choice between the two, though, always choose a Derived Column.

A good example of extending a Derived Column transformation into a Script transformation is where you need to replace certain text with other text. If you have only two values to swap, you could use the REPLACE expression inside a Derived Column transform. When you go above two, the REPLACE expression becomes less elegant. So, in comes the Script transform.
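For example, with only two values to swap, a Derived Column expression is still readable (the column name here is hypothetical):

```
REPLACE(REPLACE(ColorName, "RED", "#FF0000"), "YELLOW", "#FFFF00")
```

Every additional value wraps yet another REPLACE around the whole expression, which is where the Script transform starts to win.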

After connecting the Script transform to the data flow, go to the Input Columns page for the transform and check the columns that will be sent into the script as input. Next, go to the Inputs and Outputs page and click Add Output. Name the column that you wish to output; I'm going to call this one OutputValue for this example. Next, click Design Script.

In this script you can see that first we declare a local variable. Because I can't predict whether the user entered mixed, lower, or upper case, I use the UCASE function to upper-case the input value for comparison purposes. The word InputValue in Row.InputValue would be replaced with whatever column you checked on the input screen. The main thing to remember is that setting Row.OutputValue sets the OutputValue output we created earlier to the new value. So this code takes whatever the color input is and translates it to its hex value. The OutputValue output is then presented to the pipeline and can be consumed by the destination or the next transform in the pipeline.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
Inherits UserComponent

   Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)

      Dim HexValue As String = ""

      Select Case UCase(Row.InputValue)
         Case "RED"
            HexValue = "#FF0000"
         Case "YELLOW"
            HexValue = "#FFFF00"
         Case "GREEN"
            HexValue = "#008000"
      End Select

      Row.OutputValue = HexValue
   End Sub

End Class

-- Brian Knight 

Renumber PackageIDs

I'm now a huge advocate of using package templates. For more info on how to use them, see this post. Templates provide a handy way to capture the best practices for your company and reuse them over and over again. There's one slight problem with using them, though: a package created from the template inherits the PackageID of the template. Why is this a problem? If you're using the System::PackageID variable for logging, all of your packages will now report that they are the template package.

So, you have two options, but either way you must renumber each package to a unique value. You can do this manually in the package's Properties pane, or you can automate it with the dtutil.exe application and the /I switch, as shown below:

dtutil.exe /I /FILE "PackageName.dtsx"

The best approach, I feel, is to create a batch file with the following code, which will loop through every package in a directory and renumber it:

for %%f in (*.dtsx) do dtutil.exe /I /FILE "%%f"

Please note, though, that once you use dtutil.exe to do this, it will rewrite your package file. Don't worry: your package will still work, but the layout in the designer may be a bit rearranged.

Update 1/7/06: If you think this behavior should be fixed, please vote for this to be added into a service pack here:

-- Brian Knight

Great SSIS Best Practice Post

Jamie Thomson has posted a great blog post about SSIS development best practices. I love his naming standard idea:

Jamie recommends that you prefix each task or transform with an abbreviation that indicates what type of task or transform it is. For example, if you have a Data Flow task that loads an inventory table, you would call it "DFT Load Inventory". This wouldn't help you much at design time, since the icon already tells you the task type. It would, however, help the operations folks at your company track down what type of task failed. For example, if you use a log provider to output all the events, the Source Name will show the task type rather than just "Load Inventory" (which wouldn't tell the ops guy what kind of task it is).

-- Brian Knight 

Integration Services Error and Message Reference

Finally, it has arrived. Well, 2006 has almost arrived...whooohooo. It's just not as sexy as Y2K though, is it?

The list in the link below shows all the predefined Integration Services errors, warnings, and informational messages, in numerical order within each category, along with their numeric codes and symbolic names. You can use this to properly trap errors and return better messages to your users.

-- Brian Knight

Professional SQL Server 2005 Integration Services Sample Chapter

In a few weeks, my new book, Professional SQL Server 2005 Integration Services, will ship. The book was a collaborative effort that involved 9 other fantastic authors, including 3 SQL Server MVPs. The book is in tutorial form, so after you're introduced to a concept, you're walked through it in example form. Special thanks to the SSIS team at Microsoft for tech editing the book!

You can now download a sample chapter below, which contains a large tutorial that may help you get started with SSIS. I love using this example in my speaking sessions because it shows the bulk of the features of SSIS in a real-world scenario involving looping through and archiving files. Hope you enjoy!

-- Brian Knight


* - Buy the book on Amazon.

* - Download Chapter 5 - Creating an End-to-End Package

Setting Variables in the Script Task in SSIS

One of the key reasons to use the Script task is to change the value of a variable at runtime. There are a lot of real-world scenarios where you would do this; if you're reading this, you've probably already thought of a few. To set a variable in the Script task, there are two main methods you can use.

Method 1

This method uses the LockOneForWrite method of the VariableDispenser class. The advantage of this method is that it allows you to read and write variables at runtime without having to use the ReadOnlyVariables and ReadWriteVariables options in the Script task. The price, though, is that you have to write quite a bit more code. In the following example, I'm going to lock the AlertAdmin variable for writing and then set it to the Boolean value True.

Public Sub Main()
   Dim vars As Variables
   ' Lock the variable for writing, set its value, then release the lock
   Dts.VariableDispenser.LockOneForWrite("AlertAdmin", vars)
   vars("AlertAdmin").Value = True
   vars.Unlock()
   Dts.TaskResult = Dts.Results.Success
End Sub

Method 2

The second method is the one I typically recommend, just for simplicity. In this method, you open the Script task and, before clicking Design Script, set the ReadOnlyVariables and ReadWriteVariables options. If you have more than one variable you wish to make available in the Script task, you can separate them with commas. If you do not set these options, you will not see an error at design time, but at run time you'll receive the error shown below:

The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there.

at Microsoft.SqlServer.Dts.Runtime.Variables.get_Item(Object index)

at ScriptTask_8693feb81f2d4b7b8a26ae87a8a5f960.ScriptMain.Main() in dts://Scripts/ScriptTask_8693feb81f2d4b7b8a26ae87a8a5f960/ScriptMain:line 19

With the variables now passed in, you can perform the same action as in the earlier script in a single line (the second line shown below). The locking is handled for you by the Script task.

Public Sub Main()
   Dts.Variables("AlertAdmin").Value = True
   Dts.TaskResult = Dts.Results.Success
End Sub

The ReadOnlyVariables and ReadWriteVariables options are also available to you in the Expressions tab if the variable name that you wish to pass in is unknown or dynamic. There are other ways to set variables, of course; most tasks, like the Execute SQL task, have their own hooks into variables. Hope this helps!

-- Brian Knight 

Templates in SSIS

Templates in SSIS provide a great way to create standards across your company or enterprise. For example, you may want to enforce a standard set of annotations at the top of each package, or have each package come with a standard set of connections or error handlers to speed up development. To create a template, first build a package just as you would want it to appear as a template: add all the connections, tasks, annotations, and log providers. Once the package meets your needs, copy it to the following directory (adjusting %ProgramFiles% for your installation): %ProgramFiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies\ProjectItems\DataTransformationProject\DataTransformationItems

For a developer to consume the template, he only has to right-click the project node in BIDS and select Add, then New Item. You'll see the package you copied listed there, and once it's selected, all the information you created earlier is carried over.
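As a rough sketch, deploying a template from the command line might look like this (the template filename and source path are hypothetical; the destination is the directory mentioned above):

```
copy "C:\SSIS\CompanyTemplate.dtsx" "%ProgramFiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies\ProjectItems\DataTransformationProject\DataTransformationItems"
```

Putting this in a batch file makes it easy to roll out an updated template to every developer's workstation.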

-- Brian Knight

What Log Provider Type is Best in SSIS?

SSIS has the ability to do very detailed logging that puts DTS package logs to shame. In the past, you would have had to write customized logging routines to do what SSIS does out of the box. SSIS offers a number of log providers, which you can select by going to the SSIS menu and choosing Logging. You must then choose what type of provider you wish to use; which one you choose depends on your goals:

  • SQL Server Profiler (creates a Profiler trace file) - This option gives you some interesting tie-ins with Profiler if you're examining performance issues like long-running packages. With this integration, you can quickly correlate the log with a System Monitor trace to see performance counters.
  • SQL Server (writes events to a SQL Server table) - This option is my personal favorite since it allows for rich reporting using Reporting Services or the reporting tool of your choice. This is a must-have for most environments that need operational reports in real time. I like to create auto-refreshing reports against the table so you can see all the running packages and their progress.
  • Windows Event Log (writes each event to the Windows Application log) - What makes this a nice choice is that it creates hooks that monitoring products like Tivoli or Microsoft Operations Manager (MOM) can use for alerts.
  • XML File (writes each event to a structured XML file) - This can also be used for reporting on a website if you apply an XSLT file for formatting. It works especially well if you need to share the information with a third party.
  • Text File (writes to a simple flat file) - This option is the easiest to configure and the most widely understood by traditional tech-heads.

What's especially nice about the new logging mechanism is that you don't have to choose just one provider. You can, for example, use the SQL Server table for relational reporting and the Windows Event Log for operational monitoring tools.
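If you go with the SQL Server provider, the events land in the sysdtslog90 table (the default table the provider creates) in whatever database its connection points at. As a sketch, a simple operational query for your reports might look like this:

```sql
-- Most recent errors and warnings, newest first, from the
-- default SSIS 2005 log table created by the SQL Server provider
SELECT source,      -- the task or package name that raised the event
       event,       -- OnError, OnWarning, OnPostExecute, etc.
       starttime,
       endtime,
       message
FROM   dbo.sysdtslog90
WHERE  event IN ('OnError', 'OnWarning')
ORDER  BY starttime DESC;
```

Filtering on the executionid column instead lets you pull every event from a single run of a package.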

-- Brian Knight

Professional SQL Server 2005 Integration Services Book Available


In mid-January, my new book comes out: Professional SQL Server 2005 Integration Services from Wrox. You can pre-order it today on or go to Amazon to get the better price :). This book will help you get past the initial learning curve quickly so that you can start using SSIS to transform data, create workflow, or maintain your SQL Server. With its hands-on guidance, you'll discover a new world of integration possibilities and be able to move away from scripting complex logic to programming tasks using a full-featured language.

What you will learn from this book

  • Ways to quickly move and transform data
  • How to configure every aspect of SSIS
  • How to interface SSIS with web services and XML
  • Techniques to scale SSIS and make it more reliable
  • How to migrate DTS packages to SSIS
  • How to create your own custom tasks and user interfaces
  • How to create an application that interfaces with SSIS to manage the environment
  • A detailed usable case study for a complete ETL solution

-- Brian Knight 

Upgrading DTS to SSIS

Next week I have a new article on about upgrading your DTS packages to SSIS. If you're reading this, you can be the first to see it by clicking here.

-- Brian Knight

SSIS Packages are Encrypted by Default

By default, SSIS package files in development are encrypted to prevent an unauthorized person from seeing your SSIS package. The encryption is seamless behind the scenes and is tied to the workstation and user. So, if you were to send a package that you're developing to another developer on your team, he would not be able to open it by default. The same would apply if you logged in as a different user. You would receive the below error:

There were errors while the package was being loaded.
The package might be corrupted.
See the Error List for details.

The error is very misleading. In truth, you can't open the package because the originating user encrypted it, whether on purpose or not. To fix this, the owner of the package can open it and, in the Properties pane, select a different option (like a package password) for the ProtectionLevel property. The default option is EncryptSensitiveWithUserKey. To protect the entire package with a password, select the EncryptAllWithPassword option.

An approach that I like is for the SSIS designer to encrypt all packages with the default option during development; when he's ready to send them to production, he can create a batch file that loops through a directory's .dtsx files and sets a password. The batch file would use dtutil.exe and look like this:

for %%f in (*.dtsx) do Dtutil.exe /file %%f /encrypt file;%%f;3;newpassword

This will loop through each .dtsx file in your directory and assign the password newpassword. The production support group could then use the same batch file to reset the password to a production password.

-- Brian Knight

SQL Server 2005 Business Intelligence Metadata Samples Toolkit Released

For the past few months, I've been working on a whitepaper for Microsoft on metadata. The whitepaper evolved, as did the product. Finally, both the product and the whitepaper have been released! So what does it do? It's a starter kit that provides all the code for viewing and analyzing your environment's metadata. For example, if you have dozens of SSIS packages in your environment that all load a similar table, you can use this tool to see what the impact would be if you altered the table or the package, as well as see the lineage of how the data got into the table. The kit includes:

  • DependencyAnalyzer: a utility that scans SSIS packages and SSAS databases for enumerating metadata (objects, properties and dependencies).
  • DependencyViewer: a utility to view metadata organization & inter-dependencies (lineage/impact analysis).
  • Analysis Services Samples: sample SSAS databases that can be analyzed
  • Integration Services Samples: sample SSIS packages that can be analyzed
  • Report Model Samples (Report Builder): a Report Builder model
  • Reports: sample reports built from the metadata store
  • SQL: a SQL script for creating the metadata store

Download the Toolkit:

I also did a quick video on how to use it, which can be downloaded in MPG format here. Downloader beware, though! The video is not optimally compressed and runs 80 MB. But if you have a high-speed internet connection, here's the link:

-- Brian Knight

DTS Designer Doesn't Exist in SQL Server 2005 by Default

Once you upgrade your tools to SQL Server 2005, you can't manage SQL Server 2000 DTS packages by default. The answer is to download a piece of the SQL Server 2005 Feature Pack (, which includes the DTS Designer. After this installation, you can update packages just like you could in Enterprise Manager. The designer wasn't included by default in order to reduce the surface area of the installation.

After you upgrade your SQL Server instance, you'll find your 2000 packages in SQL Server Management Studio under the Management node, then Legacy -> Data Transformation Services. When you upgrade an instance, packages and jobs aren't migrated automatically, so you can slowly upgrade your DTS packages using the Upgrade Wizard.

-- Brian Knight