Many people (myself included) are still using SQL Server 2000 – which means you might be using DTS (Data Transformation Services) to perform utility jobs on your databases.
I’ve had a requirement to output data to Excel (not my choice – Excel usually isn’t!) – which is subtly different from text output as CSV*. Another requirement was a ‘generated’ file name in a known folder – as simple as a date/time driven file name. The ‘job’ runs once (or more) a day and so creates a new file each time. Easy, right?
Well – it turns out that if you go through the normal motions of creating an Excel connection and using a Transform Data Task it’s likely to work first time and leave you thinking ‘great’. Unfortunately when you run it again you’re likely to get a message similar to the following:
Table ‘New_Table$’ not found.
After much digging (hence the reason for writing this), I discovered that in order to map the columns successfully you basically have to create the table in the spreadsheet before running the transformation (the ‘create table’ part of creating the transformation). If you’ve got a dynamic file name then the next time you’ll effectively have a blank spreadsheet without the required receiving ‘table’, and it will fail, unless you’ve recreated the sheet before the transformation.
OK so assuming you want to output some data to a spreadsheet with a dynamic filename with a Transform Data task then here’s how you do it.
- I put all files in the same folder so I use 2 global variables to build the filename – NetworkLocation and FinalLocation.
- First, right-click on a blank part of your package screen and choose Package Properties, then the Global Variables tab.
- Create the two global variables NetworkLocation and FinalLocation
- Edit the NetworkLocation value to be the folder (including a final ‘\’) of where you want to store the files.
- Leave FinalLocation blank (we’ll set its value in the next step).
- Add an ActiveX Script Task with the following code (you can obviously tailor the name to your needs). It basically creates a date/time based file name, and stores a concatenation of the network location and file name in the FinalLocation global variable.

'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
	Dim fileName
	fileName = "MyOutput_" & DatePart("yyyy", Date) & _
		Right("0" & DatePart("m", Date), 2) & _
		Right("0" & DatePart("d", Date), 2) & "_" & _
		Right("0" & DatePart("h", Now), 2) & _
		Right("0" & DatePart("n", Now), 2) & _
		Right("0" & DatePart("s", Now), 2)
	DTSGlobalVariables("FinalLocation").Value = _
		DTSGlobalVariables("NetworkLocation").Value & fileName & ".xls"
	Main = DTSTaskExecResult_Success
End Function
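If you want to sanity-check the file-name format that script produces, here’s the same logic sketched in Python (the function name and sample folder are mine, not part of the package):

```python
from datetime import datetime

def build_final_location(network_location, when):
    """Mirror the ActiveX script: <folder>MyOutput_YYYYMMDD_HHMMSS.xls"""
    # strftime zero-pads month/day/hour/minute/second, just like the
    # Right("0" & DatePart(...), 2) trick in the VBScript above.
    return network_location + "MyOutput_" + when.strftime("%Y%m%d_%H%M%S") + ".xls"

print(build_final_location("\\\\server\\reports\\", datetime(2009, 3, 5, 9, 7, 2)))
# → \\server\reports\MyOutput_20090305_090702.xls
```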
- Add your data connections
- Create your source connection (presumably SQL Server or wherever your data’s coming from)
- Create your destination connection (a Microsoft Excel 97-2000 connection) and set an initial file name (best to be in the folder specified in NetworkLocation)
- Create a Data Transformation Task connecting your source and destination connections as per normal
- Add in your Table/View or SQL Query to define the source data.
- In the Destination tab click ‘Create…’ next to ‘Table name:’ (a ‘Create Destination Table’ dialog will pop up). Change the table name (or column definitions) if you like, then COPY THE ENTIRE TEXT – and paste it into a text editor for safe keeping.
- Click OK (this will create a new sheet in the spreadsheet you’ve specified in your connection, ready to receive your data). Your fields should then also be populated in the table below.
- Check the Transformations tab – you should get a nice one-to-one mapping between source and destination columns.
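The exact statement comes straight from the ‘Create Destination Table’ dialog, so treat the following only as a sketch of its shape – the column names and types here are made up for illustration. A small Python helper shows how you could rebuild one if you lose your saved copy:

```python
# Hypothetical columns – in practice they come from your source query.
columns = [
    ("OrderID", "float"),
    ("CustomerName", "VarChar (255)"),
    ("OrderDate", "datetime"),
]

def excel_create_table(table_name, cols):
    """Rebuild a Jet-style CREATE TABLE like the one the dialog generates."""
    body = ",\n".join("\t`%s` %s" % (name, sql_type) for name, sql_type in cols)
    return "CREATE TABLE `%s` (\n%s\n)" % (table_name, body)

print(excel_create_table("New_Table", columns))
```

Keeping the generator (or the pasted text) around matters because you’ll need the identical statement again in the Dynamic Properties Task.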
- Create a Dynamic Properties Task (Here’s where it all comes together)
- Click ‘Add…’ to create a new property. Choose your Excel Connection and pick the ‘DataSource’ property, then click ‘Set…’
- Choose ‘Global Variable’ as the source and FinalLocation as the variable – Click OK.
- Click ‘Add…’ to create another property. Choose the DTSTask_DTSDataPumpTask_? in the Tasks node (your data transformation task) and pick the ‘DestinationObjectName’ property.
- Choose ‘Query’ as the source and paste in your ‘CREATE TABLE’ statement from earlier – Parse to check, then click OK.
- Create the Workflow for the task
- Select the Dynamic Properties Task, right-click and select Workflow –> Workflow Properties.
- Click ‘New’ and set the ActiveX Script Task as ‘Source Step’.
- Select the Data Transformation Task, right-click and choose Workflow Properties.
- Click ‘New’ and set the Dynamic Properties Task as ‘Source Step’.
That’s basically it – the package should look something like this:
It should also now do the following (in order):
- Set filenames (stored in global variables)
- Set the filename of the Excel connection and create the destination ‘sheet’ from the global variables and the query in the transformation
- Do the transformation and store the results in a date/time named Excel spreadsheet in your configured folder
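That three-stage run order can be sketched as plain functions (all names here are illustrative – in DTS the ordering comes from the On Success workflow constraints, not a loop):

```python
from datetime import datetime

run_log = []  # records the order the "tasks" fire in

def set_filenames(package):
    # ActiveX Script Task: build the date/time file name into FinalLocation.
    package["FinalLocation"] = (package["NetworkLocation"] + "MyOutput_"
                                + datetime.now().strftime("%Y%m%d_%H%M%S") + ".xls")
    run_log.append("set filenames")

def create_sheet(package):
    # Dynamic Properties Task: repoint the Excel connection's DataSource at
    # FinalLocation and rerun the CREATE TABLE query so the receiving
    # 'table' exists in the brand-new workbook.
    run_log.append("create sheet")

def transform(package):
    # Transform Data Task: pump the rows into the freshly created sheet.
    run_log.append("transform")

package = {"NetworkLocation": "\\\\server\\reports\\"}
for task in (set_filenames, create_sheet, transform):  # workflow order
    task(package)
```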
If you ever need to change the query (columns) then you’ll need to regenerate the ‘CREATE TABLE’ statement and reflect it in the Dynamic Properties Task (5.c).
*Typically an Excel export covers your back with datatypes where a CSV export won’t – when you open a CSV in Excel you’ll have to format columns yourself, like converting dates, or turning long numbers from scientific format back into text format.