Can You Add Output Columns To The Script Transformation Editor On The Fly?

Jun 21, 2006

Can I add output columns to the Script Transformation Editor using code? I have to execute a SQL statement to determine the number of years we have data for an item, then create columns for the months in those years and populate them with the quantities. So my question is: can I create output columns for the Script Transformation Editor on the fly, that is, while the package is executing?

Any input will be good.

Thanks,

MShah

View 3 Replies



XML Source Editor Columns Configuration Is NOT Persisting Output Name Selection

Oct 25, 2007

Greetings everyone,

I am seeing a particular problem in the XML Source Editor "Columns" configuration where it is not persisting the "Output name" selection.

Control Flow Tab:
1. I use a "Exec SQL Command" to drop, create, or alter the destination tables in the database that I want to be repository for the inbound XML data. The data types are fairly straightforward.

2. I add a singular "Data Flow"

Data Flow Tab:

1. I add a "XML Source" task, and assign a well-defined XML file. I then use the "Generate XSD" option in the "Connection manager"; and I am fairly satisfied with the generated XSD.

2. I create "OLE DB Destination"

3. I wire the "XML Source" to the "OLE DB Destination". In the "XML Source" in the "Columns".

4. I go to the dropdown list of "Output name" and see the list ordered with the various complex-types that I want to map and transfer to a target table.

For the sake of this report, I select the 5th one down on the list (for which I already have a target table) - let's call this "Mesh"

5. In the "Input Output" dialog, I select the "output" to be the desired 5th item, "Mesh"

6. I check all my mappings so that they map one-to-one ... XML name entries match SQL table destination mapping entries; correct types; correct size

7. Check the metadata and it all looks good.

8. When I hit "Debug" to test the package the failure occurs at the "XML Source". The error report comes back saying that it failed because "field xxx in Contributor was truncated". However, "Contributor" corresponds to the 1st name in the dropdown list presented in "Columns" "Output name:".

If I return to Step 4 and open "Columns" again, I see that my previous selection of the 5th item on the list, "Mesh", was not persisted; no matter how often I select item #5 "Mesh" and save to make the selection stick, it does not persist.

I hand-edited the .dtsx file and only then was I able to make this stick. However, if I ever re-save the package this non-persistency pops up again.

Am I doing something wrong here or is this a known defect? As I have several dozen XSD mappings that I want to transfer to tables, hand-editing is not something I relish.

I look forward to your reply.
RudyC

View 1 Replies View Related

Get A List Of Output Columns On Script Transformation

Feb 13, 2007

I am using a script component to transform data. In the script component I created a bunch of fields for the output. Is there any way to loop through that list of columns? Is there code I can use in the script component to access the names, data types, data etc?

I saw a lot of information on the OutputColumnCollection as part of some IDTSOutput90 thing (Greek to me). As best I can guess this is for creating your own new columns, but can I see what columns are already defined via the script interface?
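One workaround that is often suggested (a sketch, not a definitive answer): the Input0Buffer/Output0Buffer classes that the script component generates expose every column as a .NET property, so plain reflection over the Row object gives you the column names, their .NET data types and their values. The names below assume the default input buffer name Input0Buffer; nullable columns also get a generated "_IsNull" companion property, which should be checked before reading the value.

Imports System.Reflection

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' DeclaredOnly keeps the loop on the generated column properties, not inherited buffer members.
    For Each prop As PropertyInfo In Row.GetType().GetProperties( _
            BindingFlags.Instance Or BindingFlags.Public Or BindingFlags.DeclaredOnly)
        ' Skip the generated null-flag properties; they show up in the same list.
        If Not prop.Name.EndsWith("_IsNull") Then
            Dim colName As String = prop.Name          ' column name (special characters stripped)
            Dim colType As Type = prop.PropertyType    ' .NET type of the column
            ' Check the companion _IsNull property before reading, otherwise the getter throws.
            Dim isNullProp As PropertyInfo = Row.GetType().GetProperty(prop.Name & "_IsNull")
            If isNullProp Is Nothing OrElse Not CBool(isNullProp.GetValue(Row, Nothing)) Then
                Dim colValue As Object = prop.GetValue(Row, Nothing)
                ' ... inspect or log colName, colType and colValue here ...
            End If
        End If
    Next
End Sub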

View 2 Replies View Related

Tips On Creating Output Columns In A Custom Transformation

Aug 14, 2007

I would like my transformation to automatically create an output column for each input column. Any tips? I can't seem to determine which event to listen to or method to override.
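In a managed custom component (the 2005 pipeline API), one place this is commonly done is an override of PipelineComponent.SetUsageType, which the designer calls every time a user selects or deselects an input column; the override can mirror the selected column onto the output. The sketch below assumes a synchronous transformation and uses an illustrative "Copy of" naming convention; a complete implementation would also remove the mirrored column when usageType is UT_IGNORED.

Public Overrides Function SetUsageType(ByVal inputID As Integer, _
        ByVal virtualInput As IDTSVirtualInput90, _
        ByVal lineageID As Integer, _
        ByVal usageType As DTSUsageType) As IDTSInputColumn90

    ' Let the base class wire up the input column first.
    Dim inputColumn As IDTSInputColumn90 = _
        MyBase.SetUsageType(inputID, virtualInput, lineageID, usageType)

    If usageType <> DTSUsageType.UT_IGNORED AndAlso inputColumn IsNot Nothing Then
        ' Find the virtual column the user just selected.
        Dim vCol As IDTSVirtualInputColumn90 = _
            virtualInput.VirtualInputColumnCollection.GetVirtualInputColumnByLineageID(lineageID)

        ' Create a matching output column with the same data type properties.
        Dim outCol As IDTSOutputColumn90 = _
            ComponentMetaData.OutputCollection(0).OutputColumnCollection.New()
        outCol.Name = "Copy of " & vCol.Name
        outCol.SetDataTypeProperties(vCol.DataType, vCol.Length, vCol.Precision, vCol.Scale, vCol.CodePage)
    End If

    Return inputColumn
End Function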

View 3 Replies View Related

Derived Column Transformation Editor

May 8, 2008

Greetings, I am attempting to create a flat file delimited by |. I am using (ISNULL(LIN1_OPT_ADDR) ? "" : LIN1_OPT_ADDR + "| ") to replace the blank address column with the pipe delimiter. So that a row that would consist of:

Customer Number,Name,Address Line1,City,State

12345,ACE HARDWARE INC. ,801 Rockefeller St.,New York, New York
56789,BUILDING SUPPLY INC., ,Wichita, Kansas

Should end up as:

12345|ACE HARDWARE INC.|801 Rockefeller St.|NEW YORK|NEW YORK
56789|BUILDING SUPPLY INC.||Wichita|Kansas

When I run the data flow to create the flat file the file contains the following:


12345|ACE HARDWARE INC.|801 Rockefeller St.|NEW YORK|NEW YORK
56789|BUILDING SUPPLY INC.| | |Wichita|Kansas


Can anyone tell me what I am doing wrong?

Thanks.
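One guess, offered without seeing the package: the "blank" address in the second row is a space rather than a true NULL, so ISNULL() returns false and both the space and the literal "| " (which itself carries a trailing space) end up in the file, producing the extra "| | " shown above. An expression that treats empty or whitespace-only values the same as NULL, and that leaves the single | delimiter to the Flat File connection manager instead of hard-coding it, would look roughly like this (SSIS expression syntax, untested):

ISNULL(LIN1_OPT_ADDR) || TRIM(LIN1_OPT_ADDR) == "" ? "" : TRIM(LIN1_OPT_ADDR)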

View 3 Replies View Related

Skip Row - Script Transformation Editor

Nov 2, 2007

Howdy!

I am reading in a delimited file. In the Script Transformation Editor, if the UPC does not pass the checksum test, I want to throw the row out right then. I am not sure how to do that... but it is probably really simple.
Thanks,
Linda



Here is my script:



' Microsoft SQL Server Integration Services user script component
' This is your new script component in Microsoft Visual Basic .NET
' ScriptMain is the entrypoint class for script components
'Option Strict Off

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Private Function DoubleTest(ByVal Value As String) As Boolean
        Dim d As Double
        If Not Double.TryParse(Value, d) Then
            'Windows.Forms.MessageBox.Show(Value + " is not numeric")
            Return False
        End If
        If Double.IsNaN(d) Then
            'Windows.Forms.MessageBox.Show(Value + " is NaN")
            Return False
        End If
        'Windows.Forms.MessageBox.Show(Value + " = " + d.ToString())
        Return True
    End Function

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        Dim d As Double
        Dim CheckDigit As Integer = 0
        Dim CheckOdd As Integer
        Dim CheckEven As Integer

        ' Copy each source column to the destination column
        Row.WholesalerCode = Trim(Row.WholesalerCode)

        If Row.UPCNumber.Length = 12 Then
            ' 12 Digit Checksum
            CheckOdd = Convert.ToInt16(Row.UPCNumber.Substring(0, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(2, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(4, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(6, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(8, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(10, 1), 10) * 3
            CheckEven = Convert.ToInt16(Row.UPCNumber.Substring(1, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(3, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(5, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(7, 1), 10) + Convert.ToInt16(Row.UPCNumber.Substring(9, 1), 10)

            If ((CheckOdd + CheckEven) + 1.0) / 10.0 = Round(((CheckOdd + CheckEven) + 1.0) / 10.0, 0) Then CheckDigit = 1
            If ((CheckOdd + CheckEven) + 2.0) / 10.0 = Round(((CheckOdd + CheckEven) + 2.0) / 10.0, 0) Then CheckDigit = 2
            If ((CheckOdd + CheckEven) + 3.0) / 10.0 = Round(((CheckOdd + CheckEven) + 3.0) / 10.0, 0) Then CheckDigit = 3
            If ((CheckOdd + CheckEven) + 4.0) / 10.0 = Round(((CheckOdd + CheckEven) + 4.0) / 10.0, 0) Then CheckDigit = 4
            If ((CheckOdd + CheckEven) + 5.0) / 10.0 = Round(((CheckOdd + CheckEven) + 5.0) / 10.0, 0) Then CheckDigit = 5
            If ((CheckOdd + CheckEven) + 6.0) / 10.0 = Round(((CheckOdd + CheckEven) + 6.0) / 10.0, 0) Then CheckDigit = 6
            If ((CheckOdd + CheckEven) + 7.0) / 10.0 = Round(((CheckOdd + CheckEven) + 7.0) / 10.0, 0) Then CheckDigit = 7
            If ((CheckOdd + CheckEven) + 8.0) / 10.0 = Round(((CheckOdd + CheckEven) + 8.0) / 10.0, 0) Then CheckDigit = 8
            If ((CheckOdd + CheckEven) + 9.0) / 10.0 = Round(((CheckOdd + CheckEven) + 9.0) / 10.0, 0) Then CheckDigit = 9

            If CheckDigit = Convert.ToInt16(Row.UPCNumber.Substring(11, 1), 10) Then
                Row.UPCNumber = String.Concat("00", Row.UPCNumber)
            Else
                'Throw out row because checksum did not match. <=== what do i do here???????????????
            End If

        ElseIf Row.UPCNumber.Length = 14 Then
            ' 14 Digit Checksum
        Else
            ' Throw out row because checksum did not match. <=== what do i do here???????????????
        End If

        If Not DoubleTest(Row.RetailPrice) Then
            Row.RetailPrice_IsNull = True
            'Row.RetailPrice = String.Empty
        End If
    End Sub

End Class
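For the "what do I do here" markers above, one approach (a sketch, assuming the default synchronous output named "Output 0"): on the Inputs and Outputs page, set the output's ExclusionGroup to a non-zero value (leave SynchronousInputID pointing at the input), and then explicitly direct only the good rows to that output. Any row that is never directed anywhere is simply discarded.

' Inside Input0_ProcessInputRow, after the checksum test
' (assumes the output's ExclusionGroup was set to 1 on the Inputs and Outputs page):
If CheckDigit = Convert.ToInt16(Row.UPCNumber.Substring(11, 1), 10) Then
    Row.UPCNumber = String.Concat("00", Row.UPCNumber)
    Row.DirectRowToOutput0()   ' keep the row
Else
    ' Do nothing: a row that is not directed to any output is dropped.
    ' Alternatively, add a second output (or use the error output) and direct bad rows there for logging.
End If

The DirectRowToOutput0 method name is generated from the output's name, so it will differ if the output has been renamed.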

View 5 Replies View Related

Derived Column Transformation Editor Question

Aug 30, 2006

Help...

I'm having trouble coming up with a valid expression in my derived column transformation editor that tests the input column for NULL and responds something like this:

if[message] isNull then "NA" else [message]

where [message] is the input column.

Thanks!
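In the SSIS expression syntax, the if/then/else above is written with the conditional (? :) operator; assuming [message] is a string column, something like this should work:

ISNULL([message]) ? "NA" : [message]

If the literal and the column have different string types or lengths, a cast such as (DT_WSTR, 50)"NA" may be needed on the literal so both branches agree.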





View 1 Replies View Related


Transformation Editor Or SQL Editor: How To Search And Replace?

Mar 8, 2007

Hi SSIS Gurus,

Could anybody help me: how do I achieve search and replace in the Expressions (Transformation Editor) or in the SQL Query box?

If this functionality isn't present, what is the preferred workaround to achieve it?

I have a lot of transformations (about 100) in the Derived Column task and it's too difficult to find the transformation I need.

View 1 Replies View Related

Is It Possible To Call A VB.NET Function Within The Derived Column Transformation Editor

Mar 5, 2008

Hi,

In a nutshell, I want to be able to instruct some Data Analysts on how to modify SSIS packages using the simplest solutions possible. This is because there are many different data sources, some with a huge number of fields, and, yes, you guessed it, these data sources are subject to change on a regular basis.

A very common task they will need to do is to modify an SSIS package to transform a source date string in "YYYYMMDD" format into a date data type field within a table.

Similar threads have advised the use of the Data Flow Transformations -> Derived Column transformation for this sort of thing.

So within the Expression text box I have inserted the following SSIS-compatible expression to convert the above string into a British-format date:




Code Snippet
(SUBSTRING(DOB_SRC,8,2) + "/" + SUBSTRING(DOB_SRC,5,2) + "/" + SUBSTRING(DOB_SRC,1,4))





But really what I want to be able to do is to instruct the Data Analysts to write something like:

ConvertTextToDate(DOB_SRC)

Where I previously defined that behaviour of ConvertTextToDate as a public VB.NET function.

Can someone please help. I'm pretty certain I'm not the only one with this type of requirement.


Thanks in advance,

Kieran.
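As far as I know, the Derived Column editor can only use the built-in expression functions, so a user-written VB.NET function cannot be called from it directly. The usual substitute is a Script Component (transformation) in the data flow, which keeps the reusable logic in one small function that the analysts call by name. A rough sketch, assuming an input column DOBSRC holding the "YYYYMMDD" string and a date-typed output column DOB (both column names are illustrative):

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        If Not Row.DOBSRC_IsNull Then
            Row.DOB = ConvertTextToDate(Row.DOBSRC)
        End If
    End Sub

    ' The reusable behaviour the analysts would call.
    Private Function ConvertTextToDate(ByVal yyyymmdd As String) As Date
        Return New Date(CInt(yyyymmdd.Substring(0, 4)), _
                        CInt(yyyymmdd.Substring(4, 2)), _
                        CInt(yyyymmdd.Substring(6, 2)))
    End Function
End Class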

View 3 Replies View Related

SSIS - Derived Column Transformation Editor Expression

Aug 23, 2007

I am trying to put the following as an expression in the SSIS Derived Column Transformation Editor.

DATEADD(DAY, DATEDIFF(DAY, 0, GETDATE()), 0)

It is not allowing it. This works fine in a regular SQL statement.

Does anyone know how I can get this to work?
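The SSIS expression language has its own DATEADD/DATEDIFF, but the datepart must be a quoted string and the integer 0 is not implicitly treated as a date the way it is in T-SQL, which is why the expression is rejected as written. Something along these lines (an untested sketch) reproduces the "strip the time off GETDATE()" pattern:

DATEADD("dd", DATEDIFF("dd", (DT_DBTIMESTAMP)"1900-01-01", GETDATE()), (DT_DBTIMESTAMP)"1900-01-01")

If a date-only value is acceptable, simply casting with (DT_DBDATE)GETDATE() achieves the same thing more cheaply.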

View 14 Replies View Related

How To Assign Null In Derived Column Transformation Editor

Mar 28, 2007

Dear friends, can anyone tell me how to assign NULL to the expression value in the Derived Column Transformation Editor?

thanks,
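The expression language has a typed NULL function for this; it takes the data type (plus length and code page where the type needs them). A few examples of the form it takes (sketches):

NULL(DT_I4)
NULL(DT_STR, 10, 1252)
NULL(DT_WSTR, 50)
NULL(DT_DBTIMESTAMP)

A conditional such as ISNULL(SomeColumn) ? NULL(DT_WSTR, 50) : SomeColumn (SomeColumn being a placeholder name) is also valid, as long as both branches end up with compatible types.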

View 5 Replies View Related

Number Of ROWS Of Output Of Aggregate Transformation Sometimes Doesn't Match The Output From T-SQL Query

Dec 25, 2006

While using the Aggregate Transformation to group by one column, the number of output rows is sometimes larger than the number of rows returned by a T-SQL statement via SSMS.

For example, the output of the Aggregate Transformation may be 960216 rows, but the 'Select Count(Orderid) From ... Group By ***' T-SQL statement returns 96018*.

I'm sure the Group By of the Aggregate Transformation is right!



But ,when I set the "keyscale" property of the transformation,the results match!

In my opinion,the "keyscale" property will jsut affects the performance of the transformaiton,but not the result of the transformation.

Thanks for your advice.

View 2 Replies View Related

Using Output From A Stored Procedure As An Output Column In The OLE DB Command Transformation

Dec 8, 2006

I am working on an OLAP modeled database.

I have a Lookup Transformation that matches the natural key of a dimension member and returns the dimension key for that member (surrogate key pipeline stuff).

I am using an OLE DB Command as the Error flow of the Lookup Transformation to insert an "Inferred Member" (new row) into a dimension table if the Lookup fails.

The OLE DB Command calls a stored procedure (dbo.InsertNewDimensionMember) that inserts the new member and returns the key of the new member (using scope_identity) as an output.

What is the syntax in the SQL Command line of the OLE DB Command Transformation to set the output of the stored procedure as an Output Column?

I know that I can 1) add a second Lookup with "Enable memory restriction" on (no caching) in the Success data flow after the OLE DB Command, 2) find the newly inserted member, and 3) Union both Lookup results together, but this is a large dimension table (several million rows) and searching for the newly inserted dimension member seems excessive, especially since I have the ID I want returned as output from the stored procedure that inserted it.

Thanks in advance for any assistance you can provide.
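As far as I can tell, the OLE DB Command cannot create a brand-new output column for a result set, but an OUTPUT parameter can be bound to an existing pipeline column, and the value written back to that column is then available downstream. One sketch of the idea (parameter order and names are illustrative): add a placeholder integer column in a Derived Column upstream, then use a command such as

EXEC dbo.InsertNewDimensionMember ?, ? OUTPUT

and, in the column mappings, map the first ? to the natural-key column and the second ? to the placeholder column, which receives the new surrogate key when the procedure returns. This behaviour is worth verifying on a small test package first, since OLE DB Command parameter handling has a number of quirks.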

View 9 Replies View Related

Integration Services :: Script Transformation Editor Won't Open In Data Tools - BI

Aug 17, 2015

From SQL Server 2014, using SQL Server Data Tools for Visual Studio - BI, I'm trying to edit a Script Component within an SSIS Data Flow Task. The 'Edit Script...' button is enabled and turns a nice shade of blue when moused over, but a click has no effect. Perhaps I'm missing a component of VSTA? Everything else seems to work correctly. What might I be missing?

View 2 Replies View Related

Function That Works In Sql Server Management Studio Does Not Work In Derived Column Transformation Editor

May 12, 2007

Hi

I'm a relative SQL Server newbie and have developed a function that converts mm/dd/yyyy to yyyy/mm/dd for use in DT_DBDATE format, for insert into a column of type smalldatetime.



I receive the following errors when using the function in the Derived Column Transformation Editor. First, the function, then the error that appears when using it as the expression in the Derived Column Transformation Editor.



Can anyone explain how I can make this transformation work in this context, or suggest a way to either do the transformation more easily or avoid it altogether?



Thanks for the look see...

******************************

ALTER FUNCTION [dbo].[convdate]
(
    @indate nvarchar(10)
)
RETURNS nvarchar(10)
AS
BEGIN
    -- Declare the return variable here
    DECLARE @outdate nvarchar(10)

    set @outdate =
        substring(@indate,patindex('%[1,2][0-9][0-9][0-9]%',@indate),4)+'/'+
        substring(@indate,patindex('%[-,1][0-9][/]%',@indate),2)+'/'+
        substring(@indate,patindex('%[2,3][0,1,8,9][/]%',@indate),2)

    RETURN @outdate
END

********************************



And the error...



expression "lipper.dbo.convdate(eomdate)" failed. The token "." at line number "1", character number "11" was not recognized. The expression cannot be parsed because it contains invalid elements at the location specified.

Error at Data Flow Task [Derived Column [111]]: Cannot parse the expression "lipper.dbo.convdate(eomdate)". The expression was not valid, or there is an out-of-memory error.

Error at Data Flow Task [Derived Column [111]]: The expression "lipper.dbo.convdate(eomdate)" on "input column "eomdate" (165)" is not valid.

Error at Data Flow Task [Derived Column [111]]: Failed to set property "Expression" on "input column "eomdate" (165)".

(Microsoft Visual Studio)

===================================

Exception from HRESULT: 0xC0204006 (Microsoft.SqlServer.DTSPipelineWrap)

------------------------------
Program Location:

at Microsoft.SqlServer.Dts.Pipeline.Wrapper.CManagedComponentWrapperClass.SetInputColumnProperty(Int32 lInputID, Int32 lInputColumnID, String PropertyName, Object vValue)
at Microsoft.DataTransformationServices.Design.DtsDerivedColumnComponentUI.SaveColumns(ColumnInfo[] colNames, String[] inputColumnNames, String[] expressions, String[] dataTypes, String[] lengths, String[] precisions, String[] scales, String[] codePages)
at Microsoft.DataTransformationServices.Design.DtsDerivedColumnFrameForm.SaveAll()

View 3 Replies View Related

Can I Use * To Specify 'Output Column' For OLE DB Source Editor?

Dec 7, 2006



I am working on a situation similar to 'Get all from Table A that isn't in Table B' http://www.sqlis.com/default.aspx?311

I noticed that if one column name of the source table changes (say, Year to Year2), I have to modify all the data flow transformations in the task.

I am new to SSIS.



thanks! -ZZ

View 8 Replies View Related

Error Output In OLE DB Command Advanced Editor

Sep 27, 2007

I have a data flow, and inside it I have an OLE DB Source and an OLE DB Command that executes an insert transaction (a stored procedure). I would like to save the error output to an error log table when the insert doesn't happen.
But when I drag an error output line (red) to another OLE DB Command to insert an error log row, I can only see two error-related columns (ErrorCode and ErrorColumn) available in the OLE DB Command advanced editor. That doesn't tell you much about the error. How can I grab the error reason (description) as part of the error output and store it in an error log table, so that I can see what the problem is?
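The ErrorCode/ErrorColumn pair is all the error output itself carries; the descriptive text has to be looked up from the code. One commonly posted workaround is to put a Script Component (transformation) on the red path, add an output column such as ErrorDescription (DT_WSTR) to it, and resolve the code there. A sketch, assuming the script base class exposes ComponentMetaData (it does in later SSIS versions; check your own version):

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' ErrorDescription is a column added to this script component's output.
    Row.ErrorDescription = ComponentMetaData.GetErrorDescription(Row.ErrorCode)
End Sub

The enriched rows can then be written to the error log table by an OLE DB Destination.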

View 1 Replies View Related

OLE DB Source Editor And External Columns

Apr 23, 2008

I have a query like the one below that I am using as an OLE DB source:

Set NOCOUNT ON

Select *
Into #temp1
from A

Select *
Into #temp2
From B

Select * from #temp1 a
Join #temp2 b on a.episode_key = b.episode_key


I can see the preview data, but when I click Columns, there are no available external columns.
How can I fix this issue?
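This is the usual behaviour when the source query builds #temp tables: at design time SSIS asks the provider for the statement's metadata, the temp tables do not exist at that point, and no external columns come back. Two workarounds that are often suggested (verify against your own environment): replace the temp tables with table variables, or put SET FMTONLY OFF at the top of the command so the batch is actually executed when metadata is requested, for example:

SET FMTONLY OFF;
SET NOCOUNT ON;

SELECT * INTO #temp1 FROM A;
SELECT * INTO #temp2 FROM B;

SELECT *
FROM #temp1 a
JOIN #temp2 b ON a.episode_key = b.episode_key;

Note that with FMTONLY OFF the whole query really runs during validation, which can be slow against large tables.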

View 8 Replies View Related

How To Get The Output Column In OLE DB Command Transformation

Jul 3, 2006



Hi,

I am writing a data flow task which will take a particular column from the source table, and I am passing the column value into the SQL command property. My SQL command will look like this:

Select SerialNumber From SerialNumbers Where OrderID = @OrderID

If I go and check the output columns on the Input and Output Properties tab, I am not able to see this SerialNumber column in the output column tree, so I cannot access this column in the next transformation component.

Please help me.

Thanks in advance.





View 13 Replies View Related

Error Output For A Destination Transformation

Jun 16, 2006

I am developing a custom destination component and I have encountered a few areas where there seems to be a lack of helpful documentation and examples.

1. I have not been able to find any information on or examples of creating custom destinations with an error output. The OLE DB Destination has an error output so I investigated the input and error output properties in the advanced editor and found that the OLE DB Destination error output is synchronous with the input (its SynchronousInputID matches the input's ID) and has its ExclusionGroup value set to 1. Using this information, I modeled my error output after the OLE DB Destination.

ProvideComponentProperties:
AddErrorOutput(ERROR_OUTPUT_NAME, input.ID, 1);

ProcessInput:
int errorOutputID = -1;
int errorOutputIndex = -1;
GetErrorOutputInfo(ref errorOutputID, ref errorOutputIndex);
...
buffer.DirectErrorRow(errorOutputID, 0, errorOutputIndex);

Checking the input and error output properties in the advanced editor for my custom destination component I find the following:
Input
-----
ID: 3515

Error Output
------------
ExclusionGroup: 1
ID: 3516
IsErrorOut: True
SynchronousInputID: 3515

Shortly after I start my SSIS package and it encounters an error row, I get the following exception:
[My Destination Adapter 1 [3512]] Error: System.ArgumentException: Value does not fall within the expected range. at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSBuffer90.DirectErrorRow(Int32 hRow, Int32 lOutputID, Int32 lErrorCode, Int32 lErrorColumn) at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.DirectErrorRow(Int32 outputID, Int32 errorCode, Int32 errorColumn) at MyDestination.ProcessInput(Int32 inputID, PipelineBuffer buffer) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)


2. My custom destination component is used for writing a file with a fixed schema. I followed the means by which source component examples add their output columns, but applied this to my external metadata columns. In my Validate() I check if the ExternalMetadataColumnCollection.Count == 0 and return DTSValidationStatus.VS_NEEDSNEWMETADATA; to force a call to ReinitializeMetaData(). In ReinitializeMetaData() I call a method that creates the input's external metadata columns that reflect my external data source.

This works fine except every time I add my custom destination component to a SSIS Package and go to edit the component I am greeted with a dialog box that states: "The component is not in a valid state. ... Do you want the component to fix these errors automatically?" Pressing the Yes button, I assume, makes the call to ReinitializeMetaData() and I have my external metadata columns. Where is the correct place to add the external metadata columns so the user does not have to take this extra step every time they add my component to their package?

View 5 Replies View Related

Oledb Command Transformation Output

Feb 13, 2008



Hi,

I have two tables,

Table A on Server 1 (3 ROWS)
ID Name Address
ID1 A B
ID2 X Y
ID3 M N

There is another table on a different server which looks like

Table B on Server 2
PKColumn ID Details
1 ID1 Desc1
2 ID1 Desc2
3 ID1 Desc 3
4 ID2 Desc
5 ID2 Description

As you can see, ID is the common column between these two tables.
I want to query the above two tables, and the output should be dumped into a new table on Server 2.

I am using the following SSIS Package

OledbDataSource-------> OledbCommand(Select * from TableB where ID =?)

From here, how can I insert the rows returned from the OLE DB Command into another table? Since for each row of TableA it will return several output rows, how can I insert all of these into the new table?

Please help me configure the output of the OLE DB Command.

Thanks,



View 5 Replies View Related

Output Param In Oledb Transformation That Calls An Sp

Feb 17, 2008

Is it true that I will not be able to use the returned value from an SP that is called on every row from an OLE DB Command transformation? I see all kinds of complaints on the web but can't determine whether this would be a waste of time. I'd like to append the returned value (which is calculated and cannot be joined in the buffer) to the data on its way out of the transformation.

View 3 Replies View Related

Redefining Columns Isn't Possible With Flat File Connection Manager Editor Properties Already Fixed??

May 29, 2006

Dear fellows,

I know that I still think like a SQL 2000 programmer/DBA, but I can't avoid it.

I've got the Flat File Connection Manager Editor set up with a text file in 'ragged right' format and CRLF as the header row delimiter. When I go to the properties page and the Columns option to alter just a few columns, I am not able to.
It seems that you must erase all of them in order to redefine one or two. And what if you had 50?
When I used the SQL 2000 DTS designer, it did that without problems; I could alter columns again and again.

As far as I can see this is a loss of flexibility, or not? Or is there any way to do that without deleting everything else?

View 3 Replies View Related

Integration Services :: Merge Join Transformation - No Output Rows Redux

Aug 4, 2009

I am using SSIS in SQL Server Enterprise 2005. I have two OLE DB data sources from two disparate databases (IBM DB2 and Microsoft SQL Server), some columns from each of which are to be included in the merged output results. I have noted the various requirements in the forum postings with regard to sorting the OLE DB sources and specifying the output source columns as being sorted, as well as the requirement that the join fields in the two sources be close/exact matches. Yet, when I run this in VS, while the work area reflects the expected number of rows being input into the Merge Join transformation, no count is reflected as output from that transformation into the final destination table. Specifically, my two data sources (IBM DB2 and MS SQL) are configured as follows:

The IBM DB2 source contains an SQL statement that uses Cast operations to create the result columns and an ORDER BY clause to ensure that the output is sorted by the desired two columns. The OLE DB source property IsSorted is set to true; the Output Columns folder column definitions for "key_source_dtsy" and "key_source_dtrt" have their SortKeyPosition properties set to 1 and 2, respectively. Those fields are both defined as data type DT_STR, with lengths of 4 and 2, respectively. Below is the Path metadata from the Data Flow Path editor for the path from this source:

IBM DB2 source"Name" "Data Type" "Precision" "Scale" "Length" "Code Page" "Sort Key Position" "Comparison Flags" "Source
Component""ID_CODE" "DT_STR" "0" "0" "10" "1252" "0" "" "Source F0005 User Defined Codes""CODE_DESCR_1" "DT_STR" "0" "0" "30" "1252" "0" "" "Source F0005 User Defined Codes""CODE_DESCR_2" "DT_STR" "0" "0" "30" "1252" "0" "" "Source F0005 User Defined Codes""key_source_dtsy" "DT_STR" "0" "0" "4" "1252" "1" "" "Source F0005 User Defined Codes""key_source_dtrt" "DT_STR" "0" "0" "2" "1252" "2" "" "Source F0005

User Defined Codes:

The MS SQL source contains an SQL statement that takes the columns as they are in the MS SQL table (no Cast operations needed); it also uses an ORDER BY clause to ensure the output is sorted by the join columns. The OLE DB source property IsSorted is set to true; the Output Columns folder columns for "key_source_dtsy" and "key_source_dtrt" have their SortKeyPosition properties set to 1 and 2, respectively. Those fields are both defined as data type DT_STR, with lengths of 4 and 2, respectively. Below is the Path metadata from the Data Flow Path editor for the path from this source:

MS SQL source"Name" "Data Type" "Precision" "Scale" "Length" "Code Page" "Sort Key Position" "Comparison Flags" "Source Component""id_code_name" "DT_I2" "0" "0" "0" "0" "0" "" "Source CodeName in db dwVdFY""key_source_dtsy" "DT_STR" "0" "0" "4" "1252" "1" "" "Source CodeName in db dwVdFY""key_source_dtrt" "DT_STR" "0" "0" "2" "1252" "2" "" "Source CodeName in db dwVdFY"

The Merge Join transformation specifies an INNER JOIN using the columns named "key_source_dtsy" and "key_source_dtrt" from the respective data sources. I know there are alternative ways of accomplishing my intent (Lookup, porting the MS SQL table to IBM DB2 so the join can occur in the SELECT statement, etc.); however, I'd like to use this functionality and assume that it should work.

View 13 Replies View Related

Simple Custom Data Flow Transformation Doesn't Produce Any Output

Aug 30, 2006

I've built a simple custom data flow transformation component following the Hands On Lab (http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en) and the Books Online (ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/adc70cc5-f79c-4bb6-8387-f0f2cdfaad11.htm and ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/b694d21f-9919-402d-9192-666c6449b0b7.htm).

All it is supposed to do is create an output column and set its value to the result of calling a web service method (the transformation is synchronous). Everything seems fine, but when I run the data flow task that contains it, it doesn't generate any output. The Visual Studio debugger displays it as yellow, with 1,385 rows going into it, but the data viewer attached to its output is empty. The output metadata looks just like I expect: all of my input columns plus the new column, correctly typed. No validation or run-time warnings or errors are reported.

I'll include the entire C# file below, which only overrides the ProvideComponentProperties, Validate, PreExecute, ProcessInput, and PostExecute methods of the parent PipelineComponent class.

Since this is effectively a specialization of the DerivedColumn transformation, could I inherit from the class that implements the DC component instead of PipelineComponent? How do I even find out what that class is?

Thanks! Here's the code:
using System;
// using System.Collections.Generic;
// using System.Text;

using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CustomComponents
{
[DtsPipelineComponent(DisplayName = "GID", ComponentType = ComponentType.Transform)]
public class GidComponent : PipelineComponent
{
///
/// Column indexes for faster processing.
///
private int[] inputColumnBufferIndex;
private int outputColumnBufferIndex;

///
/// The GID web service.
///
private GID.WS_PDF.PDFProcessService gidService = null;

///
/// Called to initialize/reset the component.
///
public override void ProvideComponentProperties()
{
base.ProvideComponentProperties();
// Remove any existing metadata:
base.RemoveAllInputsOutputsAndCustomProperties();
// Create the input and the output:
IDTSInput90 input = this.ComponentMetaData.InputCollection.New();
input.Name = "Input";
IDTSOutput90 output = this.ComponentMetaData.OutputCollection.New();
output.Name = "Output";
// The output is synchronous with the input:
output.SynchronousInputID = input.ID;
// Create the GID output column (16-character Unicode string):
IDTSOutputColumn90 outputColumn = output.OutputColumnCollection.New();
outputColumn.Name = "GID";
outputColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 16, 0, 0, 0);
}

///
/// Only 1 input and 1 output with 1 column is supported.
///
///
public override DTSValidationStatus Validate()
{
bool cancel = false;
DTSValidationStatus status = base.Validate();
if (status == DTSValidationStatus.VS_ISVALID)
{
// The input and output are created above and should be exactly as specified
// (unless someone manually edited the persisted XML):
if (ComponentMetaData.InputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component accepts 1 Input.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component provides 1 Output.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection[0].OutputColumnCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output must be 1 column.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
// And the output column should be a Unicode string:
else if ((ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].DataType != DataType.DT_WSTR) ||
(ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].Length != 16))
{
ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output column data type must be (DT_WSTR, 16).",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISBROKEN;
}
}
return status;
}

///
/// Called before executing, to cache the buffer column indexes.
///
public override void PreExecute()
{
base.PreExecute();
// Get the index of each input column in the buffer:
IDTSInput90 input = ComponentMetaData.InputCollection[0];
inputColumnBufferIndex = new int[input.InputColumnCollection.Count];
for (int col = 0; col < input.InputColumnCollection.Count; col++)
{
inputColumnBufferIndex[col] = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[col].LineageID);
}
// Get the index of the output column in the buffer:
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
outputColumnBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
// Get the GID web service:
gidService = new GID.WS_PDF.PDFProcessService();
}

///
/// Called to process the buffer:
/// Get a new GID and save it in the output column.
///
///
///
public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
if (! buffer.EndOfRowset)
{
try
{
while (buffer.NextRow())
{
// Set the output column value to a new GID:
buffer.SetString(outputColumnBufferIndex, gidService.getGID());
}
}
catch (System.Exception ex)
{
bool cancel = false;
ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
throw new Exception("Could not process input buffer.");
}
}
}

///
/// Called after executing, to clean up.
///
public override void PostExecute()
{
base.PostExecute();
// Resign from the GID service:
gidService = null;
}
}
}

View 1 Replies View Related

SSIS Script Transformation: Loop Through Columns In A Row

Mar 17, 2008


HI,


How do I loop through all columns in a row using a script transformation? For example, if I want to trim all columns.


If I want to trim one column this is a simple script:



Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
        Row.City = Trim(Row.City)
    End Sub

End Class

But what if I want to do that for all columns? I don't want to name them all like this:

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
        Row.Column1 = Trim(Row.Column1)
        Row.Column2 = Trim(Row.Column2)
        Row.Column3 = Trim(Row.Column3)
        ...
        ...
        Row.Column997 = Trim(Row.Column997)
        Row.Column998 = Trim(Row.Column998)
        Row.Column999 = Trim(Row.Column999)
    End Sub

End Class



Is there a simple foreach column in Row.columns option?


-- Joost (Atos Origin)
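There is no built-in "for each column in Row" collection in the script component, but .NET reflection over the generated buffer class is a workable substitute, since every column is exposed as a property on it. A sketch (assuming all writable String columns should be trimmed, that the buffer class is the MyAddressInputBuffer used above, and that nullable columns are guarded via their generated _IsNull companions):

Imports System.Reflection

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
        For Each prop As PropertyInfo In Row.GetType().GetProperties( _
                BindingFlags.Instance Or BindingFlags.Public Or BindingFlags.DeclaredOnly)
            ' Only writable String columns; the Boolean _IsNull flags are filtered out by the type check,
            ' and columns marked ReadOnly have no setter, so CanWrite skips them too.
            If prop.PropertyType Is GetType(String) AndAlso prop.CanWrite Then
                ' Skip columns whose current value is NULL.
                Dim isNullProp As PropertyInfo = Row.GetType().GetProperty(prop.Name & "_IsNull")
                If isNullProp Is Nothing OrElse Not CBool(isNullProp.GetValue(Row, Nothing)) Then
                    prop.SetValue(Row, CStr(prop.GetValue(Row, Nothing)).Trim(), Nothing)
                End If
            End If
        Next
    End Sub

End Class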

View 11 Replies View Related

Retain Input Columns Through An Asynchronous Transformation?

Jan 23, 2008



Is there by chance a cunning way to make the input columns automatically populate the output of an asynchronous script transformation?

My transformation writes several rows for each input row read. I'm creating some new columns along the way but I'd like all of the input columns to get output each time also. However I can't see any obvious way to achieve this, short of manually defining each column to the output and populating it in the script.

View 3 Replies View Related

What Is The SSIS Solution To Matching Columns When Using The Lookup Transformation

Jan 9, 2008

How would you do the following in SSIS?

SELECT a.TestID,
       a.TestCode
FROM   TableA a
WHERE  UPPER(RTRIM(a.TestCode)) IN (SELECT UPPER(RTRIM(b.TestCode)) FROM TableB b)

Of course the above query is missing a few things, but in the ETL the WHERE clause's UPPER(RTRIM(...)) does not appear to be something that has an object or property I can use in the Lookup.

Please correct and educate me.
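The Lookup transformation has no place to wrap functions around the probe column, so the usual SSIS pattern is to normalise both sides before the match. On the pipeline side, add a Derived Column just ahead of the Lookup that produces a cleaned key, for example (SSIS expression syntax):

UPPER(TRIM(TestCode))

On the reference side, base the Lookup on a query rather than a table, something like SELECT TestID, UPPER(RTRIM(TestCode)) AS TestCode FROM TableB, then join on the cleaned columns. The table and column names here are taken from the query above; adjust as needed.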

View 4 Replies View Related

SSIS Lookup Transformation To Update Individual Columns

Mar 4, 2008

Hi,
I have an example situation that seems like it should have a super easy solution, but my jobs keep failing.
Here we go. . .

I have a SQL Server 2005 table as my source in a data flow task.
This table contains raw data.
We'll call it FACT_Product_Raw - which contains a field called ProductType varchar(1)
Let's say that ProductType contains values of "A" or "B" or "C" - or for that matter, some null and garbage values

I have a lookup table, LOV_Product_Types
This table contains 3 fields that will transform my raw data table
We'll call these fields ProdTypeID smallint, ProdTypeRaw varchar(1) and ProdType smallint
It contains pairs such that A = 1, B = 2, and so on.


Here's what I want to do.
I want to ADD a field to FACT_Product_Raw that contains the "looked up" value from LOV_Product_Types.
Let's say that I want to add the ProdTypeID field to my _Raw table.

I have used the _Raw table as both my source and destination
It blows up every time.
Help.
Thanks,
David
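Using FACT_Product_Raw as both the source and the destination of the same data flow is the most likely reason it keeps blowing up: the destination has to write to the very table the source is still reading. Two common alternatives: land the looked-up rows in a new or staging table instead of the _Raw table, or skip the data flow entirely and do the enrichment as a set-based update in an Execute SQL Task, roughly like this (a sketch built from the table and column names in the post; the ALTER TABLE is only needed if ProdTypeID does not already exist):

ALTER TABLE dbo.FACT_Product_Raw ADD ProdTypeID smallint NULL;

UPDATE r
SET    r.ProdTypeID = l.ProdTypeID
FROM   dbo.FACT_Product_Raw AS r
JOIN   dbo.LOV_Product_Types AS l
       ON l.ProdTypeRaw = r.ProductType;

Rows whose ProductType is NULL or garbage simply keep a NULL ProdTypeID, which can then be handled separately.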

View 5 Replies View Related

DT_NTEXT Pass Through Columns In Fuzzy Lookup Transformation

Sep 4, 2006

The documentation on the fuzzy lookup transform mentions that only columns of type DT_WSTR and DT_STR can be used in fuzzy matching. I interpreted this as meaning that you could not create a mapping between an input column of type DT_NTEXT and a column from the reference table. I assumed that you could still have a DT_NTEXT column as part of the input and mark this as a pass through column so that it's value could be inserted in the destination, together with the result of the lookup operation. Apparently this is not the case. Validation fails with the following message: 'The data type of column 'fieldname' is not supported.' First, I'd like to confirm that this is really the case and that I have not misinterpreted this limitation.

Finally, given the following situation

- A data source with input columns

Field_A DT_STR
Field_B DT_NTEXT

- A fuzzy lookup is used to match Field_A to a row in the reference table and obtain Field_C.

- Finally, Field_B and Field_C must be inserted into the destination.

Can anyone suggest how this could be achieved?

Fernando Tubio



View 5 Replies View Related

Audit Transformation Uses CURRENT Lengths For User Name And Machine Name Columns

Dec 2, 2007

I'll try to reproduce this later, but want to report it before I forget.

I just had my package fail on a VM I was testing on. It failed because on that machine, I logged in as MachineNameAdministrator instead of using my domain account (the VM is not in the domain).

This was a problem because the "User Name" column generated by the Audit Transformation was 17 characters long! This is the length of my domain + user name on my development machine. Similarly, the machine name length was 15 characters.

I'd love to know what the "correct" sizes are for these columns. In the meantime, I'm going to set these to 255 manually, and hope the size sticks.




P.S. There was one other post on this topic, though the thread isn't clear that this was the problem: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=472445&SiteID=1.

View 1 Replies View Related

Integration Services :: DQS Cleansing Transformation - Can't Deselect Input Columns

Apr 14, 2015

In SSIS I use the DQS Cleansing transformation component. I've got a knowledge base (KB) in place, and this KB holds various domains. My data source has more input columns than I would like to use for a particular clean-up operation; I want to use some of the input columns to map against some domains in the KB. It is my understanding that it should be possible to select only the required input columns, but all I can do is select all input columns.

View 3 Replies View Related






