Tips On Creating Output Columns In A Custom Transformation

Aug 14, 2007

I would like my transformation to automatically create an output column for each input column. Any tips? I can't seem to determine which event to listen to or method to override.
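One commonly cited approach (a minimal sketch, not tested against your component) is to override OnInputPathAttached, walk the virtual input, select each upstream column, and mirror it as an output column. It assumes one input and one output were already created in ProvideComponentProperties, and the "_Copy" suffix is just an illustrative naming convention:

public override void OnInputPathAttached(int inputID)
{
    base.OnInputPathAttached(inputID);

    IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
    IDTSVirtualInput90 virtualInput = input.GetVirtualInput();

    foreach (IDTSVirtualInputColumn90 vcol in virtualInput.VirtualInputColumnCollection)
    {
        // Select the upstream column so it flows into this component.
        SetUsageType(inputID, virtualInput, vcol.LineageID, DTSUsageType.UT_READONLY);

        // Mirror it as an output column with the same data type properties.
        IDTSOutputColumn90 outCol = output.OutputColumnCollection.New();
        outCol.Name = vcol.Name + "_Copy";
        outCol.SetDataTypeProperties(vcol.DataType, vcol.Length, vcol.Precision, vcol.Scale, vcol.CodePage);
    }
}

OnInputPathAttached fires when the user connects a path to the input, which is usually the earliest point at which the upstream columns are known.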

View 3 Replies



Creating A Custom Transformation Component Walkthrough

Apr 10, 2006

Microsoft published a "Creating a Custom Transformation Component" walkthrough at

http://www.microsoft.com/downloads/details.aspx?FamilyID=1c2a7dd2-3ec3-4641-9407-a5a337bea7d3&DisplayLang=en

Does anyone know where to get the Hands-On Lab Files mentioned?

Thanks

Alex

View 4 Replies View Related

Simple Custom Data Flow Transformation Doesn't Produce Any Output

Aug 30, 2006

I've built a simple custom data flow transformation component following the Hands On Lab (http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en) and the Books Online (ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/adc70cc5-f79c-4bb6-8387-f0f2cdfaad11.htm and ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/b694d21f-9919-402d-9192-666c6449b0b7.htm).

All it is supposed to do is create an output column and set its value to the result of calling a web service method (the transformation is synchronous). Everything seems fine, but when I run the data flow task that contains it, it doesn't generate any output. The Visual Studio debugger displays it as yellow, with 1,385 rows going into it, but the data viewer attached to its output is empty. The output metadata looks just like I expect: all of my input columns plus the new column, correctly typed. No validation or run-time warnings or errors are reported.

I'll include the entire C# file below, which only overrides the ProvideComponentProperties, Validate, PreExecute, ProcessInput, and PostExecute methods of the parent PipelineComponent class.

Since this is effectively a specialization of the DerivedColumn transformation, could I inherit from the class that implements the DC component instead of PipelineComponent? How do I even find out what that class is?

Thanks! Here's the code:
using System;
// using System.Collections.Generic;
// using System.Text;

using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CustomComponents
{
[DtsPipelineComponent(DisplayName = "GID", ComponentType = ComponentType.Transform)]
public class GidComponent : PipelineComponent
{
/// <summary>
/// Column indexes for faster processing.
/// </summary>
private int[] inputColumnBufferIndex;
private int outputColumnBufferIndex;

/// <summary>
/// The GID web service.
/// </summary>
private GID.WS_PDF.PDFProcessService gidService = null;

/// <summary>
/// Called to initialize/reset the component.
/// </summary>
public override void ProvideComponentProperties()
{
base.ProvideComponentProperties();
// Remove any existing metadata:
base.RemoveAllInputsOutputsAndCustomProperties();
// Create the input and the output:
IDTSInput90 input = this.ComponentMetaData.InputCollection.New();
input.Name = "Input";
IDTSOutput90 output = this.ComponentMetaData.OutputCollection.New();
output.Name = "Output";
// The output is synchronous with the input:
output.SynchronousInputID = input.ID;
// Create the GID output column (16-character Unicode string):
IDTSOutputColumn90 outputColumn = output.OutputColumnCollection.New();
outputColumn.Name = "GID";
outputColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 16, 0, 0, 0);
}

/// <summary>
/// Only 1 input and 1 output with 1 column is supported.
/// </summary>
/// <returns>The validation status.</returns>
public override DTSValidationStatus Validate()
{
bool cancel = false;
DTSValidationStatus status = base.Validate();
if (status == DTSValidationStatus.VS_ISVALID)
{
// The input and output are created above and should be exactly as specified
// (unless someone manually edited the persisted XML):
if (ComponentMetaData.InputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component accepts 1 Input.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component provides 1 Output.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection[0].OutputColumnCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output must be 1 column.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
// And the output column should be a Unicode string:
else if ((ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].DataType != DataType.DT_WSTR) ||
(ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].Length != 16))
{
ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output column data type must be (DT_WSTR, 16).",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISBROKEN;
}
}
return status;
}

/// <summary>
/// Called before executing, to cache the buffer column indexes.
/// </summary>
public override void PreExecute()
{
base.PreExecute();
// Get the index of each input column in the buffer:
IDTSInput90 input = ComponentMetaData.InputCollection[0];
inputColumnBufferIndex = new int[input.InputColumnCollection.Count];
for (int col = 0; col < input.InputColumnCollection.Count; col++)
{
inputColumnBufferIndex[col] = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[col].LineageID);
}
// Get the index of the output column in the buffer:
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
outputColumnBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
// Get the GID web service:
gidService = new GID.WS_PDF.PDFProcessService();
}

/// <summary>
/// Called to process the buffer:
/// Get a new GID and save it in the output column.
/// </summary>
/// <param name="inputID">ID of the input being processed.</param>
/// <param name="buffer">The pipeline buffer containing the rows.</param>
public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
if (! buffer.EndOfRowset)
{
try
{
while (buffer.NextRow())
{
// Set the output column value to a new GID:
buffer.SetString(outputColumnBufferIndex, gidService.getGID());
}
}
catch (System.Exception ex)
{
bool cancel = false;
ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
throw new Exception("Could not process input buffer.");
}
}
}

/// <summary>
/// Called after executing, to clean up.
/// </summary>
public override void PostExecute()
{
base.PostExecute();
// Resign from the GID service:
gidService = null;
}
}
}

View 1 Replies View Related

Can You Add Output Columns To The Script Transformation Editor On The Fly?

Jun 21, 2006

Can I add Output Columns to the Script Transformation Editor using code? I have to execute a SQL statement to determine the number of years we have data for an item, and then create the columns for the months in those years and populate them with the quantities. So my question is: can I create output columns in the Script Transformation Editor on the fly, that is, as it is being executed?

Any input will be good.

Thanks,

MShah

View 3 Replies View Related

Get A List Of Output Columns On Script Transformation

Feb 13, 2007

I am using a script component to transform data. In the script component I created a bunch of fields for the output. Is there any way to loop through that list of columns? Is there code I can use in the script component to access the names, data types, data etc?

I saw a lot of information on the OutputColumnCollection as part of some IDTSOutput90 thing (Greek to me). As best I can guess this is for creating your own new columns, but can I see what columns are already defined via the script interface?
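One hedged workaround (the IDTSOutput90 metadata isn't reachable from inside the 2005 script itself): reflect over the designer-generated buffer class, whose properties correspond to the columns defined in the editor. Shown in C# for brevity; the 2005 Script Component itself is VB.NET, and the translation is direct. Input0Buffer is the designer's default buffer class name and may differ in your package:

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Each column defined in the editor (including synchronous output columns)
    // surfaces as a property on the generated buffer class, plus a <Name>_IsNull flag.
    foreach (System.Reflection.PropertyInfo prop in Row.GetType().GetProperties())
    {
        if (prop.Name.EndsWith("_IsNull"))
            continue;                               // skip the null-flag helper properties

        string columnName = prop.Name;              // the column name as defined in the editor
        System.Type columnType = prop.PropertyType; // the .NET type (not the SSIS DT_* type)
        // ... inspect, log, or act on the column here ...
    }
}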

View 2 Replies View Related

Creating Error Output For Custom Components

Feb 14, 2006

Hi,

I have a 2 custom components - source and destination.

I want to create an error output for each, to allow the users of my component to handle errors the way they choose.

I only found a property in IDTSOutput90 named IsErrorOut - a boolean property indicating whether this output is an error output or not.

Does anyone have additional documentation / articles / code samples regarding how to really populate the rows in the error output?
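For what it's worth, the pattern for a synchronous error output (the style the OLE DB Destination uses) looks roughly like the sketch below. It is a sketch, not a drop-in implementation, and the error code passed to DirectErrorRow is left as 0 here; for a source component the error output is asynchronous instead (SynchronousInputID = 0) and error rows are added to the error buffer in PrimeOutput.

// In ProvideComponentProperties, after creating the input:
IDTSOutput90 output = ComponentMetaData.OutputCollection.New();
output.Name = "Output";
output.SynchronousInputID = input.ID;
output.ExclusionGroup = 1;                    // rows must now be explicitly directed

AddErrorOutput("ErrorOutput", input.ID, 1);   // same exclusion group as the main output
ComponentMetaData.UsesDispositions = true;

// In ProcessInput:
int errorOutputID = -1, errorOutputIndex = -1;
GetErrorOutputInfo(ref errorOutputID, ref errorOutputIndex);
int defaultOutputID = ComponentMetaData.OutputCollection["Output"].ID;

while (buffer.NextRow())
{
    try
    {
        // ... normal row handling ...
        buffer.DirectRow(defaultOutputID);            // keep the row on the main path
    }
    catch (Exception)
    {
        buffer.DirectErrorRow(errorOutputID, 0, 0);   // supply a real error code / error column here
    }
}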

Thanks

View 1 Replies View Related

When To Create Columns And Metadata For Custom Asynchronous Component Output

Apr 17, 2006

I'm having a tad bit of trouble getting output from an asynchronous component that I've written and am looking for some insight.

This component takes in a name string passed from upstream and parses the name components into standardized output fields. I'm using an asynchronous component because if the name string contains two names ("Fred & Wilma Flintstone") I'm outputting one row for Fred and one for Wilma. I've gotten it to run and with debugging have observed what appeared to me to be proper execution, but zero rows are flowing out of it.

In my ProvideComponentProperties method, I add the three fields and their associated metadata to the OutputColumnCollection. Is this method where this should occur? It's before the PrimeOutput method, so I didn't know if I should be creating the output columns in ProcessInput (i.e., after the output buffer is provided by PrimeOutput).

In ProcessInput, I'm using AddRow for each input row (and another if it contains a second name), setting the value for each index using the buffer's SetString method, to no avail. I can observe it to this point, but then don't know what's in that output buffer (whether I'm using the wrong buffer index value, etc.).
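For what it's worth, ProvideComponentProperties (or another design-time method) is the usual place to create output columns; PrimeOutput only hands over the buffer at run time. A minimal sketch of the asynchronous pattern, with hypothetical column names and indexes, is below. A frequent cause of "zero rows out" is never calling SetEndOfRowset on the output buffer:

public override void ProvideComponentProperties()
{
    base.ProvideComponentProperties();
    RemoveAllInputsOutputsAndCustomProperties();

    IDTSInput90 input = ComponentMetaData.InputCollection.New();
    input.Name = "Input";

    IDTSOutput90 output = ComponentMetaData.OutputCollection.New();
    output.Name = "Output";
    output.SynchronousInputID = 0;     // 0 = asynchronous output with its own buffer

    IDTSOutputColumn90 firstName = output.OutputColumnCollection.New();
    firstName.Name = "FirstName";      // hypothetical; repeat for the other name parts
    firstName.SetDataTypeProperties(DataType.DT_WSTR, 50, 0, 0, 0);
}

private PipelineBuffer outputBuffer;   // handed over in PrimeOutput
private int firstNameIndex;            // resolve in PreExecute with
                                       // BufferManager.FindColumnByLineageID(output.Buffer, column.LineageID)

public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)
{
    outputBuffer = buffers[0];
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    while (buffer.NextRow())
    {
        outputBuffer.AddRow();                            // one row out (add a second for the second name)
        outputBuffer.SetString(firstNameIndex, "Fred");   // parsed value goes here
    }

    if (buffer.EndOfRowset)
        outputBuffer.SetEndOfRowset();                    // signal that no more rows will be produced
}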

Thanks.

View 3 Replies View Related

Create Output Columns Based On Input In Custom Component

Aug 28, 2007



I'm trying to create a fairly simple custom transform component (because I've read that's the easiest one to create) which will take one column from a flat file source and based on the first row create the output columns.
I'm actually trying to write a component that will solve the now well known problem with parsing CSV files in SSIS. I have a lot of source files and all have many columns so a component that can read in the first line from the CSV file and create the output columns automatically will save me lots of time when migrating the old DTS packages.

I have the basic component set up but I'm stuck when trying to override the OnInputPathAttached method because I don't know how to use the inputID to get the first line from the input (the buffer).
Are there any good examples for creating output columns dynamically based on the input buffer?
Should I just give up on on the transform and create a custom source component instead?
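The pipeline buffer doesn't exist at design time, so OnInputPathAttached can't read data rows; one hedged workaround is to read the header line straight from the file inside ReinitializeMetaData (or a custom UI) and build the output columns from it. In this sketch, "HeaderFilePath" is a hypothetical custom property you would add in ProvideComponentProperties, and DT_WSTR 255 is an arbitrary default width:

public override void ReinitializeMetaData()
{
    base.ReinitializeMetaData();

    // Hypothetical custom property holding the path of the CSV file to sample.
    string path = (string)ComponentMetaData.CustomPropertyCollection["HeaderFilePath"].Value;

    string headerLine;
    using (System.IO.StreamReader reader = new System.IO.StreamReader(path))
    {
        headerLine = reader.ReadLine();   // first line = column names
    }

    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
    output.OutputColumnCollection.RemoveAll();

    foreach (string columnName in headerLine.Split(','))
    {
        IDTSOutputColumn90 col = output.OutputColumnCollection.New();
        col.Name = columnName.Trim();
        col.SetDataTypeProperties(DataType.DT_WSTR, 255, 0, 0, 0);
    }
}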

View 5 Replies View Related

Displaying Custom Properties For Custom Transformation In Custom UI

Mar 8, 2007

Hi,

I am creating a custom transformation component, and a custom user interface for that component.

In my custom UI, I want to show the custom properties, and allow users to edit these properties similar to how the advanced editor shows the properties.

I know in my UI I need to create a "Property Grid". In the properties of this grid, I can select the object I want to display data for; however, the only objects that appear are the objects that I have already created within this UI, and not the actual component object with the custom properties.

How do I go about getting the properties for my transformation component listed in this property grid?

I am writing in C#.
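The design-time list for the PropertyGrid's SelectedObject only offers objects that live on the form; the component metadata only arrives through your IDtsComponentUI implementation, so it has to be handed to the grid in code. One hedged approach is a small wrapper whose .NET properties read and write the component's custom properties ("KeyName" here is a hypothetical property name):

// Wrapper exposing custom properties as ordinary .NET properties for a PropertyGrid.
public class ComponentPropertiesWrapper
{
    private readonly IDTSComponentMetaData90 metaData;

    public ComponentPropertiesWrapper(IDTSComponentMetaData90 metaData)
    {
        this.metaData = metaData;
    }

    public string KeyName
    {
        get { return (string)metaData.CustomPropertyCollection["KeyName"].Value; }
        set { metaData.CustomPropertyCollection["KeyName"].Value = value; }
    }
}

// In the UI form, after receiving the metadata from the IDtsComponentUI implementation:
propertyGrid1.SelectedObject = new ComponentPropertiesWrapper(componentMetaData);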

View 5 Replies View Related

Number Of ROWS Of Output Of Aggregate Transformation Sometimes Doesn't Match The Output From T-SQL Query

Dec 25, 2006

While using the Aggregate Transformation to group by one column, the number of output rows is sometimes larger than the number of rows returned by a T-SQL statement via SSMS.

For example, the output of the Aggregate Transformation may be 960216, but the 'Select Count(Orderid) From ... Group By ***' T-SQL statement returns 96018*.

I'm sure the Group By of the Aggregate Transformation is right!



But when I set the "keyscale" property of the transformation, the results match!

In my opinion, the "keyscale" property should just affect the performance of the transformation, not its result.

Thanks for your advice.

View 2 Replies View Related

Using Output From A Stored Procedure As An Output Column In The OLE DB Command Transformation

Dec 8, 2006

I am working on an OLAP modeled database.

I have a Lookup Transformation that matches the natural key of a dimension member and returns the dimension key for that member (surrogate key pipeline stuff).

I am using an OLE DB Command as the Error flow of the Lookup Transformation to insert an "Inferred Member" (new row) into a dimension table if the Lookup fails.

The OLE DB Command calls a stored procedure (dbo.InsertNewDimensionMember) that inserts the new member and returns the key of the new member (using scope_identity) as an output.

What is the syntax in the SQL Command line of the OLE DB Command Transformation to set the output of the stored procedure as an Output Column?

I know that I can 1) add a second Lookup with "Enable memory restriction" on (no caching) in the Success data flow after the OLE DB Command, 2) find the newly inserted member, and 3) Union both Lookup results together, but this is a large dimension table (several million rows) and searching for the newly inserted dimension member seems excessive, especially since I have the ID I want returned as output from the stored procedure that inserted it.

Thanks in advance for any assistance you can provide.

View 9 Replies View Related

Is There A Site Or A Document For Query Optimization Tips And TSQL Coding Tips?

May 9, 2008

Hi.

My team and I are soon going to work on a performance-critical application. My team has some experience of writing SQL; however, we have not done performance-oriented coding.

I am looking for a comprehensive document which lists information for writing good, well-performing SQL. Please let me know if there is such a document or web site.


Thanks,
Prasad

View 1 Replies View Related

Custom Component (Transformation) Questions

May 27, 2008

I've been trying to figure this out on my own for pretty much all of today, and part of last week. I've downloaded samples, searched this forum, blogs, etc. So I figured I would post, since it's the end of the day, and I'm not much further along.

I'm working on a custom transformation component, whose main function is to use SQL encryption/decryption to encrypt/decrypt data from the input columns, into the output columns. The component needs two strings, a key name and a certificate name, as well as the connection manager it should use to connect to SQL which will do the encryption/decryption.

Here's where I'm stuck:

1) How can I provide the key/certificate names via properties? What I'm expecting/looking for is a way to add these two properties at the component level, which would show up under the "Custom Properties" section of the properties pane (currently, this only has one property, "UserComponentTypeName"). These key/certificate values will be used for all input columns.

2) How do I access the connection managers from within the component? What is the best way to go about using a connection manager from within my component to connect to SQL and perform the encryption/decryption? In a custom task, this was fairly simple, but it seems that same concept won't work on a transformation component.

3) Is there a better way to go about accomplishing this (column encryption via SQL from within SSIS)? Am I going about this all wrong?

As I said, I've searched for direction, but there seems to be next to nothing in the way of a good reference for creating custom transformation components. I've looked at two MS samples, but can't seem to make any sense out of them.
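A hedged sketch covering points 1 and 2 above (the property and connection names are illustrative, and the cast assumes an ADO.NET/SqlClient connection manager is assigned in the designer):

public override void ProvideComponentProperties()
{
    base.ProvideComponentProperties();

    // 1) Component-level custom properties (appear under "Custom Properties").
    IDTSCustomProperty90 keyName = ComponentMetaData.CustomPropertyCollection.New();
    keyName.Name = "KeyName";
    keyName.Description = "Name of the SQL key used for encryption/decryption.";

    IDTSCustomProperty90 certName = ComponentMetaData.CustomPropertyCollection.New();
    certName.Name = "CertificateName";
    certName.Description = "Name of the SQL certificate.";

    // 2) Declare that the component needs a connection manager.
    IDTSRuntimeConnection90 conn = ComponentMetaData.RuntimeConnectionCollection.New();
    conn.Name = "SqlConnection";
}

private System.Data.SqlClient.SqlConnection sqlConnection;

public override void AcquireConnections(object transaction)
{
    // The user assigns a connection manager to "SqlConnection" in the designer.
    IDTSRuntimeConnection90 conn = ComponentMetaData.RuntimeConnectionCollection["SqlConnection"];
    sqlConnection = (System.Data.SqlClient.SqlConnection)conn.ConnectionManager.AcquireConnection(transaction);
}

public override void ReleaseConnections()
{
    if (sqlConnection != null)
        ComponentMetaData.RuntimeConnectionCollection["SqlConnection"].ConnectionManager.ReleaseConnection(sqlConnection);
}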

Thanks in advance.
Jerad

View 3 Replies View Related

Custom Transformation Component Tutorial

Aug 14, 2007



Hi all,

Is there any tutorial to learn how custom transformation component works? maybe a blog, pdf or something...
Specifically, I need to learn how to generate an output column composed from 3 input columns. The problem is I don't know how to set the column value... does anyone have some sample code?
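A minimal sketch of the synchronous pattern, assuming the three input columns and one DT_WSTR output column were already defined at design time - the buffer indexes are resolved from lineage IDs in PreExecute and the value is set with SetString in ProcessInput:

private int col1Index, col2Index, col3Index, outIndex;

public override void PreExecute()
{
    base.PreExecute();
    IDTSInput90 input = ComponentMetaData.InputCollection[0];
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];

    col1Index = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[0].LineageID);
    col2Index = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[1].LineageID);
    col3Index = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[2].LineageID);
    outIndex  = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    while (buffer.NextRow())
    {
        // Concatenate the three inputs into the output column (synchronous output).
        string combined = buffer.GetString(col1Index) + buffer.GetString(col2Index) + buffer.GetString(col3Index);
        buffer.SetString(outIndex, combined);
    }
}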

Thanks!

View 6 Replies View Related

Custom DataFlow Transformation Not Showing Up In Toolbox

Nov 27, 2006

I've created a custom data flow tranformation and it isn't showing up in the Tool Box Items to be added under the Data Flow Items tab (right click on tool box, 'Choose Items...', then clicked Data Flow Items).

I have done the following:

signed the assembly,
added to GAC,
copied the dll to C:\Program Files\Microsoft SQL Server\90\DTS\PipelineComponents.

It worked previously when I was just starting out; however, now I cannot see it. What would cause it to not show up? Everything compiles fine. How would I determine how to fix it so that it shows up?

View 1 Replies View Related

Using Expressions In Custom Data Flow Transformation

Sep 18, 2006

Hi,

I'm creating a custom data flow transformation in c#.

I would like to use expressions within this component in the same way as in the derived column component: specifying the expression as a custom property of an output column, then evaluating this expression for each row of the buffer and using this evaluated expression to populate my output column values.

So I've added a custom property on my output column and set its expression type to CPET_NOTIFY:

IDTSCustomProperty90 exp = col.CustomPropertyCollection.New();
exp.Name = "Expression";
exp.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;

But in the ProcessInput method I don't manage to get the evaluated expressions; when I use exp.Value I get my expression definition and not its evaluation.

Is there a way to get these evaluated expressions ?

Thanks,

Stéphane

View 6 Replies View Related

Custom Data Flow Transformation - Predefined Inputs

May 16, 2008

Hi,

I've created my own custom data flow transformation task (using C#) that will parse a fullname and output the various name parts. In the ProvideComponentProperties method, I create 5 output columns (prefix, first, middle, last, and suffix). In the ProcessInput method, I parse the input and add the name parts to the buffer. The bad thing is that I'm making an assumption on the position of the Full Name input column within the buffer.

I would like the "user" to be able to map their "full name" input column to a known Full Name column so I don't have to make any assumptions. This is the first SSIS task I've tried to create and I haven't been able to find very many examples online.
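One hedged way to avoid the positional assumption is to expose a hypothetical "FullNameColumn" custom property (created in ProvideComponentProperties and set by the user to the name of their column), then resolve the buffer index from the lineage ID in PreExecute:

private int fullNameIndex;

public override void PreExecute()
{
    base.PreExecute();

    // Hypothetical custom property naming the mapped input column.
    string fullNameColumnName = (string)ComponentMetaData.CustomPropertyCollection["FullNameColumn"].Value;
    IDTSInput90 input = ComponentMetaData.InputCollection[0];

    foreach (IDTSInputColumn90 column in input.InputColumnCollection)
    {
        if (column.Name == fullNameColumnName)
        {
            // Resolve the buffer index from the lineage ID rather than assuming a position.
            fullNameIndex = BufferManager.FindColumnByLineageID(input.Buffer, column.LineageID);
            break;
        }
    }
}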

Any help is greatly appreciated!

Thank you,

Marshall

View 1 Replies View Related

Custom Data Flow Transformation Losing CustomProperties

May 12, 2006

I created a custom transform that has a custom interface and is a wizard that uses a web service. It creates custom properties and output columns on the fly. I set the dialog result to OK and close at the end of the steps. The transform then has the custom fields and output columns I created in the wizard. I've verified this by right-clicking on the transform and going to the advanced editor.

If I then immediately run the package, the custom fields don't exist in the CustomPropertiesCollection. If I close the package and reopen it, the properties are now gone. If I then go through the wizard again, thus recreating the properties, they stay and don't disappear. The quickest way to get a working transform is to add it to my data flow, then save, close and reopen the package, and then go through the wizard. Just saving after I add the transform does not help.

Does anyone know what might be causing this very strange problem?

View 7 Replies View Related

How To Get The Output Column In OLE DB Command Transformation

Jul 3, 2006



Hi,

I am writing a data flow task which will take a particular column from the source table, and I am passing the column value in the SQL command property. My SQL command will look like this:

Select SerialNumber From SerialNumbers Where OrderID = @OrderID

If I go and check the output column in the Input and Output Properties tab, I am not able to see this SerialNumber column in the output column tree, so I can't access this column in the next transformation component.

Please help me.

Thanks in advance.





View 13 Replies View Related

Error Output For A Destination Transformation

Jun 16, 2006

I am developing a custom destination component and I have encountered a few areas where there seems to be a lack of helpful documentation and examples.

1. I have not been able to find any information on or examples of creating custom destinations with an error output. The OLE DB Destination has an error output so I investigated the input and error output properties in the advanced editor and found that the OLE DB Destination error output is synchronous with the input (its SynchronousInputID matches the input's ID) and has its ExclusionGroup value set to 1. Using this information, I modeled my error output after the OLE DB Destination.

ProvideComponentProperties:
AddErrorOutput(ERROR_OUTPUT_NAME, input.ID, 1);

ProcessInput:
int errorOutputID = -1;
int errorOutputIndex = -1;
GetErrorOutputInfo(ref errorOutputID, ref errorOutputIndex);
...
buffer.DirectErrorRow(errorOutputID, 0, errorOutputIndex);

Checking the input and error output properties in the advanced editor for my custom destination component I find the following:
Input
-----
ID: 3515

Error Output
------------
ExclusionGroup: 1
ID: 3516
IsErrorOut: True
SynchronousInputID: 3515

Shortly after I start my SSIS package and it encounters an error row, I get the following exception:
[My Destination Adapter 1 [3512]] Error: System.ArgumentException: Value does not fall within the expected range. at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSBuffer90.DirectErrorRow(Int32 hRow, Int32 lOutputID, Int32 lErrorCode, Int32 lErrorColumn) at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.DirectErrorRow(Int32 outputID, Int32 errorCode, Int32 errorColumn) at MyDestination.ProcessInput(Int32 inputID, PipelineBuffer buffer) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)


2. My custom destination component is used for writing a file with a fixed schema. I followed the means by which source component examples add their output columns, but applied this to my external metadata columns. In my Validate() I check if the ExternalMetadataColumnCollection.Count == 0 and return DTSValidationStatus.VS_NEEDSNEWMETADATA; to force a call to ReinitializeMetaData(). In ReinitializeMetaData() I call a method that creates the input's external metadata columns that reflect my external data source.

This works fine except every time I add my custom destination component to an SSIS package and go to edit the component, I am greeted with a dialog box that states: "The component is not in a valid state. ... Do you want the component to fix these errors automatically?" Pressing the Yes button, I assume, makes the call to ReinitializeMetaData() and I have my external metadata columns. Where is the correct place to add the external metadata columns so the user does not have to take this extra step every time they add my component to their package?
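One hedged option for question 2, since the destination's schema is fixed: create the external metadata columns up front in ProvideComponentProperties rather than only in ReinitializeMetaData, so the component never starts out needing new metadata. A sketch with a hypothetical column:

public override void ProvideComponentProperties()
{
    base.ProvideComponentProperties();
    RemoveAllInputsOutputsAndCustomProperties();

    IDTSInput90 input = ComponentMetaData.InputCollection.New();
    input.Name = "Input";
    input.ExternalMetadataColumnCollection.IsUsed = true;

    // The file schema is fixed, so the external metadata can be built here.
    IDTSExternalMetadataColumn90 col = input.ExternalMetadataColumnCollection.New();
    col.Name = "SomeFixedColumn";         // hypothetical column from the fixed file layout
    col.DataType = DataType.DT_WSTR;
    col.Length = 50;
}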

View 5 Replies View Related

Oledb Command Transformation Output

Feb 13, 2008



Hi,

I have two tables,

Table A on Server 1 (3 ROWS)
ID Name Address
ID1 A B
ID2 X Y
ID3 M N

There is another table on a different server which looks like

Table B on Server 2
PKColumn ID Details
1 ID1 Desc1
2 ID1 Desc2
3 ID1 Desc 3
4 ID2 Desc
5 ID2 Description

As you can see the ID is the common column for these two tables,
I want to get the Query the above 2 tables and the output should be dumped into a new table on Server2.

I am using the following SSIS Package

OledbDataSource-------> OledbCommand(Select * from TableB where ID =?)

From here, how can I insert the rows returned from the OLE DB Command into another table? Since it will return some output rows for each row of Table A, how can I insert all of these into the new table?

Please help with configuring the output of the OLE DB Command.

Thanks,



View 5 Replies View Related

Output Param In Oledb Transformation That Calls An Sp

Feb 17, 2008

Is it true that I will not be able to use the returned value from an sp that is called on every row from an OLE DB Command transformation? I see all kinds of complaints on the web but can't determine if this would be a waste of time. I'd like to append the returned value (which is calculated and cannot be joined in the buffer) to the data on its way out of the transformation.

View 3 Replies View Related

Programmatically Creating Transformation Script Component

Nov 20, 2006

Does anyone have any examples of programmatically creating a Transformation Script Component (or Source/Destination) in the dataflow? I have been able to create other Transforms for the dataflow like Derived Column, Sort, etc. but for some reason the Script Component doesn't seem to work the same way.

I have done it as below, trying many ways to get the ComponentClassID, including the AssemblyQualifiedName and the GUID as well. No matter what I do, when it hits ProvideComponentProperties, I get Exception from HRESULT: 0xC0048021.

IDTSComponentMetaData90 scriptPropType = dataFlow.ComponentMetaDataCollection.New();
scriptPropType.Name = "Transform Property Type";
scriptPropType.ComponentClassID = "DTSTransform.ScriptComponent";
// have also tried scriptPropType.ComponentClassID = typeof(Microsoft.SqlServer.Dts.Pipeline.ScriptComponent).AssemblyQualifiedName;
scriptPropType.Description = "Transform Property Type";

CManagedComponentWrapper instance2 = scriptPropType.Instantiate();
instance2.ProvideComponentProperties();



Any help or examples would be greatly appreciated! Thanks!

View 24 Replies View Related

Integration Services :: Merge Join Transformation - No Output Rows Redux

Aug 4, 2009

I am using SSIS in SQL Server Enterprise 2005. I have two OLE DB data sources from two disparate databases (IBM DB2 and Microsoft SQL Server), some columns from each of which are to be included in the merged output results. I have noted the various requirements in the forum postings with regard to sorting the OLE DB sources and specifying the output source columns as being sorted, as well as the requirement that the join fields in the two sources be close/exact matches. Yet, when I run this in VS, while the work area reflects the expected number of rows being input into the Merge Join transformation, no count is reflected as output from that transformation into the final destination table. Specifically, my two data sources (IBM DB2 and MS SQL) are configured as follows:

The IBM DB2 source contains an SQL statement that uses Cast operations to create the result columns and an ORDER BY clause to ensure that the output is sorted by the desired two columns. The OLE DB source property IsSorted is set to true; the Output Columns folder column definitions for "key_source_dtsy" and "key_source_dtrt" have their SortKeyPosition properties set to 1 and 2, respectively. Those fields are both defined as data type DT_STR, with lengths of 4 and 2, respectively. Below is the path metadata from the Data Flow Path editor for the path from this source:

IBM DB2 source"Name" "Data Type" "Precision" "Scale" "Length" "Code Page" "Sort Key Position" "Comparison Flags" "Source
Component""ID_CODE" "DT_STR" "0" "0" "10" "1252" "0" "" "Source F0005 User Defined Codes""CODE_DESCR_1" "DT_STR" "0" "0" "30" "1252" "0" "" "Source F0005 User Defined Codes""CODE_DESCR_2" "DT_STR" "0" "0" "30" "1252" "0" "" "Source F0005 User Defined Codes""key_source_dtsy" "DT_STR" "0" "0" "4" "1252" "1" "" "Source F0005 User Defined Codes""key_source_dtrt" "DT_STR" "0" "0" "2" "1252" "2" "" "Source F0005

User Defined Codes:

The MS SQL source contains an SQL statement that takes the columns as they are in the MS SQL table (no Cast operations needed); it also uses an ORDER BY clause to ensure the output is sorted by the join columns. The OLE DB source property IsSorted is set to true; the Output Columns folder columns for "key_source_dtsy" and "key_source_dtrt" have their SortKeyPosition properties set to 1 and 2, respectively. Those fields are both defined as data type DT_STR, with lengths of 4 and 2, respectively. Below is the path metadata from the Data Flow Path editor for the path from this source:

MS SQL source"Name" "Data Type" "Precision" "Scale" "Length" "Code Page" "Sort Key Position" "Comparison Flags" "Source Component""id_code_name" "DT_I2" "0" "0" "0" "0" "0" "" "Source CodeName in db dwVdFY""key_source_dtsy" "DT_STR" "0" "0" "4" "1252" "1" "" "Source CodeName in db dwVdFY""key_source_dtrt" "DT_STR" "0" "0" "2" "1252" "2" "" "Source CodeName in db dwVdFY"

The Merge Join transformation specifies an INNER JOIN using the columns named "key_source_dtsy" and "key_source_dtrt" from the respective data sources. I know there are alternative ways of accomplishing my intent (Lookup, port the MS SQL table to IBM DB2 so the join can occur in the SELECT statement, etc.); however, I'd like to use this functionality and assume that it should work.

View 13 Replies View Related

SQL Server 2014 :: Creating A Table With Updatable Columns And Read-only Columns

May 26, 2015

Here is my requirement; I'm not sure if this is possible. I am creating a table called Master with columns col1, col2, col3, col4, col5..., where col1 and col2 are updatable - this can be done easily.

Col3 and col4 are columns in another table, but these should be read-only - is this possible? This is possible with a view, but that is not friendly with SharePoint CRUD. Col5 is a computed column of col2 and col5 - if the above step can be done, then sure, this can be done I guess.

View 4 Replies View Related

SSIS Script Transformation: Loop Through Columns In A Row

Mar 17, 2008


HI,


How do I loop through all columns in a row using a script
transformation? For example, if I want to trim all columns.


If I want to trim one column this is a simple script:



Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
        Row.City = Trim(Row.City)
    End Sub

End Class



But what if I want to do that for all columns? I don't want to name
them all like this:



Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
        Row.Column1 = Trim(Row.Column1)
        Row.Column2 = Trim(Row.Column2)
        Row.Column3 = Trim(Row.Column3)
        ...
        Row.Column997 = Trim(Row.Column997)
        Row.Column998 = Trim(Row.Column998)
        Row.Column999 = Trim(Row.Column999)
    End Sub

End Class



Is there a simple foreach column in Row.columns option?
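One hedged trick is to override ProcessInput itself in the script, which exposes the raw PipelineBuffer so every column can be visited by index (this bypasses the generated ProcessInputRow). Shown in C# for brevity; the VB.NET equivalent for the 2005 Script Component is a direct translation. Note that the loop touches every column in the buffer, not only the columns selected as input columns:

public override void ProcessInput(int InputID, PipelineBuffer Buffer)
{
    while (Buffer.NextRow())
    {
        for (int i = 0; i < Buffer.ColumnCount; i++)
        {
            // Trim every non-null string column in place.
            if (!Buffer.IsNull(i) && Buffer[i] is string)
            {
                Buffer[i] = ((string)Buffer[i]).Trim();
            }
        }
    }
}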


-- Joost (Atos Origin)

View 11 Replies View Related

Retain Input Columns Through An Asynchronous Transformation?

Jan 23, 2008



Is there by chance a cunning way to make the input columns automatically populate the output of an asynchronous script transformation?

My transformation writes several rows for each input row read. I'm creating some new columns along the way but I'd like all of the input columns to get output each time also. However I can't see any obvious way to achieve this, short of manually defining each column to the output and populating it in the script.
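One hedged workaround is reflection: if the asynchronous output defines columns with the same names as the input columns, they can be copied generically. Input0Buffer/Output0Buffer are the designer defaults, and the output columns must still be defined (once) in the editor; shown in C#, with the VB.NET translation being direct:

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    Output0Buffer.AddRow();

    foreach (System.Reflection.PropertyInfo inProp in Row.GetType().GetProperties())
    {
        if (inProp.Name.EndsWith("_IsNull"))
            continue;                                      // skip the null-flag helpers

        // Skip NULL values: reading a NULL buffer column throws.
        System.Reflection.PropertyInfo nullFlag = Row.GetType().GetProperty(inProp.Name + "_IsNull");
        if (nullFlag != null && (bool)nullFlag.GetValue(Row, null))
            continue;

        // Copy to the identically named output column, if one exists.
        System.Reflection.PropertyInfo outProp = Output0Buffer.GetType().GetProperty(inProp.Name);
        if (outProp != null && outProp.CanWrite)
            outProp.SetValue(Output0Buffer, inProp.GetValue(Row, null), null);
    }
}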

View 3 Replies View Related

Error OutPut In Custom Source Component

May 11, 2006

For the Custome source Component ErrorOutput, should I go for asynchronous / synchronous Output.

If i go for synchronous output

// Create the error output.
IDTSOutput90 errorOutput = ComponentMetaData.OutputCollection.New();
errorOutput.IsErrorOut = true;
errorOutput.Name = "ErrorOutput";
errorOutput.SynchronousInputID = What Id is required here;
errorOutput.ExclusionGroup = 1;


Is it the IDTSInput90 Input.ID or the IDTSOutput90 Output.ID that should be assigned?

Thanks Regards

Anil

View 5 Replies View Related

What Is The SSIS Solution To Matching Columns When Using The Lookup Transformation

Jan 9, 2008

How would you do the following in SSIS?

SELECT a.TestID,
a.TestCode
FROM TableA a
WHERE UPPER(RTRIM(a.TestCode)) IN (SELECT UPPER(RTRIM(b.TestCode)) FROM TableB b)

Of course the above query is missing a few things, but with ETL the WHERE clause's UPPER(RTRIM(...)) does not appear to be something that has an object or property that I can use in the Lookup.

Please correct and educate me.

View 4 Replies View Related

SSIS Lookup Transformation To Update Individual Columns

Mar 4, 2008

Hi,
I have an example situation that seems like it should have a super easy solution, but my jobs keep failing.
Here we go. . .

I have a SQL Server 2005 table as my source in a data flow task.
This table contains raw data.
We'll call it FACT_Product_Raw - which contains a field called ProductType varchar(1)
Let's say that ProductType contains values of "A" or "B" or "C" - or for that matter, some null and garbage values

I have a lookup table, LOV_Product_Types
This table contains 3 fields that will transform my raw data table
We'll call these fields ProdTypeID smallint, ProdTypeRaw varchar(1) and ProdType smallint
It contains pairs such that A = 1, B = 2, and so on.


Here's what I want to do.
I want to ADD a field to FACT_Product_Raw that contains the "looked up" value from LOV_Product_Types.
Let's say that I want to add the ProdTypeID field to my _Raw table.

I have used the _Raw table as both my source and destination
It blows up every time.
Help.
Thanks,
David

View 5 Replies View Related

DT_NTEXT Pass Through Columns In Fuzzy Lookup Transformation

Sep 4, 2006

The documentation on the fuzzy lookup transform mentions that only columns of type DT_WSTR and DT_STR can be used in fuzzy matching. I interpreted this as meaning that you could not create a mapping between an input column of type DT_NTEXT and a column from the reference table. I assumed that you could still have a DT_NTEXT column as part of the input and mark this as a pass through column so that it's value could be inserted in the destination, together with the result of the lookup operation. Apparently this is not the case. Validation fails with the following message: 'The data type of column 'fieldname' is not supported.' First, I'd like to confirm that this is really the case and that I have not misinterpreted this limitation.

Finally, given the following situation

- A data source with input columns

Field_A DT_STR
Field_B DT_NTEXT

- A fuzzy lookup is used to match Field_A to a row in the reference table and obtain Field_C.

- Finally, Field_B and Field_C must be inserted into the destination.

Can anyone suggest how this could be achieved?

Fernando Tubio



View 5 Replies View Related

Adding Error Output To Custom Source Component

Dec 6, 2007

Hi all,
I saw a couple of other posts here on this topic, but none quite got to my issue.
I'm trying to add error output to a custom source component (not a script, a custom component). The samples all seem to deal with transform components, and my issues seem to be unique to source components.

I have the following code related to error handling ...

Public Overloads Overrides Sub ProvideComponentProperties()
    ...
    Dim output As IDTSOutput90 = ComponentMetaData.OutputCollection.New
    output.Name = OUTPUTCOLUMNNAME
    output.ExternalMetadataColumnCollection.IsUsed = True
    ComponentMetaData.UsesDispositions = True
    output.ErrorRowDisposition = DTSRowDisposition.RD_NotUsed
    output.ErrorOrTruncationOperation = "Something got truncated or blew up"

    Dim errorOutput As IDTSOutput90 = ComponentMetaData.OutputCollection.New
    errorOutput.Name = ERRORCOLUMNNAME
    errorOutput.IsErrorOut = True
    ...
End Sub


Public Overloads Overrides Sub ReinitializeMetaData()

    Dim output As IDTSOutput90 = ComponentMetaData.OutputCollection(OUTPUTCOLUMNNAME)
    Dim outColumn As IDTSOutputColumn90 = output.OutputColumnCollection.New
    outColumn.Name = strName
    outColumn.SetDataTypeProperties(DataType.DT_I4, 0, 0, 0, 0)

    Dim metadataColumn As IDTSExternalMetadataColumn90 = output.ExternalMetadataColumnCollection.New
    metadataColumn.Name = outColumn.Name
    metadataColumn.DataType = outColumn.DataType
    metadataColumn.Precision = outColumn.Precision
    metadataColumn.Length = outColumn.Length
    metadataColumn.Scale = outColumn.Scale
    metadataColumn.CodePage = outColumn.CodePage
    outColumn.ExternalMetadataColumnID = metadataColumn.ID

    outColumn.ErrorRowDisposition = DTSRowDisposition.RD_NotUsed
    outColumn.ErrorOrTruncationOperation = "Something Truncated!"
    outColumn.TruncationRowDisposition = DTSRowDisposition.RD_NotUsed

    Dim errorOutput As IDTSOutput90 = ComponentMetaData.OutputCollection(ERRORCOLUMNNAME)
    Dim errorColumn As IDTSOutputColumn90 = errorOutput.OutputColumnCollection.New
    errorColumn.Name = outColumn.Name
    errorColumn.SetDataTypeProperties(DataType.DT_I4, 0, 0, 0, 0)
    ...
End Sub

The confusions I have are:
a) the stock advanced properties editor (I haven't provided a custom one) doesn't seem to "realize" that I have an error output, so provides no method to configure. I am believing it would need to know which Output columns can have their error/truncation redirected. I'd have thought setting ErrorRowDisposition on my output column would have been enough to trigger this ??
b) since I don't have any means of configuring, not surprisingly, when I try to connect my error output, the designer complains that, "The error output cannot receive any error rows. This occurs for several reasons: Input columns or output columns are not yet defined. Error handling is not supported by the component. Error handling is not configured for the component."
c) UsesDispositions would seem to be appropriate only for a transform component

Thanks for reading, and appreciate any help or pointers.
Bill

View 5 Replies View Related

SSIS Custom Component, Output Buffer Problem

Mar 27, 2007

Hi Guys,

I created a SSIS custom component, transformation (Asynchronous) with one Input collection and 2 output collections.

The SSIS package which includes the component I created works well in the Business Intelligence Studio, but when the same package is run in the 'Execute Package Utility' (i.e., when you double-click the dtsx file), it fails to run.

The cause of the failure is that the

public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)

method receives only one output buffer when executed using the 'Execute Package Utility' { outputs = 1 , buffer.Length = 1 } ( when executed in the BI studio, the method receives parameters of both the output buffers that I expect { outputs = 2 , buffer.Length = 2 } )

The property ComponentMetaData.OutputCollection.Count = 2 as well. Yet the PrimeOutput method provides only 1 buffer.

The Validation Succeeds on both instances, which I assume means that Meta Data is Provided Properly.


What would be the reason for the same package to run in 2 different ways like this?

What might I have missed that makes the package run differently in the 'Business Intelligence Studio' and the 'Execute Package Utility'?

Thanks a lot



Below are some of the lines from the ProvideComponentProperties method which deal with the output collection. Isn't this sufficient for PrimeOutput to provide 2 output buffers?














public override void ProvideComponentProperties()
{
    RemoveAllInputsOutputsAndCustomProperties();
    base.RemoveAllInputsOutputsAndCustomProperties();
    base.ProvideComponentProperties();

    // other function calls

    IDTSOutput90 output1 = ComponentMetaData.OutputCollection[0];
    output1.Name = "Output1";
    output1.Description = ".......................";
    output1.SynchronousInputID = 0;

    IDTSOutput90 output2 = ComponentMetaData.OutputCollection.New();
    output2.Name = "Output2";
    output2.Description = "..........................";
    output2.SynchronousInputID = 0;

    // other function calls
}

View 3 Replies View Related






