Transformation Components And Moving Data

Apr 10, 2007

Hi,

I am having a simple difficulty with a transformation component that I have created.

I followed all of the relevant documentation to create the component, its UI and so on, and when I run a program that transfers data through my component (without it doing anything to the data) to a flat file destination, all I get in the flat file is ,,,,,,,,,,

Meaning that the data is not actually being passed through my component to the destination, but it does recognize the correct number of columns. I get this warning for each column: [DTS.Pipeline] Warning: The output column "PurchaseOrderDetailID" (20) on output "OLE DB Source Output" (11) and component "Source - PurchaseOrderDetail" (1) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.

To fix this issue, all I need to do is "check" the boxes in the advanced editor for my component. However, I want these boxes to be "checked" automatically.

My question is: when you "check" a box next to a column name in the advanced editor, what exactly does that do that allows you to transfer the data? What do I need to program in order to replicate that? I want it to happen automatically, so that by default all data in all columns is transferred through my component and out the other side. I just need to know how to do this in code, not how to replicate the UI.

My component, I thought, did that automatically, as I specified everything that I thought was required based on the documentation I read. Obviously, I am missing something. Here are the methods that I believe would be involved.

Please let me know what I am missing.

int[] inputColumnBufferIndexes; // ...
int[] outputColumnBufferIndexes; // ... used in PreExecute
PipelineBuffer outputBuffer; // used in ProcessInput

public override void OnInputPathAttached(int inputID)
{
    IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
    IDTSVirtualInput90 vInput = input.GetVirtualInput();

    foreach (IDTSVirtualInputColumn90 vCol in vInput.VirtualInputColumnCollection)
    {
        IDTSOutputColumn90 outCol = output.OutputColumnCollection.New();
        outCol.Name = vCol.Name;

        outCol.SetDataTypeProperties(vCol.DataType, vCol.Length, vCol.Precision, vCol.Scale, vCol.CodePage);
    }
}

public override void PreExecute()
{
    //base.PreExecute();

    IDTSInput90 input = ComponentMetaData.InputCollection[0];
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];

    inputColumnBufferIndexes = new int[input.InputColumnCollection.Count];
    outputColumnBufferIndexes = new int[output.OutputColumnCollection.Count];

    for (int x = 0; x < input.InputColumnCollection.Count; x++)
    {
        IDTSInputColumn90 column = input.InputColumnCollection[x];
        inputColumnBufferIndexes[x] = BufferManager.FindColumnByLineageID(input.Buffer, column.LineageID);
    }

    for (int x = 0; x < output.OutputColumnCollection.Count; x++)
    {
        IDTSOutputColumn90 column = output.OutputColumnCollection[x];
        outputColumnBufferIndexes[x] = BufferManager.FindColumnByLineageID(output.Buffer, column.LineageID);
    }
}

public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)
{
    //base.PrimeOutput(outputs, outputIDs, buffers);
    if (buffers.Length != 0)
    {
        outputBuffer = buffers[0];
    }
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    //base.ProcessInput(inputID, buffer);

    if (!buffer.EndOfRowset)
    {
        IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
        while (buffer.NextRow())
        {
            // TODO: Examine the columns in the current row.
            // Add a row to the output buffer.
            outputBuffer.AddRow();
            for (int x = 0; x < inputColumnBufferIndexes.Length; x++)
            {
                // Copy the data from the input buffer column to the output buffer column.
                outputBuffer[outputColumnBufferIndexes[x]] = buffer[inputColumnBufferIndexes[x]];
            }
        }
    }
    else
    {
        // EndOfRowset on the input buffer is true.
        // Set EndOfRowset on the output buffer.
        outputBuffer.SetEndOfRowset();
    }
}
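
For what it's worth, a rough, untested sketch of what "checking" a column in the advanced editor appears to do: it selects the virtual input column so that it becomes a real input column on the component at runtime. If OnInputPathAttached also selected every upstream column this way (the SetUsageType call is the assumption here), the columns would show up in PreExecute and the data should be copied through by default:

public override void OnInputPathAttached(int inputID)
{
    IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
    IDTSVirtualInput90 vInput = input.GetVirtualInput();

    foreach (IDTSVirtualInputColumn90 vCol in vInput.VirtualInputColumnCollection)
    {
        // Select the column for this component; this is what the checkbox in the
        // advanced editor seems to do. UT_READONLY is enough for a pass-through
        // copy; UT_READWRITE would allow modifying the value in place.
        vInput.SetUsageType(vCol.LineageID, DTSUsageType.UT_READONLY);

        // ...then create the matching output column, as in the code above.
    }
}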

View 6 Replies

Developing Custom Components By Extending The SSIS Stock Data Flow Components

Sep 7, 2006

Everything I've read says that custom data flow components are built by inheriting from the Microsoft.SqlServer.Dts.Pipeline.PipelineComponent class.

But the stock components such as the Derived Column data flow transformation must each be implemented by their own class. So how do I base my custom components on those classes? The documentation for the PipelineComponent class doesn't list any such subclasses.

View 1 Replies View Related

Using Composition To Create New Specialised Components From Multiple Sub-components?

Jun 30, 2006

Hi,

In another thread Jamie Thomson very informatively said "The components in SSIS
are deliberately atomic (i.e. they do something very specific) so that
its easy to put them together to build something greater than the sum
of the parts". Which does make a lot of sense. However, I've been finding that I end up having to create exactly the same "pattern" of combined transform components again and again in order to solve the same problem but in different dataflows (or even within the same dataflow). Cut-and-paste-tastic! In order to obtain real re-use, it seems to me like SSIS is crying out for an easy way to create new components by using composition - i.e. the ability to take commonly-used combinations of existing components and create new "super" components (without having to write Custom Transform Components in C#/VB.Net and handle everything in code).

Does anyone know if this sort of functionality is likely to make it into SSIS in the foreseeable future?

Regards,

Lawrie

View 6 Replies View Related

Data Flow Components

Nov 1, 2006

Does anyone know a good place to download (or even purchase) custom data flow components?

Some ideas I've had (and I'm sure I'm not the first):

Transformations:

Filter
Sometimes I just want to filter out certain rows.

Unicode Converter
Convert all unicode columns in place.

Sequence Column
Adds a column with a sequence of numbers. Or perhaps even random numbers.

GUID Column
Adds a column with a GUID.

Destinations:

Garbage
Would act as a visual indicator in the designer that these rows are being discarded. You could add a data viewer right before it too.

Log File
Good for redirected error rows, writes them to various log files.

Email
Along the same lines as Log File, but sends off all the rows in a single email. Great for day to day things. Nothing too critical.

View 1 Replies View Related

Grouping Data Flow Components

Nov 15, 2007

Hello everyone,

I am developing an SSIS solution where the Data Flow task extracts data from a source csv file, then performs several transformations on the source data and then starts inserting the cleansed data into several destination tables.

The Data Flow task is getting too large!

Question:
Is there a way (best practice) for grouping components in the Data Flow - similar to the Container concept in the Control Flow?

I know this question sounds like a luxury, but I really lose the overall picture when the Data Flow canvas gets too crowded.

Thank you in advance.
Samar

View 7 Replies View Related

Transfer Data To Excel 2007 By Using SQL Server Data Transformation Services

Jun 11, 2007

My vendor requires data to be sent in Excel format. Some of my tables have more than 65,536 rows, so I need to use Excel 2007 (max of 1,048,576). Right now my data sits in SQL 2000. I am using MS SQL Enterprise Manager 8.0 to prepare the data. Is there some kind of add-on or selection I am missing that would let DTS export from SQL to Excel 2007?

Thanks in advance.

View 3 Replies View Related

How To Disconnect Data Flow's Components Using SSIS API?

Sep 29, 2007

Hi,

I have an SSIS package which I would like to modify using the SSIS API. I need to put a new component between two existing data flow components. During this process I need to disconnect those two components using the SSIS API. How can I do that?

Thanks,
Rafal
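
In case it helps, here is a rough, untested sketch (assuming the SSIS 2005 pipeline wrapper API: MainPipe, IDTSPath90, and RemoveObjectByID on the path collection) of removing the existing path between two components and wiring a new component in between:

using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

static void InsertBetween(MainPipe pipe,
                          IDTSComponentMetaData90 upstream,
                          IDTSComponentMetaData90 downstream,
                          IDTSComponentMetaData90 newComponent)
{
    // Find and remove the path that currently connects upstream -> downstream.
    foreach (IDTSPath90 path in pipe.PathCollection)
    {
        if (path.StartPoint.Component.ID == upstream.ID &&
            path.EndPoint.Component.ID == downstream.ID)
        {
            pipe.PathCollection.RemoveObjectByID(path.ID);
            break;
        }
    }

    // Reconnect: upstream -> new component -> downstream.
    IDTSPath90 pathIn = pipe.PathCollection.New();
    pathIn.AttachPathAndPropagateNotifications(upstream.OutputCollection[0], newComponent.InputCollection[0]);

    IDTSPath90 pathOut = pipe.PathCollection.New();
    pathOut.AttachPathAndPropagateNotifications(newComponent.OutputCollection[0], downstream.InputCollection[0]);
}

After re-wiring, the downstream component's input column mappings may need to be refreshed (for example via its design-time instance and ReinitializeMetaData), since its input is now fed by a different output.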

View 1 Replies View Related

Oracle Data Access Components (ODAC) For Windows

Mar 19, 2008


Can I create an ODP.NET connection in my SSIS connection manager? I have downloaded and installed ODP.NET, provided by Oracle, on my server. The idea is that I need to test this provider and see what difference it makes in connecting to the Oracle database and in data load speed.

thank you

View 4 Replies View Related

Data Conversion Components & Code Page Issue

Feb 17, 2006

I am using the SSIS wizard to pull data from DB2 z/OS to SQL Server. The data flow task that is created converts the data to DT_STR, ANSI 1252, before storing it in the SQL Server database. The package is blowing up in the data conversion component with "no match found in target code page" for my city name field.

My old DTS wizard didn't have this problem. The forums seem to indicate that SSIS no longer does some of the implicit conversions that DTS did, and I may have to do more than one conversion.

What format type/code page do I use for the other conversion? The code page for my DB2 data source is 37.

I've tried several scenarios and none of them have worked.  Any hints?

View 4 Replies View Related

Does A Synchronous Transformation Process All Rows In A Buffer Before Outputting To Next Transformation?

Jun 5, 2006

Hi,

If you have two synchronous transformation components and the input of the second is connected to the output of the first, does the first transformation process (loop through) all rows in the buffer before outputting these rows to the second transformation? Or does the first transformation output each individual row to the second transformation as soon as it has finished processing it?

Thanks in advance,
Lawrie.

View 5 Replies View Related

Integration Services :: Difference Between Audit Transformation And Row-count Transformation?

Apr 22, 2015

Can you tell me the difference between the Audit transformation and the Row Count transformation?

I ask because both the Audit and Row Count transformations provide environment variables.

The only difference I am finding is that Row Count returns the count of the rows it is processing.

Apart from this, is there any other difference?

Also, tell me a scenario where I would need to use the Audit transformation.

View 3 Replies View Related

Integration Services :: Use CozyRoc Components In Server Data Tools

Jul 7, 2015

We've been trying to use CozyRoc components in SQL Server Data Tools. It seems the way to add CozyRoc components has changed for SSDT, as we're not getting the Choose Items option in the Toolbox (right-click on the Toolbox). Is there a link or video showing how CozyRoc components are added in SSDT?

View 2 Replies View Related

Newbie Questions About SSIS Script Components And Data Streams

May 10, 2007

The following is a list of questions to which I have not been able to obtain concrete answers. I am probably missing something:
1) ReadWriteVariables -- can the updated value of a ReadWriteVariable be accessed within the same data flow? It appears not, as I think PostExecute() fires at the completion of the data flow, not at the end of the Script Component. Secondly, the Script Component is a non-blocking transformation, so the component does not "see" the end of the pipeline prior to sending data downstream.

2) Record Count -- because of #1 above, how could you calculate a record count for a data stream? It does not appear that one can calculate the number of records for a data stream within a data flow and then access the count from within the same data flow.

3) FinishOutputs() -- is the concept of FinishOutputs() applicable to Script Component destinations? Asked another way, is FinishOutputs() executed at the end of the data stream regardless of whether there are "real" outputs for the component? I can create a "dummy" output to get FinishOutputs(), but is this OK?

4) Script Component -- it appears that a Script Component source, transformation, or destination is really defined based on the columns defined in "Inputs and Outputs". Can you convert a source script component to a transformation script component by simply adding an output?

Sorry for these basic questions, but I am not getting it completely. As you can tell...

View 12 Replies View Related

Custom Data Flow Components, DllImport, And Third Party Dlls

Mar 29, 2007

I've been having an issue with the integration of a third-party DLL into a custom data flow component.

The company sent me a C# project that generates a simple console application. The project includes a class that calls their DLL with DllImport. The console application runs fine.

I created a stand-alone class using the C# class they sent to expose the methods of their DLL. In my custom component, I'm referencing this class to pass data to and from their DLL.

The first method from that stand-alone class that my component encounters simply gets their installation path from the registry and does not use DllImport. That path retrieval works fine. The next method calls a function that is declared with DllImport. Each time the call fails with "System.DllNotFoundException = {"Unable to load DLL 'AMZip.dll': Exception from HRESULT: 0xE06D7363"}".

I've copied this DLL to countless locations (e.g., the PipelineComponents directory, the project/solution bin directory) and included these paths in all manner of path variables.

What am I missing here? Their DLL is not strong named (does this matter since I'm using DllImport?), my stand-alone class is, and of course, the custom component itself is. I appreciate the help.
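
If it helps anyone later, a minimal, untested sketch of the pattern involved (the vendor entry point below is made up; only SetDllDirectory is a real Win32 API). The native DLL has to be resolvable by whichever process hosts the pipeline (BIDS or DTExec), so extending that process's native search path before the first P/Invoke call is one thing to try:

using System.Runtime.InteropServices;

internal static class NativeMethods
{
    // Hypothetical import mirroring the vendor's declaration; the real entry point
    // and signature come from the vendor's C# wrapper class.
    [DllImport("AMZip.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern int SomeVendorFunction(string archivePath);

    // Adds a directory to the native DLL search path of the current (hosting)
    // process -- that is BIDS or DTExec, not the custom component's own DLL.
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    public static extern bool SetDllDirectory(string path);
}

// Before the first call into the vendor DLL (e.g., in AcquireConnections or PreExecute):
// NativeMethods.SetDllDirectory(@"C:\Program Files\Vendor");       // hypothetical install path
// int result = NativeMethods.SomeVendorFunction(@"C:\temp\a.zip"); // hypothetical call

Also worth noting (hedged): 0xE06D7363 is usually the code for a native C++ exception, so it is possible the DLL is found but throws during initialization, for example because of a missing dependency or license file.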

View 9 Replies View Related

Reuse Existing Data Flow Components In A Custom Data Flow Component

Aug 29, 2007

Hello,

Is it possible to use existing data flow components (Merge Join, aggregation,...) in a custom data flow component?

Thanks,

Yoann

View 15 Replies View Related

Programmatically Iterating Tasks/components In The Data Flow Portion Of A Package.

Mar 6, 2007

HI All,

In several threads there has been discussion regarding adding connection managers to a package's data flow, etc. My challenge is that I have a large solution that contains many packages, and I need to change the connection manager linked to the data flow in all of the packages. When the solution was initially designed, data sources were used, and it has become a tedious maintenance issue to keep those in sync. We want to use a standard OLE DB connection manager, but adding a connection manager to each package and editing the corresponding data flow tasks in each package to use that new connection manager is a daunting task.

I've coded a .NET module to access the packages, remove the old connection manager (data source) and add the new OLE DB data source. However, as I traverse the objects in the package hierarchy, when I come to the data flow object, the InnerObject is not a DTS object, but rather a COM object. I can't seem to find any documentation/examples as to how to iterate the tasks within a data flow and change the connection manager. If you have any information, that would be quite helpful. If you reply with a code sample, if you would be so kind as to relate it to one of the sample packages provided with SSIS so I can run it, that would be great.

Thank you.

Steve.
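
For reference, a rough, untested sketch along the lines requested, assuming the SSIS 2005 APIs (TaskHost.InnerObject cast to the MainPipe COM interface, and DtsConvert.ToConnectionManager90 to assign the new connection manager). It only walks the package's top-level executables, so data flows nested inside containers would need a recursive version, and the class/method names are placeholders:

using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline;          // DtsConvert
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;  // MainPipe, IDTSComponentMetaData90

public static class ConnectionRepointer
{
    public static void RepointDataFlows(Package package, ConnectionManager newConnectionManager)
    {
        foreach (Executable executable in package.Executables)
        {
            TaskHost taskHost = executable as TaskHost;
            if (taskHost == null)
                continue;

            // A data flow task exposes its pipeline as a COM object (MainPipe).
            MainPipe pipe = taskHost.InnerObject as MainPipe;
            if (pipe == null)
                continue;

            foreach (IDTSComponentMetaData90 component in pipe.ComponentMetaDataCollection)
            {
                foreach (IDTSRuntimeConnection90 runtimeConnection in component.RuntimeConnectionCollection)
                {
                    // Point the component's runtime connection at the new connection manager.
                    runtimeConnection.ConnectionManagerID = newConnectionManager.ID;
                    runtimeConnection.ConnectionManager = DtsConvert.ToConnectionManager90(newConnectionManager);
                }
            }
        }
    }
}

This repoints every component that uses a connection; in practice you would probably filter on component name, or check which connection each runtime connection currently references, before overwriting it.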

View 1 Replies View Related

Data Flow Contains No Components After Package Save Operation And Reopening Solution

Jul 25, 2007

I have a Data Flow task that contains 50 components.

My computer configuration: 1 GB RAM, Microsoft Windows Server 2003.

Periodically, when I try to save the package after making some changes, an "Out of memory..." exception message box appears, and soon after that a "Not fatal error occurs..." message box shows. If I close the solution and open it again, all my 50 components disappear -- instead I see an empty list, and all my work is lost.

Such "not fatal errors" are making a hell out of the job -- every time I need to change the package I must add it to an archive!!!

View 4 Replies View Related

Data Transformation

Sep 14, 2000

We are transferring data between AS/400 and SQL Server 7.0 using DTS. Some of these transfers may need to be very close to real time. It doesn't seem like a continuously running job is the best solution for that.

Do you know any tools or utilities that can help us to move the data?

Thank you,
Anastasia.

View 2 Replies View Related

Data Transformation

Feb 19, 2003

I have something like this:

select * from accounts

name type amount
==== ==== ======
mary saving 123.00
mary chequing 246.00
mary investment 135.00
john saving 678.00
john chequing 987.00
john investment 0.00

What should I do to present the data in the following format?

name saving cheq investment
==== ====== ==== ==========
mary 123.00 246.00 135.00
john 678.00 987.00 0.00


Thanks.

View 3 Replies View Related

Data Transformation

Jan 12, 2008

Hi, newbie here with a simple (maybe?) question.

I have an Access database that I have imported into SQL Server 2000, and that worked great, but now I have to get it into 2005. My question is: how can I get the tables and all the info in the tables into a SQL script so I can run that script on the 2005 server?

SQL 2000 is on my dev server and I have all the tools (Ent Manager, Query Analyzer, etc.), but the 2005 server is GoDaddy's and they only have the basic web interface. I can run SQL files and create databases and tables, but that's about it.

View 2 Replies View Related

Data Transformation Services

Jul 18, 2007

Hi. I was told that using DTS will allow me to schedule stored procedures to keep a SQL database up to date. For example, if a user registers but does not activate the registration, his details will be removed by a stored procedure which is scheduled to run every 24 hours. I used to use the global.asax file to fire an update by using a file containing the date of the last update; by adding 24 hours to it, it would execute a SP to delete unwanted data.

I have tried to install DTS with no success. I am running the following:

Visual Web Studio Express
SQL 2005 Express (from SQLExpr_exe), and I have told it to install all the extra components
SQLEXPR_Toolkit.exe installed with all its options
SQLServer2005_DTS.MSI installed

When I go into the SQL server using MS SQL Server Management Studio Express, I cannot see the Data Transformation Services node. I have also just installed server reports, which I had no problems installing.

Can somebody please help me.

View 2 Replies View Related

SSIS Data Transformation

Jan 31, 2008

I have begun using SSIS and I am a little taken aback by its complexity, especially since I just want to do a simple data transformation such as in DTS.
Are there any tutorials on data transformation with SSIS on the web or this forum? And what if I want to do a simple transformation from Access to SQL Server?

View 1 Replies View Related

DTS Error During Data Transformation

Nov 8, 2000

I tried transforming data from one server to another using DTS, and then I got an error as below:
-----------------------------------------------------------------------------
Details: Error Source: Pump Data Step
Details: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL server unless your destination server is per user licensing mode.

Facility: 4, Severity 1,Code: 1176, HRESULT:0x80040498

Desc: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL server unless your destination server is per user licensing mode.

Source : Microsoft Data Transformation Services(DTS) Package

Source : Microsoft Data Transformation Services(DTS) Package
Code: 0x80040428
Description: Package failed because step'Pump data step' failed.
Error Message: IDispatch error #552
--------------------------------------------------------------------------

This is the full description of the error dialog...

Please suggest some solutions.

Thanks in advance,
Kamalesh D

View 2 Replies View Related

Data Transformation - Hanging

Jul 20, 2005

I'm running a DTS package on SQL Server. The source is MS Access and the target is Oracle.

On a "Drop Table" command the process just hangs. There are no foreign keys on the table. Several tables have already been processed successfully by this time.

I think I've ruled out corruption by dropping and recreating the target database on Oracle.

Any ideas?

M Man

View 1 Replies View Related

Lookup Transformation (Can It Be Using The Old Data?)

Mar 31, 2008



I have a lookup transformation that retrieves a key for a certain column of values, in this case, a name. So I go into the lookup table with a name and come out with its key. I had it working, and then I added new entries to the lookup table for a bunch of new names. Now, for some reason, I am not getting matches for the new names. But I am still getting matches for the names that existed before I added the new ones.

I'm wondering if the lookup transformation is using the old set of data and somehow not picking up the new names. Do I have to trigger something in the lookup transformation to let it know that the lookup table data has changed?

View 4 Replies View Related

Data Transformation Question

Sep 6, 2006

I have a student table that needs some cleanup. My first task is to remove all the periods (.) from the middle name column. Some people have two-character middle initials with two periods. Can this be done via an SSIS package (I'm sure it can, but I don't know how)? I can't think of a simple update statement that would accomplish the same thing.

Any direction would be appreciated.

View 3 Replies View Related

What Strategy Exists To Deploy An SSIS Package And My Own Data Flow Components Onto An Enterprise Server?

Mar 29, 2007



I created an SSIS package and several data flow components for this package.

What strategy exists to deploy the SSIS package and its data flow components onto an enterprise server?

Thanks in advance.

View 2 Replies View Related

SSIS Package Hangs In Data Flow, Magically Works After Opening And Closing Components

Nov 2, 2006

We're experiencing a problem where intermittently our SSIS packages will hang. There are no log errors or events in the event viewer. It will happen whether the package is executed from the SQL Job Agent or run from BIDS. When running from BIDS it appears to hang inside one of the data flows (several parallel pipes with sorts, merge joins, etc.). It appears to hang in multiple pipes within the data flow component. The problem is reproducible: we just kill it and re-run, and it appears to hang in the same places.

Now here's the odd thing: as we simply open and close some of the components in the pipe line after the place it hangs, a subsequent run will go further in the pipeline before hanging. If we open and close all the components after the point it initially hung, the data flow will run fine, from there on out. When I say "open and close" I mean no changes are made, we simply double-click the component, like a merge join, then click 'close.'

To me this does not seem like a memory problem but likely something is wrong with the metadata, where opening a component and closing it somehow alters the metadata to "right it".

This seems to occur intermittently after we make modifications to the package. It's like if you make any mod, even unrelated to the data flow, you then have to go through and open and close every component in your package to ensure it will work. Again, no errors or warnings are fired.

Has anyone seen this type of problem?

View 10 Replies View Related

Data Transformation Services Question.

Oct 23, 2002

I am creating a DTS package for an import of data from MSAccess97 to SQLServer2000. I am quite new to the DTS so bear with me please.

Everything was fairly simple until I got stumped by the following problem:

In the source database there was a table with multiple fields (let's say A, B, C) for each record containing their values (let's say 1,2,3).

Now, in the destination database the table is built differently. It is a table containing the field data and definition.

So basically I need to turn this

ID A B C
0 1 2 3
1 4 5 6


into this

ID FieldName FieldVal
0 A 1
0 B 2
0 C 3
1 A 4
1 B 5
1 C 6


I am not sure how to go about that... any thoughts? :confused:

View 5 Replies View Related

Transformation Component Data Store

Jun 8, 2006

I am developing a custom transformation component in which I am building a custom object and want it to be transferred from the component UI to the component. I explored this issue and learned that we can make use of the SaveToXML and LoadXML methods of the IDTSPersist90 interface. The problem is that I have not been able to make use of this interface. If anybody has faced the same issue and found a solution, please let me know.

Thanks in advance

Karun

View 1 Replies View Related

CTE In OLE DB Command Data Flow Transformation

Dec 20, 2006

I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow item, but not in any data flow item.

View 7 Replies View Related

Data Transformation Project Template

Aug 8, 2006

Does anyone know how to install or make available the Data Transformation Project Template in SQL Server 2005? I can not find it using Integration Services.

View 1 Replies View Related

SSIS Data Transformation Using Look Up Or Scripting???

Mar 6, 2008

Hi all,

I've got to change values in my source database as follows:

Source: Target:

X 1
Y 1
Z 2

Can I create a lookup table and use a Lookup task in SSIS to do this, or do I need to script it?

Thanks

F

View 1 Replies View Related
