Functions In A Transformation Of A Script Component In A Data Flow Task

Feb 19, 2008

Hello Helpers,

I need to know how to use my private function - created as a scalar-valued function in SQL Server 2005 - in a Script Component (used here as a transformation) in a Data Flow Task to transform a two-digit month into a three-character month abbreviation:

Example: '01' should be transformed into 'Jan'
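
A minimal sketch of the mapping logic, done directly in the pipeline rather than by calling the scalar-valued function row by row (the method name is a placeholder; an SSIS 2005 Script Component is written in VB.NET, so the equivalent there would be a Select Case):

string MonthAbbreviation(string twoDigitMonth)
{
    // "01" -> "Jan", "02" -> "Feb", ... ; anything unexpected returns null.
    string[] names = { "Jan", "Feb", "Mar", "Apr", "May", "Jun",
                       "Jul", "Aug", "Sep", "Oct", "Nov", "Dec" };
    int month;
    if (int.TryParse(twoDigitMonth, out month) && month >= 1 && month <= 12)
        return names[month - 1];
    return null;
}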

Many thanks for all your commitment and help!

Ulrike

View 4 Replies



Lookup Task Data Flow Transformation Causes Data Flow Task To Hang?

Dec 28, 2007

Hi,
I'm trying to implement an incremental data pull (Oracle to SQL) based on Andy's blog:
http://sqlblog.com/blogs/andy_leonard/archive/2007/07/09/ssis-design-pattern-incremental-loads.aspx

My development machine is decent: a 1.86 GHz Intel Core 2 CPU and 3 GB of RAM.
However it seems the data flow task gets hung whenever I test the package against the ~6 million row source, as can be seen from these screenshots. I have no memory limitations on the lookup transformation. After the rows have been cached nothing happens. Memory for the dtsdebug process hovers around 1.8 GB and it uses 1-6 percent of CPU resources continuously. I am not using fast load to insert new records into my sql target table. (I am right clicking Sequence Container 3 and executing this container NOT the entire package in the screenshots)

http://i248.photobucket.com/albums/gg168/boston_sql92/1.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/2.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/3.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/4.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/5.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/6.jpg


The same package works fine against a similar test table with 150k rows.
http://i248.photobucket.com/albums/gg168/boston_sql92/7.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/8.jpg

The weird thing is that a full refresh of the entire source table from Oracle to the SQL target table takes only 24 minutes.
Any hints or advice would be appreciated.

View 18 Replies View Related

Remark Some Task Or Data Transformation Component

Feb 14, 2007

How could I remark (comment out / disable) my Script Component? It is isolated; I just want to check something without losing the long script written inside.


Thanks,
Fahad

View 6 Replies View Related

The Component Could Not Be Added To The Data Flow Task

Feb 6, 2007

Dear Colleagues,

I'm trying to develop a custom Data Flow Transformation component in SSIS.

I compiled it without errors and installed it in the GAC and in the PipelineComponents folder; however, I always get the following message when I try to drag the component onto the designer surface:

The component could not be added to the Data Flow task.
Please verify that this component is properly installed.

------------------------------
ADDITIONAL INFORMATION:

The data flow object "RisikoKennzahlenKomponenten.MarktwertTransformation,
RisikoKennzahlenKomponenten, Version=1.0.0.0, Culture=neutral,
PublicKeyToken=cfa8722b8086ac2d" is not installed correctly on this computer.
(Microsoft.DataTransformationServices.Design)

Program Location:

at Microsoft.DataTransformationServices.Design.DtsBasePackageDesigner.GetPipelineInfo(String creationName, IServiceProvider serviceProvider)
at Microsoft.DataTransformationServices.Design.DesignUtils.GetNewPipelineComponentObjectName(IDTSComponentMetaDataCollection90 parentCollection, String clsid, IDTSComponentMetaData90 componentMetadata, PipelineComponentInfo& pipelineComponentInfo)
at Microsoft.DataTransformationServices.Design.PipelineTaskDesigner.AddNewComponent(String clsid, Boolean throwOnError)

This happens with EVERY custom component on my computer. The same components work fine on other machines.

Does anyone have an idea?

Regards

Arne Janning

View 1 Replies View Related

Is There A Way To Reference Global Variables In A Lookup Transformation That Are Set Outside A Data Flow Task?

Mar 31, 2008



The logic I am trying to recreate via SSIS is the following SQL statement:

insert into db3.dbo.targettable1 -- Target database table
(SiteC,
Objecte,
Attrib1)

select distinct ?,
?,
from ? -- Source database table
join dbo.targettable2 c1 -- Target database table
on c1.Alias = ? and
c1.CSetID = ? and
c1.FacID = (select f.PFacID
from dbo.Fac f
where f.FacID = ?)
where not exists (select * from dbo.targettable2 c -- Target database table
where c.Alias = ? and
c.FacID = ? and
c.CSetID = ?)


I have an OLE DB Source that consists of an expression to approximate the following portion of the Above Select statement:

Select ?,
from ? -- Source database table and

The package has 2 global variables, User::CSetID and User::FacID, whose scope is global to the package and whose values are set within a Foreach Loop Container outside of the Data Flow Task.

I was trying to reference the 2 global variables within the Lookup Transformation to recreate the following portion of the SQL statement, but I encounter errors:



join dbo.targettable2 c1 -- Target database table
on c1.Alias = ? and
c1.CSetID = ? and
c1.FacID = (select f.PFacID
from dbo.Fac f
where f.FacID = ?)


In the Advanced Editor window of the Lookup Transformation:

select * from
(select * from [dbo].[targettable2 ]) as refTable
where [refTable].[Alias] = ? and [refTable].[FacID] = ? and
[refTable].[CSetID] = ?



Is there a way to reference global variables in a Lookup Transformation that are set outside a Data Flow Task?

View 3 Replies View Related

Programmatically Adding A Script Component To Data Flow Task

Feb 2, 2007



Dear all,

I am developing tools for automatic creation of data warehouse tables, cubes and SSIS packages. Generating the SSIS Data Flows works very well using the SSIS components for OLE DB Source, Derived Column, Lookup and OLE DB Destination.

However for some of the advanced functionality I need to use Script Component. I have managed to add it in the Data Flow with all inputs and outputs, but how do I populate it with my code? I've seen there is a component property called "SourceCode" and one called "BinaryCode". The "SourceCode" contains the code, but also some extra metadata.

Questions:

Do you know if there is any programmatic support to generate the Source Code property with the metadata necessary?

Do you know how to compile the Source Code and generate the property BinaryCode?

Example from my code below:

// Create script component

IDTSComponentMetaData90 script = dataFlowTask.ComponentMetaDataCollection.New();

script.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;

CManagedComponentWrapper scriptWrapper = script.Instantiate();

script.InputCollection.New();

script.OutputCollection.New();

scriptWrapper.ProvideComponentProperties();

script.Name = "Logics";

// Create path

IDTSPath90 scriptPath = dataFlowTask.PathCollection.New();

scriptPath.AttachPathAndPropagateNotifications(lastComponent.OutputCollection[0], script.InputCollection[0]);

// Populate input and output columns

IDTSInput90 scriptInput = script.InputCollection[0];

IDTSVirtualInput90 scriptVInput = scriptInput.GetVirtualInput();

foreach (IDTSOutputColumn90 col in oledbSrc.OutputCollection[0].OutputColumnCollection)

{

scriptWrapper.SetUsageType(scriptInput.ID, scriptVInput, col.LineageID, DTSUsageType.UT_READONLY);

IDTSOutputColumn90 tmp = script.OutputCollection[0].OutputColumnCollection.New();

tmp.Name = col.Name;

tmp.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);

}

// Make script asynchronous

script.OutputCollection[0].SynchronousInputID = 0;
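
For the first question, a hedged sketch only: the component-level custom properties named above can be read from the metadata and written back through the design-time wrapper, but the exact shape of the value that "SourceCode" expects (source plus the extra metadata) is an assumption here and would need to be copied from a Script Component configured in the designer.

// Read the script-related custom properties from the component metadata ...
object sourceCode = script.CustomPropertyCollection["SourceCode"].Value;
object binaryCode = script.CustomPropertyCollection["BinaryCode"].Value;

// ... and write them back through the design-time interface. The values passed
// here are placeholders harvested from an existing, designer-configured component.
scriptWrapper.SetComponentProperty("SourceCode", sourceCode);
scriptWrapper.SetComponentProperty("BinaryCode", binaryCode);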

Thanks for any assistance and Best Regards,

Johan Åhlén,
Business Intelligence consultant at IFS

View 2 Replies View Related

Call FireQueryCancel() In A Script Component Within Data Flow Task?

Dec 18, 2007

I am trying to cleanly shutdown a dataflow task, which contains a script component, when RunningPackage.Stop() is called from the SSIS runtime.

I've been going in ever decreasing circles with no success - it looks like the cleanest way to find out whether RunningPackage.Stop() has been called is to call FireQueryCancel(). But I can't find any reference to anything useful in a dataflow task script component that gives me something that implements IDTSComponentEvents. The nearest thing seems to be Me.ComponentMetaData which gives a reference to IDTSComponentMetaData90, but this only has methods for calling FireError, FireInformation, FireProgress, FireWarning, and FireCustomEvent. But no FireQueryCancel.

Is there a way in a script component that I can find out the state of QueryCancel?

Any help would be apprecieated.

View 2 Replies View Related

Try Catch Doesn't Catch Errors Inside A Data Flow Transformation Script Component

Feb 15, 2007

Hi,

I'm having trouble with a Script Component in a data flow task. I have code that does a SqlCommand.ExecuteReader() call that throws an 'Object reference not set to an instance of an object' error. Thing is, the SqlCommand.ExecuteReader() call is already inside a Try..Catch block. Essentially I have two questions regarding this error:

a) Why doesn't my Catch block catch the exception?
b) I've made sure that my SqlCommand object and the SqlConnection property that it uses are properly instantiated, and the query is correct. Any ideas on why it is throwing that exception?

Hope someone could help.

View 3 Replies View Related

Reuse Existing Data Flow Components In A Custom Data Flow Component

Aug 29, 2007

Hello,

Is it possible to use existing data flow components (Merge Join, aggregation,...) in a custom data flow component?

Thanks,

Yoann

View 15 Replies View Related

CTE In OLE DB Command Data Flow Transformation

Dec 20, 2006

I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow item, but not in any data flow item.

View 7 Replies View Related

Transformation Component Data Store

Jun 8, 2006

I am developing a custom transform component, in which I am building a custom object and want it to be transferred from the ComponentUI to the component. I explored this issue and learned that we can make use of the SaveToXML and LoadXML methods of the IDTSPersist90 interface. The problem is that I have not been able to make use of this interface. If anybody has faced the same issue and found a solution, please let me know.

Thanks in advance

Karun

View 1 Replies View Related

Data Flow Transformation: Count And Sum The Column Value

Nov 8, 2007

Hi,

I have one data flow. The source is SQL Server and the destination is a flat file destination, with one Derived Column placed in between; this works fine. I would like to sum one column and count the total number of rows, and put the results into global variables. How can I achieve this?
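
Assuming the count is a row count, a hedged sketch of one common approach: a Script Component (transformation) placed after the Derived Column that accumulates totals as rows pass through and writes them to package variables in PostExecute. The column and variable names below are placeholders, and an SSIS 2005 script itself is VB.NET; a Row Count transformation alone would also cover the counting part.

private int rowCount = 0;
private decimal amountTotal = 0;

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    rowCount++;
    if (!Row.Amount_IsNull)
        amountTotal += Row.Amount;   // the column being summed
}

public override void PostExecute()
{
    base.PostExecute();
    // Variables listed under ReadWriteVariables can only be assigned here.
    Variables.RowCountTotal = rowCount;
    Variables.AmountTotal = amountTotal;
}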

Thanks,

View 7 Replies View Related

Using Expressions In Custom Data Flow Transformation

Sep 18, 2006

Hi,

I'm creating a custom data flow transformation in c#.

I would like to use expressions within this component in the same way as in the derived column component: specifying the expression as a custom property of an output column, then evaluating this expression for each row of the buffer and using this evaluated expression to populate my output column values.

So I've added a custom expression property on my output column, and set its expression type to CPET_NOTIFY:

IDTSCustomProperty90 exp = col.CustomPropertyCollection.New();

exp.Name = "Expression";

exp.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;

But in the ProcessInput method I can't get the evaluated expressions; when I use exp.Value I get my expression definition, not its evaluation.

Is there a way to get these evaluated expressions ?

Thanks,

Stéphane

View 6 Replies View Related

Custom Data Flow Transformation - Predefined Inputs

May 16, 2008

Hi,

I've created my own custom data flow transformation task (using C#) that will parse a full name and output the various name parts. In the ProvideComponentProperties method, I create 5 output columns (prefix, first, middle, last, and suffix). In the ProcessInput method, I parse the input and add the name parts to the buffer. The bad thing is that I'm making an assumption on the position of the Full Name input column within the buffer.

I would like the "user" to be able to map their "full name" input column to a known Full Name column so I don't have to make any assumptions. This is the first SSIS task I've tried to create and I haven't been able to find very many examples online.
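
A hedged sketch of removing the positional assumption: resolve the mapped full-name input column by name (or by a custom property the user sets) and cache its buffer index in PreExecute, the same lineage-ID lookup the pipeline API uses elsewhere; "FullName" below is only an assumed default name.

private int fullNameBufferIndex = -1;

public override void PreExecute()
{
    base.PreExecute();
    IDTSInput90 input = ComponentMetaData.InputCollection[0];
    foreach (IDTSInputColumn90 column in input.InputColumnCollection)
    {
        // Match on the column name (or read the expected name from a custom property).
        if (column.Name == "FullName")
            fullNameBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, column.LineageID);
    }
}

// ProcessInput can then call buffer.GetString(fullNameBufferIndex) no matter
// where the column sits in the buffer.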

Any help is greatly appreciated!

Thank you,

Marshall

View 1 Replies View Related

Simple IF Or CASE Statement In Data Flow Transformation

May 11, 2007

Using SSIS to import from Excel to SQL Server.

In Excel they are showing Sale as -ve quantity and purchase as +ve quantity.

The database has quantity always as +ve figure and a separate column "isPurchase" set to true or false depending on whether purchase or sale.



So I need the Derived Column to return a boolean (or int) depending on whether the quantity is positive or negative. I tried each of the following in the Expression, but all of them were invalid expressions.

CASE WHEN [TotalQuantity] > 0 THEN 0 ELSE 1 END

If [TotalQuantity] > 0 THEN 1 ELSE 0 END

IIF ([TotalQuantity] > 0, 1,0)



Can anyone help me with correct syntax, or correct Data Flow Transformation if Derived column is wrong.
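
If I remember the expression grammar correctly, SSIS expressions support neither CASE nor IF/IIF; the equivalent is the C-style conditional operator, so a Derived Column expression along these lines should work (swap the returned values to match the isPurchase convention, and ABS([TotalQuantity]) can be used to make the stored quantity positive):

[TotalQuantity] > 0 ? TRUE : FALSE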



Thanks



Richard

View 3 Replies View Related

Custom Data Flow Transformation Losing CustomProperties

May 12, 2006

I created a custom transform that has a custom interface and is a wizard that uses a web service. It creates custom properties and output columns on the fly. I set the dialog result to Ok and close at the end of the steps. The transform then has the custom fields and output columns I created in the wizard. I've verified this by right clicking on the transform and going to the advanced editor. If I then immediately run the package, the custom fields don't exist in the CustomPropertiesCollection. If I close the package and reopen it, the properties now are gone. If I then go through the wizard again, thus recreating the properties, they stay and don't disappear. The quickest way to get a working transform is to add it to my data flow then save, close and reopen the package and then go through the wizard. Just saving after I add the transform does not help.

Does anyone know what might be causing this very strange problem?

View 7 Replies View Related

Pass Data In Script Component When No Transformation Needed

Mar 26, 2007

Hi,
I have 56 fields coming into the input of a Script Component. The need for the Script Component was just to check whether one of those 56 columns has a valid date or not: if valid, it will parse it and put it in an output date column; if not, it will put in NULL.


The other 55 fields should simply be passed through. I don't really want to write code and define output columns for them. How do I do this?
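
A minimal sketch of the date check, assuming the column being validated is called RawDate and the added output column is ParsedDate (both placeholders). In a synchronous Script Component the other 55 input columns flow through to the output automatically, so no code or output-column definitions are needed for them; an SSIS 2005 script is VB.NET, but the logic is identical.

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    DateTime parsed;
    if (!Row.RawDate_IsNull && DateTime.TryParse(Row.RawDate, out parsed))
        Row.ParsedDate = parsed;          // valid date: pass the parsed value on
    else
        Row.ParsedDate_IsNull = true;     // otherwise output NULL
}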


Any input in this would be appreciated.


Thanks,

View 5 Replies View Related

Need Help With Script Component In Data Flow

Jul 26, 2007

I have a int value that I pulled from a table in a database and it is stored in a variable. I would like to increment this value in a script component and then insert it into a field in my ole db destination. Is there any example out there of using a script component to do something similar to this.

Do I want to select source, destination or transformation?

Can I/how do I access my variable from within the script component.
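
A hedged sketch of one way to wire this up: a transformation-type Script Component, with the package variable (User::NextId here, a placeholder name) listed under ReadOnlyVariables, copied once in PreExecute, incremented per row, and written to an added output column that feeds the OLE DB destination.

private int nextId;

public override void PreExecute()
{
    base.PreExecute();
    nextId = Variables.NextId;   // starting value pulled from the table earlier
}

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    Row.Id = nextId;   // "Id" is an output column added on the component
    nextId++;
}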

Any assistance or examples you can point me towards would be greatly appreciated.

View 3 Replies View Related

Xml Ssis Data Flow Component?

Jul 2, 2007

There is a table with a column that contains Xml documents. For each record from my Data Flow Source, I want to pass in the Xml document and the node to interrogate, and return the value contained in the node. Like the Crm component, this is probably one I will have to write from scratch in C#, but I would like to avoid having to create the custom component if it already exists in the public arena.
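
In case it helps while searching, a hedged sketch of the per-row logic such a component (or a plain Script Component) would need; the XPath and column handling are placeholders.

using System.Xml;

string GetNodeValue(string xmlDocument, string xpath)
{
    // Load the document stored in the column and return the text of the requested node.
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(xmlDocument);
    XmlNode node = doc.SelectSingleNode(xpath);
    return node == null ? null : node.InnerText;
}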



Does anyone know of any Xml Ssis Data Flow Components that are downloadable for free?

View 3 Replies View Related

How Can I Check Variables Coming From A Data Flow In Precedence Constraints And A Conditional Transformation?

May 1, 2008



Hi,

I have a variable, a data source, and a conditional transformation which checks count(*); if the count == 0, I connect a Script Component, change the variable to False (initially it is True), and write to a log file.

Then I check that variable in a precedence constraint in the control flow: if the variable == True, then success. BUT whenever I run the package my data flow turns green even when the count == 0 condition is not met, so my variable's value is "False". Actually, if the condition isn't met, my script shouldn't run at all.
Am I missing something???

View 7 Replies View Related

Simple Custom Data Flow Transformation Doesn't Produce Any Output

Aug 30, 2006

I've built a simple custom data flow transformation component following the Hands On Lab (http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en) and the Books Online (ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/adc70cc5-f79c-4bb6-8387-f0f2cdfaad11.htm and ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/b694d21f-9919-402d-9192-666c6449b0b7.htm).

All it is supposed to do is create an output column and set its value to the result of calling a web service method (the transformation is synchronous). Everything seems fine, but when I run the data flow task that contains it, it doesn't generate any output. The Visual Studio debugger displays it as yellow, with 1,385 rows going into it, but the data viewer attached to its output is empty. The output metadata looks just like I expect: all of my input columns plus the new column, correctly typed. No validation or run-time warnings or errors are reported.

I'll include the entire C# file below, which only overrides the ProvideComponentProperties, Validate, PreExecute, ProcessInput, and PostExecute methods of the parent PipelineComponent class.

Since this is effectively a specialization of the DerivedColumn transformation, could I inherit from the class that implements the DC component instead of PipelineComponent? How do I even find out what that class is?

Thanks! Here's the code:
using System;
// using System.Collections.Generic;
// using System.Text;

using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CustomComponents
{
[DtsPipelineComponent(DisplayName = "GID", ComponentType = ComponentType.Transform)]
public class GidComponent : PipelineComponent
{
/// <summary>
/// Column indexes for faster processing.
/// </summary>
private int[] inputColumnBufferIndex;
private int outputColumnBufferIndex;

/// <summary>
/// The GID web service.
/// </summary>
private GID.WS_PDF.PDFProcessService gidService = null;

/// <summary>
/// Called to initialize/reset the component.
/// </summary>
public override void ProvideComponentProperties()
{
base.ProvideComponentProperties();
// Remove any existing metadata:
base.RemoveAllInputsOutputsAndCustomProperties();
// Create the input and the output:
IDTSInput90 input = this.ComponentMetaData.InputCollection.New();
input.Name = "Input";
IDTSOutput90 output = this.ComponentMetaData.OutputCollection.New();
output.Name = "Output";
// The output is synchronous with the input:
output.SynchronousInputID = input.ID;
// Create the GID output column (16-character Unicode string):
IDTSOutputColumn90 outputColumn = output.OutputColumnCollection.New();
outputColumn.Name = "GID";
outputColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 16, 0, 0, 0);
}

/// <summary>
/// Only 1 input and 1 output with 1 column is supported.
/// </summary>
/// <returns>The validation status.</returns>
public override DTSValidationStatus Validate()
{
bool cancel = false;
DTSValidationStatus status = base.Validate();
if (status == DTSValidationStatus.VS_ISVALID)
{
// The input and output are created above and should be exactly as specified
// (unless someone manually edited the persisted XML):
if (ComponentMetaData.InputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component accepts 1 Input.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component provides 1 Output.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection[0].OutputColumnCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output must be 1 column.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
// And the output column should be a Unicode string:
else if ((ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].DataType != DataType.DT_WSTR) ||
(ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].Length != 16))
{
ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output column data type must be (DT_WSTR, 16).",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISBROKEN;
}
}
return status;
}

/// <summary>
/// Called before executing, to cache the buffer column indexes.
/// </summary>
public override void PreExecute()
{
base.PreExecute();
// Get the index of each input column in the buffer:
IDTSInput90 input = ComponentMetaData.InputCollection[0];
inputColumnBufferIndex = new int[input.InputColumnCollection.Count];
for (int col = 0; col < input.InputColumnCollection.Count; col++)
{
inputColumnBufferIndex[col] = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[col].LineageID);
}
// Get the index of the output column in the buffer:
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
outputColumnBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
// Get the GID web service:
gidService = new GID.WS_PDF.PDFProcessService();
}

/// <summary>
/// Called to process the buffer:
/// Get a new GID and save it in the output column.
/// </summary>
/// <param name="inputID"></param>
/// <param name="buffer"></param>
public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
if (! buffer.EndOfRowset)
{
try
{
while (buffer.NextRow())
{
// Set the output column value to a new GID:
buffer.SetString(outputColumnBufferIndex, gidService.getGID());
}
}
catch (System.Exception ex)
{
bool cancel = false;
ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
throw new Exception("Could not process input buffer.");
}
}
}

/// <summary>
/// Called after executing, to clean up.
/// </summary>
public override void PostExecute()
{
base.PostExecute();
// Resign from the GID service:
gidService = null;
}
}
}

View 1 Replies View Related

Load Fact Table, Very Slow With Lookup - Data Flow Transformation

Apr 9, 2008



Hello,


I have developed some packages to load data into "Fact" tables in the data warehouse.
Some packages are OK, others are not. What is the problem? Some packages load fact tables with lots of "Lookup - Data Flow Transformation" components in the data flow task (lookups against dimension tables), but they are very, very slow, far too slow to be chosen as a solution.

Do you have any other solutions to avoid using "Lookup - Data Flow Transformation"? Any other solution (SSIS, TSQL and so on....) is welcome to speed up the Fact table loading process.

Thanks in advance

View 7 Replies View Related

Do We Have To Always Use Slowly Changing Dimensions (SCD) Component In The Data Flow For The Loading Of Table Data?

Feb 28, 2008

Hi, all experts here,
Do we always have to use the SCD component when loading data into the data warehouse to handle changes to rows?
I am looking forward to hearing from you and thank you very much in advance for your help.
With best regards,

View 4 Replies View Related

Error: The Task With The Name Data Flow Task And The Creation Name DTS.Pipeline.1 Is Not Registered For Use On This Computer

May 4, 2006



Hi,

I am trying to create a simple BI Application for SSIS. In Visual Studio 2005 I just get a Data Flow Task from the toolbar and add it to the project. When I double click it I get the following error:

The task with the name "Data Flow Task" and the creation name "DTS.Pipeline.1" is not registered for use on this computer.

Then when I try to delete it it gives this other error:

Cannot remove the specified item because it was not found in the specified Collection.

I am creating this application in an administrator account in this computer, so I doubt the problem is related to permissions. I am running SQL Server 2005 and Visual Studio 2005 in WinXP Tablet PC Edition.

Any suggestions why this is happening and how to fix it?

View 17 Replies View Related

Compare Performance (Execute SQL Task Insert And Data Flow Task)

Mar 12, 2008



I am using SQL 2005 SSIS. I am joining several large tables and then moving the result into another table in the same database.

I would like know which method is faster:


Use Execute SQL Task to insert the result set to the target table

Use the Data Flow Task to insert the result set to the target table. (Use OLE DB source to execute SQL command and then use the SQL destination)
Could you tell me why one is slower than the other?

Thanks.

View 7 Replies View Related

Can A Result Set From SQL Script Task Be Used As A Source For Data Flow Task?

Oct 2, 2007

I have a stored procedure that is executed via a SQL script task and returns a full result set. I map this result set to a variable of Object type. Is there a way to use this variable as a data source in a subsequent data flow task?
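
A hedged sketch of the usual workaround: a Script Component configured as a Source, with the Object-typed variable listed under ReadOnlyVariables, shredded into the output buffer with an OleDbDataAdapter. The variable, output, and column names are placeholders, and an SSIS 2005 script is VB.NET.

using System.Data;
using System.Data.OleDb;

public override void CreateNewOutputRows()
{
    DataTable table = new DataTable();
    OleDbDataAdapter adapter = new OleDbDataAdapter();
    adapter.Fill(table, Variables.ResultSet);   // Variables.ResultSet holds the recordset object

    foreach (DataRow row in table.Rows)
    {
        Output0Buffer.AddRow();
        Output0Buffer.SomeColumn = row["SomeColumn"].ToString();
    }
}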

A.

View 14 Replies View Related

The Data Flow's Default Destination Component

Dec 10, 2007

Is there a default destination component used when a new data flow is created? The reason I ask is simply curiosity. I have an xml file with 2 pieces of data: item A and item B. A should simply get copied out of the file. B should undergo a quick transform. I set up an XML source such that two columns are mapped correctly to the XML source data of A and B. I set up my data transform task as well. So, if I leave those two components on the .dtsx page with no other components, then will there be a default data flow destination already created? ...OR, do you always have to have a destination component?

Thanks for the input. I am just curious.

View 4 Replies View Related

Using Variables In Data Flow Script Component

Jan 12, 2006

I have a package variable that I set via an ExecuteSQL task.  I want to reference it in a data flow script component.  In the Script component I enter the variable into the ReadOnlyVariables collection, then in the script I reference it as Me.Variables.var.  (E.G.  counter = Me.Variables.var)

I'm getting errors when the data flow starts:

Error: 0xC0047062 at Provider, Set Surrogate Key [4261]: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> Microsoft.SqlServer.Dts.Pipeline.ReadWriteVariablesNotAvailableException: The collection of variables locked for read and write access is not available outside of PostExecute.

I have no problem referencing other variables that I have in DerivedColumn transformations.  I've tried putting the variable in the ReadWriteVariables collection but I get the same error.  I don't understand why this is so difficult.  Please help.

View 6 Replies View Related

Passing Variables To Data Flow Component

Apr 27, 2006

Hi,

I've read the various posts and articles regarding this matter, but I seem to have problems getting to work:

In my control flow, I start by declaring a variable named "LastJobLedgerEntryID", to identify the records I need to add to the stage. From there I would like to use this variable in the source component in my dataflow, i.e.:

"SELECT [Entry No_],[Job No_],[Posting Date],[Document No_],[Type],[No_],[Description],[Quantity],[Direct Unit Cost],[Unit Cost],[Unit Price],[Chargeable],[Job Posting Group],[Global Dimension 1 Code],[Global Dimension 2 Code],[Work Type Code] FROM mytable WHERE [Entry No_] > " + @[User::LastJobLedgerEntryID]

But this fails. I should note that the variable LastJobLedgerEntryID is stored as an Int32, with a default value of 0.
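
One thing worth checking: the SSIS expression language does not implicitly convert an Int32 variable when it is concatenated to a string, so the variable normally has to be cast, along these lines (column list abbreviated):

"SELECT ... FROM mytable WHERE [Entry No_] > " + (DT_WSTR, 12)@[User::LastJobLedgerEntryID]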

Could someone please help me with this?

Thanks in advance!

View 5 Replies View Related

Data Flow Source Script Component

Dec 3, 2007

I'm wondering if it is possible to create a flat file source on the fly while bypassing the following step:

On the Connection Managers page, add or create the Flat File connection manager, using a descriptive name such as MyFlatFileSrcConnectionManager. Then close the Script Transformation Editor.

I want to create the connection totally in script, yet I'm having a hard time proving this out. Does anybody have any experience with this?
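
If the goal is simply to avoid configuring a Flat File connection manager at design time, one workaround (a sketch under that assumption, not the only answer) is to read the file directly inside a Script Component source; the path, delimiter, and output column names are placeholders.

using System.IO;

public override void CreateNewOutputRows()
{
    using (StreamReader reader = new StreamReader(@"C:\data\myfile.txt"))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            string[] fields = line.Split('|');
            Output0Buffer.AddRow();
            Output0Buffer.Column1 = fields[0];
            Output0Buffer.Column2 = fields[1];
        }
    }
}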

Ryan

View 4 Replies View Related

UnDoubleError Data Flow Component Problem

Oct 30, 2007



Hello-


I have an SSIS package which I've been using for nearly a year now. Basically the package is responsible for looping through a directory, and importing pipe delimited files into a database.

The issue I'm having is with the UnDoubleError data flow component. I've been using it to remove the qualifiers from the data being imported, in this case it happens to be double-quotes. ex: { " " } I have found that the component will insert a single double-quote { " } when it finds a null string, or two consecutive qualifiers for that matter. If there are two qualifiers with a space seperating them, then it will insert an empty string, or rather a string with one space in it, and the qualifiers will be removed.

The dilemma is that since the log files have some columns which are technically NULL (by that I mean columns where only two qualifiers exist, ex: { "" }), the UnDoubleError component inserts a single double-quote. I want to retain the NULL value where it exists, while still using the component to remove the qualifiers when there actually is data there.

Any suggestions?

Thanks for the help...

View 1 Replies View Related

Reference To Preceding Component From Custom Dataflow Transformation Component

Mar 30, 2006

I am writing a custom dataflow transformation component and I need to get the name of the preceding component.

I have been trying to find a way to get a reference to the Package object, MainPipe object or IDTSPath90 object (connecting to the IDTSInput90 of my component) from my component because I think from there I can get to the information I want.

Does anyone have any suggestions?

TIA . . . Ed

View 7 Replies View Related

Error Using Row Count Task In Data Flow Task

Dec 20, 2007

Hi,

I'm trying to get a record count out of a database using an OLE DB Source and a Row Count task, but keep getting an error. I set up a variable as int32 and select the variable name in the Row Count task, and when I go to the Input Columns tab to select a field to count, it gives me this error:

Error at Data Flow Task[Row Count[505]]: The component "Row Count" (505) has forbidden the requested use of the input column with lineage ID 32.

I don't even know what this means?

thanks,

View 4 Replies View Related






