Is There A Way To Reference Global Variables In A Lookup Transformation That Are Set Outside A Data Flow Task?

Mar 31, 2008



The logic I am trying to recreate via SSIS is the following SQL statement:

insert into db3.dbo.targettable1 -- Target database table
(SiteC,
Objecte,
Attrib1)

select distinct ?,
?,
from ? -- Source database table
join dbo.targettable2 c1 -- Target database table
on c1.Alias = ? and
c1.CSetID = ? and
c1.FacID = (select f.PFacID
from dbo.Fac f
where f.FacID = ?)
where not exists (select * from dbo.targettable2 c -- Target database table
where c.Alias = ? and
c.FacID = ? and
c.CSetID = ?)


I have an OLE DB Source whose SQL command approximates the following portion of the above SELECT statement:

select ?,
from ? -- Source database table

The package has 2 global variables, User::CSetID and User::FacID, whose scope is global to the package and whose values are set within a Foreach Loop Container outside of the Data Flow Task.

I was trying to reference the 2 global variables within the Lookup Transformation to recreate the following portion of the SQL statement, but I encounter errors:



join dbo.targettable2 c1 -- Target database table
on c1.Alias = ? and
c1.CSetID = ? and
c1.FacID = (select f.PFacID
from dbo.Fac f
where f.FacID = ?)


In the Advanced Editor window of the Lookup Transformation, the SQL is:

select * from
(select * from [dbo].[targettable2 ]) as refTable
where [refTable].[Alias] = ? and [refTable].[FacID] = ? and
[refTable].[CSetID] = ?



Is there a way to reference global variables in a Lookup Transformation that are set outside a Data Flow Task?
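In the 2005 Lookup, the parameters on the Advanced tab can only be mapped to input columns, not to variables, so there are two common workarounds: add the two variable values to the pipeline with a Derived Column transformation and map those columns in the lookup, or pre-filter the reference query with a property expression on the Data Flow Task. A sketch of the second approach, written as an expression on the [Lookup Transformation].[SqlCommand] property (the cast lengths are assumptions); the lookup then only needs to join on Alias:

"select [Alias], [FacID], [CSetID] from [dbo].[targettable2] where [CSetID] = " + (DT_WSTR, 12)@[User::CSetID] + " and [FacID] = " + (DT_WSTR, 12)@[User::FacID]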

View 3 Replies



Lookup Task Data Flow Transformation Causes Data Flow Task To Hang?

Dec 28, 2007

Hi,
I'm trying to implement an incremental data pull (Oracle to SQL) based on Andy's blog:
http://sqlblog.com/blogs/andy_leonard/archive/2007/07/09/ssis-design-pattern-incremental-loads.aspx

My development machine is decent: 1.86 GHz, Intel core 2 CPU, 3 GB of RAM.
However, the data flow task seems to hang whenever I test the package against the ~6 million row source, as can be seen from these screenshots. I have no memory limitations on the lookup transformation. After the rows have been cached, nothing happens. Memory for the dtsdebug process hovers around 1.8 GB and it uses 1-6 percent of CPU continuously. I am not using fast load to insert new records into my SQL target table. (In the screenshots I am right-clicking Sequence Container 3 and executing that container, NOT the entire package.)

http://i248.photobucket.com/albums/gg168/boston_sql92/1.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/2.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/3.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/4.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/5.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/6.jpg


The same package works fine against a similar test table with 150k rows.
http://i248.photobucket.com/albums/gg168/boston_sql92/7.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/8.jpg

The weird thing is it only takes 24 minutes for a full refresh of the entire source table from Oracle to the SQL target table.
Any hints or advice would be appreciated.
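When a full-cache lookup against a multi-million-row reference table stalls like this, the usual first step is to replace the table reference with a query that returns only the columns the lookup actually joins on and retrieves, which can shrink the cache dramatically. A sketch (the column names are assumptions):

-- reference query for the Lookup: just the join key and the value being retrieved
select BusinessKey, SurrogateKey
from dbo.TargetTable;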

View 18 Replies View Related

Load Fact Table, Very Slow With Lookup - Data Flow Transformation

Apr 9, 2008



Hello,


I have developed some packages to load data into "Fact" tables in the data warehouse.
Some packages are OK, others are not. The problem: some packages load fact tables with lots of "Lookup - Data Flow Transformation" components in the data flow task (lookups against dimension tables), but they are very slow - far too slow to be chosen as a solution.

Do you have any other solutions that avoid the "Lookup - Data Flow Transformation"? Any other solution (SSIS, T-SQL and so on...) that speeds up the fact table loading process is welcome.

Thanks in advance
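One widely used alternative is to land the source rows in a staging table and resolve the dimension surrogate keys with ordinary set-based joins in T-SQL, instead of one row-by-row Lookup per dimension. A sketch only; all table and column names are assumptions:

insert into dbo.FactSales (DateKey, ProductKey, SalesAmount)
select d.DateKey, p.ProductKey, s.SalesAmount
from dbo.StagingSales s
join dbo.DimDate d on d.FullDate = s.SaleDate
join dbo.DimProduct p on p.ProductCode = s.ProductCode;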

View 7 Replies View Related

How Can I Check Variables Coming From A Data Flow In Precedence Constraints And A Conditional Transformation?

May 1, 2008



Hi,

I have a variable, a data source, and a conditional transformation which checks count(*); if the count == 0, I connect a script component, change the variable to False (initially it is True), and write into a log file...

Then I check that variable in a precedence constraint in the control flow: if variable == True then success. BUT
whenever I run the package my data flow turns green even when the condition (count == 0) is not met, so
my variable's value is "False". Actually, if the condition isn't met then my script shouldn't run.
Am I missing something?
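One thing to keep in mind: a variable written by a script component is only committed back to the package when the whole Data Flow Task finishes, and the data flow itself will still go green as long as no component fails, so the decision has to be made on the precedence constraint that follows the data flow. A sketch of that constraint (the variable name is an assumption):

Evaluation operation: Expression and Constraint
Value: Success
Expression: @[User::MyFlag] == TRUE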

View 7 Replies View Related

Change Self Reference Values In A Data Flow Task

May 9, 2006

Hi,

We migrate data from a legacy system to a new system using SSIS. The primary key of the legacy system is a user-defined SQL Server type which holds alphanumeric values. The primary key of the new system is a bigint (sequential numbers).

When we migrate data, we generate a sequential number for each legacy key (the primary key of the legacy data) and insert the data into the new system tables. The newly generated sequential numbers and the legacy keys are persisted in an intermediate table for lookup operations on child tables.

We are facing a problem when we try to migrate tables which have self-referencing columns. For example, a table called Employee has a column ManagerKey which refers to the Key column of the Employee table. We are stuck on defining data flow tasks to replace the legacy ManagerKey column values with the new values (sequential values) generated during the migration process.

Please help me to solve this problem.

Regards,

Gopi
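One common pattern is to load the Employee rows with ManagerKey deferred (left NULL, or carried as the legacy value in an extra column), and then fix it up with a set-based Execute SQL Task once all rows and the key map exist. A sketch, assuming an intermediate table dbo.KeyMap(LegacyKey, NewKey) and a temporary LegacyManagerKey column on the new Employee table (both names are assumptions):

-- second pass: translate the legacy manager key through the key map
update e
set e.ManagerKey = km.NewKey
from dbo.Employee e
join dbo.KeyMap km on km.LegacyKey = e.LegacyManagerKey;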









View 3 Replies View Related

Functions In A Transformation Of A Script Component In A Data Flow Task

Feb 19, 2008

Hello Helpers,

I need to know how to use my private function - created as a scalar-valued function in SQL Server 2005 - in a script component (used here as a transformation) in a data flow task, to transform a two-digit month into a three-character month abbreviation:

Example: '01' should be transformed into 'Jan'

Many thanks for all your commitment and help!

Ulrike
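If the conversion can happen before the rows enter the data flow, the simplest route is to call the scalar function in the OLE DB Source query rather than from the script component, since a script component cannot call a T-SQL function without opening its own connection. A sketch; the function and column names are assumptions:

select OrderID,
       dbo.ufn_MonthAbbrev(OrderMonth) as OrderMonthName  -- e.g. '01' -> 'Jan'
from dbo.SourceTable;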

View 4 Replies View Related

Adding Lookup Programmatically: How Can I Add Column From Reference Dataset To The Transformation?

Jun 19, 2007


Hello,
I have created an SSIS package programmatically and I want to add a Lookup transformation.
How can I add a column from the reference dataset to the transformation?
I have tried to add a new output column, but it gives me a validation error. I wrote the following code to add a new output column to the lookup.
IDTSOutputColumn90 outputColumn = this.lookup.OutputCollection[0].OutputColumnCollection.New();
outputColumn.Name = col.Name;
outputColumn.Description = "Staging table output";
outputColumn.TruncationRowDisposition = DTSRowDisposition.RD_FailComponent;
outputColumn.ErrorOrTruncationOperation = "Copy Column";
outputColumn.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);

Please suggest another way to add a column from the reference dataset to the transformation output.
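For what it's worth, the approach that has worked for others is to create the column through the Lookup's design-time interface and then point it at the reference column with the CopyFromReferenceColumn custom property, rather than adding a bare output column to the metadata. A sketch only; it assumes this.lookup is the component's IDTSComponentMetaData90, the connection has already been acquired, and col describes the reference column:

CManagedComponentWrapper inst = this.lookup.Instantiate();
IDTSOutput90 output = this.lookup.OutputCollection[0];
// Create the output column through the design-time interface.
IDTSOutputColumn90 outputColumn = inst.InsertOutputColumnAt(output.ID, 0, col.Name, "Staging table output");
outputColumn.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);
// Tell the Lookup which reference-table column this output column copies from.
inst.SetOutputColumnProperty(output.ID, outputColumn.ID, "CopyFromReferenceColumn", col.Name);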

View 10 Replies View Related

How To Reference Dataflow Elements In A SQL Statement Embedded In A Lookup Transformation?

May 8, 2008



Hi guys,

I need to use a SQL statement to lookup a value from a SQL server database table that relates to a column in my dataflow.

Imagine a SQL database table called 'cars' with values of

Year | Description
2005 Ferrari 355
2005 Ferrari 355 Spider
2006 Ferrari 355 F1

In my data flow I have Year and Model eg.

Year | Model
2005 355
2006 355


In my SQL statement I want to select from the 'cars' table where the years match exactly but the 'description' is like the 'Model'. eg. %355%

So essentially, how do I construct the 'like' clause in the select statement to reference the 'Model' column please?

thanks for your help,

Chris
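If the lookup is put into partial cache mode (enable memory restriction on the Advanced tab), the "Modify the SQL statement" option becomes available and the LIKE can be built around the parameters, with Year and Model mapped to the two ?s via the Parameters button. A sketch, assuming Model reaches the lookup as a string (cast it in a Derived Column first if it is numeric):

select * from
(select * from [dbo].[cars]) as refTable
where [refTable].[Year] = ? and [refTable].[Description] like '%' + ? + '%'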

View 4 Replies View Related

Integration Services :: SSIS - Can Use Variables In A Data Flow Task Command

Jul 2, 2010

In my SSIS Data Flow Task, I have a query that retrieves data based on a couple of date parameters. Is there a way we can pass/use the variables defined in the SSIS package in the query?

(I am assigning values to those variables from C# code)

The query should look like this:

select ordernumber, customerid from salesorder

where statecode=3 and datefulfilled between @variable1 and @variable2
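Yes - with the OLE DB Source in "SQL command" mode, the dates can be written as ? placeholders and mapped to the package variables through the Parameters button (they bind by position). A sketch of the parameterized form of the same query:

select ordernumber, customerid
from salesorder
where statecode = 3
  and datefulfilled between ? and ?

The alternative is to build the whole statement in a string variable and use the "SQL command from variable" access mode.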

View 8 Replies View Related

Integration Services :: Lookup Transformation - Used Destination Table For Reference Using Full Cache Mode

Jul 1, 2015

My source has 2.2 million records. I'm performing an incremental load. In the Lookup transformation I used the destination table as the reference, using Full cache mode. The first time, the package executed successfully, but when I executed it a second time, the package suddenly hung while running. Then I truncated the data from the destination table and restarted the SQL Server services. After doing all this I executed the package again and it worked, but when I executed the package a second time, the package hung again. I have 8 GB RAM and an i5 2.5 GHz processor laptop.

View 7 Replies View Related

DTS Task Properties And Global Variables

Mar 2, 2004

Hey all,
I have a stored procedure which needs one variable as a parameter. I am trying to call this stored procedure from my DTS task, and my parameter is defined as a global variable in DTS. Here is the SP call within my DTS task:

declare @id int
select @id = DTSGlobalVariables('ClientId' ).value
exec sp_Update_DayPart @ClientId= @id


It gives me an error that the DTSGlobalVariables function is not defined. In this case, how can I pass the value of ClientId, which is my global variable, to my SP?

Thanks in Advance
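The DTSGlobalVariables collection only exists inside ActiveX scripts, which is why the Execute SQL Task reports it as undefined. The usual DTS approach is to write the call with a ? placeholder and map the ClientId global variable to it through the task's Parameters button. A sketch:

exec sp_Update_DayPart @ClientId = ?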

View 1 Replies View Related

ExecutePackage Task Global Variables

Jan 8, 2008



I'm trying to pass a global variable from a DTS package to the child packages that it calls using ExecutePackage tasks. I have selected the child's global variable on the Inner Global Variable tab and I have selected the parent's global variable on the Outer Global Variable tab. That doesn't work. Whatever I type into the Value column of the Inner Global Variable tab gets passed to the child package. How do I get the parent's global variable passed to the child package? Do I need to set the value on the Inner Global Variable tab to some special word to make it look for the parent's global variable? If I set it to nothing, nothing gets passed.

I have been able to make this work using an ActiveX Script task. I can set the Inner Global Variable value of the Task object to the parent's global variable value, but that's not the clean solution I'm looking for. There must be a simple way to do this because Microsoft's documentation brags about this feature, but they don't explain exactly how to do it.


Thanks

Steve

View 5 Replies View Related

Setting Global Variables In A Script Task, HOW?

Jan 3, 2007

I'm playing (and trying to learn)...

I have an FTP task in a Foreach container and am setting the RemotePath using an expression (works great). I thought I could use this to start learning some of the scripting functionality in SSIS (in a script task), so I found some code in this forum (thanks, original posters!) and tried my hand at some coding... The intent was to create a variable and then dynamically overwrite the expression in the FTP task from the script (I know I don't need to do this, I just wanted to use it for learning purposes)...

I have a variable named varFTPDestPathFileName (string) and want to set it to the value of varFTPDestPath (string) + varFTPFileName (string). Note: all variables are scoped at the package level (could this be the problem?). I did not assign any of the variables to ReadOnly or ReadWrite on the Script Task Editor page (it seems to me that doing this in the code is a whole lot cleaner [and more self-documenting] than on the Task Editor page)...

I keep getting the following error:
"The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there."

Here is the script:

Public Sub Main()
Dim vars As Variables
' Lock for Read/Write the variables we are going to use
Dts.VariableDispenser.LockForRead("User::varFTPDestPath")
Dts.VariableDispenser.LockForRead("User::varFTPFileName")
Dts.VariableDispenser.LockForWrite("User::varSourcePathFileName")
Dts.VariableDispenser.GetVariables(vars)

' Set Value of varSourcePathFileName <<--- ERROR OCCURS HERE
vars("User::varSourcePathFileName").Value = _
Dts.Variables("User::varFTPDestPath").Value.ToString + _
Dts.Variables("User::varFTPFileName").Value.ToString

vars.Unlock()

Dts.TaskResult = Dts.Results.Success

End Sub

I would also like to be able to loop through the variables locked via Dts.VariableDispenser to see their names and values.

Something like

For each ??? in vars
msgbox(???.Value)
Next

One other question... Do we always have to preface the variable with "User::" or "System::"? If so, can you explain why?

Any help would be much appreciated....
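A corrected sketch of the same script: the error occurs because the script locks and writes User::varSourcePathFileName, which does not exist in the package (the variable described above is varFTPDestPathFileName), and because Dts.Variables is used inside a block that locked its variables into vars. On the last question: the User:: / System:: prefix is just the variable's namespace; it distinguishes your variables from the built-in system variables and is needed whenever the bare name would be ambiguous.

Public Sub Main()
    Dim vars As Variables
    ' Lock exactly the variables we use, then go through vars (not Dts.Variables) while they are locked.
    Dts.VariableDispenser.LockForRead("User::varFTPDestPath")
    Dts.VariableDispenser.LockForRead("User::varFTPFileName")
    Dts.VariableDispenser.LockForWrite("User::varFTPDestPathFileName")
    Dts.VariableDispenser.GetVariables(vars)

    vars("User::varFTPDestPathFileName").Value = _
        vars("User::varFTPDestPath").Value.ToString() + _
        vars("User::varFTPFileName").Value.ToString()

    ' The locked collection can be enumerated as well.
    For Each v As Variable In vars
        MsgBox(v.QualifiedName & " = " & v.Value.ToString())
    Next

    vars.Unlock()
    Dts.TaskResult = Dts.Results.Success
End Sub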

View 17 Replies View Related

Usage Of Global Variables Inside SQL Task

Mar 11, 2008

I've been looking around but haven't yet found the syntax for usage of global variables in an SQL Task.

I've set the global variable Id (see code below):

if (select field from table where id = @[User::id]) is null
select top 1 1 as response from table
else select top 1 0 as response from table

My objective with it is to set another global variable (@isNull). Supposedly, when the selection returns null, I should set the variable accordingly; I did it by using the select statements and mapping the response to that variable (is there a better way to do so?).

When I try to execute this, it says the variable has not been defined.
Here is the error:

Error: Must declare the variable '@'.

I've also tried it without the brackets and the User:: prefix at the beginning (@id directly), and here is the response:

Error: Must declare the variable '@id'.

How should I access the global variables in the SQL code?
(BTW, I've checked the field at execution time and it is set to 23, the correct Id, so the block that precedes this one is working properly)

Thanks,
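With an OLE DB connection, the Execute SQL Task does not recognize @[User::id] (or @id) inside the statement; parameters are written as ? and bound on the Parameter Mapping page (parameter name 0, direction Input), and the single-row result can be bound to User::isNull on the Result Set page. A sketch that keeps the placeholder names from the post and assigns the parameter to a local variable first, which sidesteps the provider's restrictions on where ? may appear:

declare @id int;
set @id = ?;  -- mapped to User::id on the Parameter Mapping page

if (select field from [table] where id = @id) is null
    select top 1 1 as response from [table]
else
    select top 1 0 as response from [table]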




View 6 Replies View Related

Passing Global Variables From A Execute Package Task

Apr 28, 2004

I have a package (Package1) that is run from another package (Package2) via an Execute Package Task. I set a global variable called sErrorMessage in Package1 and would like to access that global variable in an ActiveX Script Task in Package2. How can I do this?

View 6 Replies View Related

Integration Services :: Possible To Parameterize Connection Of Lookup Transformation Task

Aug 14, 2015

Is it possible to parameterize the connection of a Lookup Transformation task - specifically the table/view name? I would like to be able to dynamically set the table that the Lookup Transformation connects to at runtime. I've looked into the "Use results of an SQL query" option on the connection screen (which correlates to the "SqlCommand" property), but I'm unable to pass in a parameter this way. I've also looked into SqlCommandParam, but that doesn't allow me to use a parameter in the "FROM" clause of the SQL syntax.
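The table name itself can't be bound as a parameter, but the Lookup's SqlCommand is exposed as an expressionable property on the containing Data Flow Task, which achieves the same effect. A sketch (the variable and column names are assumptions; note that every table the expression can produce must return the same column metadata, since the data flow's metadata is fixed at design time):

Property expression on the Data Flow Task, property [Lookup Transformation].[SqlCommand]:

"select KeyColumn, ValueColumn from " + @[User::LookupTableName]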

View 4 Replies View Related

How To Retrieve Global Variables In An ActiveX Script Task Using VBScript In SSIS

Oct 27, 2006

I need to retrieve the Global Variables set in my package configuration file within an ActiveX Script Task within an SSIS package. In DTS, I could access the Global Variables to execute a SQLXMLBulkLoad for the following statement:

==========================================

Function Main()

Response.Expires=-1

set objBL = CreateObject("SQLXMLBulkLoad.SQLXMLBulkLoad")
objBL.ConnectionString =
"provider=SQLOLEDB.1;server=ABC123;database=MyDB;Trusted_Connection=Yes;"
objBL.KeepIdentity = False
objBL.CheckConstraints = False

objBL.Execute DTSGlobalVariables("gv_XSDSchemaFile").Value, DTSGlobalVariables("gv_XMLFullPath").Value

Main = DTSTaskExecResult_Success
set objBL=Nothing

End Function
=========================================

I have tried using the Script Task to write this in VB.NET; however, MSXML 4.0 is not exposed within the limited object model of the Script Task designer. I have also written a data flow using the XML Source, but it requires quite a bit of effort to have the data flow parse the XML (with 10 hierarchical nodes), transform each node, and provide a SQL Server Destination. This works, however the XML Source component requires a hardcoded reference to the XSD schema file and does not allow a global variable to be used. (They do provide this functionality for the XML data file itself, though.)

My requirement is to allow for the Global Variable to be passed for the Schema file at runtime. The only way I can think of is to recreate what I was doing in DTS where I could simply pull in the XML and XSD Global Variables and execute the SQLXMLBulkLoad in VB Script.

Any ideas on how to write this in VBScript within the ActiveX Script Task in SSIS?...

Michael
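Since the ActiveX Script Task in SSIS cannot see package variables, one alternative is a regular Script Task driving SQLXML Bulk Load through late-bound COM. A sketch only, untested; it assumes the script project allows late binding (Option Strict Off) and that the two variables are listed in the task's ReadOnlyVariables:

Public Sub Main()
    ' Late-bound COM call to SQLXML Bulk Load; variable names taken from the post.
    Dim objBL As Object = CreateObject("SQLXMLBulkLoad.SQLXMLBulkLoad")
    objBL.ConnectionString = "provider=SQLOLEDB.1;server=ABC123;database=MyDB;Trusted_Connection=Yes;"
    objBL.KeepIdentity = False
    objBL.CheckConstraints = False
    objBL.Execute(Dts.Variables("gv_XSDSchemaFile").Value.ToString(), _
                  Dts.Variables("gv_XMLFullPath").Value.ToString())
    objBL = Nothing
    Dts.TaskResult = Dts.Results.Success
End Sub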

View 1 Replies View Related

HELP: How Do I Pass Variables From Control Flow To Data Flow

Mar 9, 2007

I have an Execute SQL Task that returns a Full Rowset from a SQL Server table and assigns it to a variable objRecs. I connect that to a foreach container with an ADO enumerator using objRecs variable and Rows in first table mode. I defined variables and mapped them to the columns.

I tested this by placing a Script task inside the foreach container and displaying the variables in a messagebox.

Now, for each row, I want to write a record to an MS Access table and then update a column back in the original SQL Server table where I retrieved data in the Execute SQL task (I have the primary key). If I drop a Data Flow Task inside my foreach container, how do I pass the variables as input to an OLE DB Destination in the Data Flow?

Also, how would I update the original source table where source.id = objRecs.id?

Thank you for your assistance. I have spent the day trying to figure this out (and thought it would be simple), but I am just not getting SSIS. Sorry if this has been covered.

Thanks,

Steve
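Two patterns that cover this (sketches; the object names are assumptions): the foreach-mapped variables can be pushed into the pipeline with a Derived Column transformation (one new column per variable, expression such as @[User::SomeVar]) so they can be mapped in the OLE DB Destination; and the write-back to the source table can be done with a parameterized OLE DB Command in the same data flow, or an Execute SQL Task inside the foreach loop:

-- OLE DB Command / Execute SQL Task; the ? is mapped to the id column (or to User::id)
update dbo.SourceTable
set Processed = 1
where id = ?;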

View 17 Replies View Related

SSIS Variables Between Data Flow And Control Flow... How To????

May 17, 2007

Hi everyone,

Primary platform is 64 bit cluster.

How can we move values stored in SSIS variables from the Data Flow to the Control Flow layer?

We've got an SSIS package which loads a value into a variable inside a Data Flow. Back in the Control Flow, how could we retrieve that value again?

Thanks in advance and regards,
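One way that works (a sketch; the variable name and the accumulator are assumptions): let a Script Component hold the value in a private field and write it to a package-scoped variable in PostExecute. Variables listed in ReadWriteVariables are committed when the Data Flow Task finishes, so the next task in the Control Flow sees the new value.

' Script Component with ReadWriteVariables = "User::RowTotal" (hypothetical variable)
Private total As Integer = 0

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    total += 1  ' or accumulate whatever value you need from Row
End Sub

Public Overrides Sub PostExecute()
    MyBase.PostExecute()
    Me.Variables.RowTotal = total  ' visible in the Control Flow after the Data Flow Task completes
End Sub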

View 4 Replies View Related

Lookup Transformation (Can It Be Using The Old Data?)

Mar 31, 2008



I have a lookup transformation that retrieves a key for a certain column of values, in this case, a name. So, I go in to the lookup table with a name and come out with its key. I had it working and then I added new entries to the lookup table for a bunch of new names. Now, for some reason, I am not getting the matches for the new names. But I am still getting the matches for the names that existed before I added the new ones.

I'm wondering if the lookup transformation is using the old set of data and somehow not picking up the new names. Do I have to trigger something in the lookup transformation to let it know that the lookup table data has changed?

View 4 Replies View Related

CTE In OLE DB Command Data Flow Transformation

Dec 20, 2006

I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow item, but not in any data flow item.

View 7 Replies View Related

Data Flow Transformation: Count And Sum The Column Value

Nov 8, 2007

Hi,

I have one data flow. The source is SQL Server and the destination is a flat file destination. I have one Derived Column placed in between these two. This functionality works fine. I would like to sum one column's data and get a total count, and put the results into global variables. How can I achieve this?

Thanks,

View 7 Replies View Related

Using Expressions In Custom Data Flow Transformation

Sep 18, 2006

Hi,

I'm creating a custom data flow transformation in c#.

I would like to use expressions within this component in the same way as in the derived column component: specifying the expression as a custom property of an output column, then evaluating this expression for each row of the buffer and using this evaluated expression to populate my output column values.

So I've added a custom expression on my output column, and set its expression type to CPET_NOTIFY:

IDTSCustomProperty90 exp = col.CustomPropertyCollection.New();

exp.Name = "Expression";

exp.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;

But in the ProcessInput method I don't manage to get the evaluated expressions; when I use exp.Value I get my expression definition and not its evaluation.

Is there a way to get these evaluated expressions ?

Thanks,

Stéphane

View 6 Replies View Related

Advanced Lookup In A Data Flow

Oct 4, 2006

Here is the deal - I have a flat file with 2 fields:
ProdNum FeatureCodes
1 A01, B22, F09
2 C13,C24,E05,G02,G09,J07,J09,M03,M17
3 J07,M01,M17,N02,N11,N13,X15

Then I have an Excel file like:
Code Description
A01 Handicap features
B22 Smoke-Free
F09 Tinted Windows
C13 Picnic Area
C24 Extra Storage
J07 Tile Flooring
M01 Central Airconditioning

The result in a database needs to be like:
ProdNum Features
1 Handicap features, Smoke-Free, Tinted Windows
2 Picnic Area, Extra Storage, .....
3 Tile Flooring, Central Airconditioning, .....


How can I do this in a data flow - any ideas???
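One way to do the whole thing with a lookup-style join is to first split the FeatureCodes column into one row per code (a Script Component can do the split), land both feeds in staging tables, and then rebuild the delimited description list with T-SQL. A sketch of the final step (SQL Server 2005 syntax; the staging table names are assumptions):

-- dbo.ProdFeature(ProdNum, Code) holds one row per product/code; dbo.FeatureCode(Code, Description) comes from the Excel file
select p.ProdNum,
       stuff((select ', ' + fc.Description
              from dbo.ProdFeature pf
              join dbo.FeatureCode fc on fc.Code = pf.Code
              where pf.ProdNum = p.ProdNum
              for xml path('')), 1, 2, '') as Features
from (select distinct ProdNum from dbo.ProdFeature) p;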

View 8 Replies View Related

One-time Data Flow Lookup

Jan 15, 2007

(Just getting started with SSIS) I have a data flow task with source S1 and destination D. The mappings from S1 to D are straightforward, but additionally, I need to look up a value from source S2 one time and map that value to a column in D for every row of S1. Note that this lookup is based on a constant value independent of S1, and therefore no column mappings are involved (this lookup is essentially "standalone").

I would expect I could do the lookup, assign the value to a data flow level variable, then reference that variable in a derived column transform or some such thing, but I'm having trouble figuring out how to do the lookup in the data flow task and assign it to a variable. Am I on the right track here, or...?
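You are on the right track; the usual pattern for a standalone value is to fetch it before the data flow rather than inside it: an Execute SQL Task with ResultSet = Single row stores the value in a package variable, and a Derived Column transformation then stamps it onto every row. A sketch (all names are assumptions):

-- Execute SQL Task, ResultSet = Single row, result column mapped to User::LookupValue
select ConfigValue from dbo.S2Table where ConfigKey = 'SomeConstant';

Derived Column expression for the new column in D: @[User::LookupValue]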

View 9 Replies View Related

Custom Data Flow Transformation - Predefined Inputs

May 16, 2008

Hi,

I've created my own custom data flow transformation task (using C#) that will parse a fullname and output the various name parts. In the ProvideComponentProperties method, I create 5 output columns (prefix, first, middle, last, and suffix). In the ProcessInput method, I parse the input and add the name parts to the buffer. The bad thing is that I'm making an assumption on the position of the Full Name input column within the buffer.

I would like the "user" to be able to map their "full name" input column to a known Full Name column so I don't have to make any assumptions. This is the first SSIS task I've tried to create and I haven't been able to find very many examples online.

Any help is greatly appreciated!

Thank you,

Marshall
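Rather than assuming the column's position, the component can let the user map whichever input column they like and then resolve it by lineage ID in PreExecute, the same way other buffer indexes are usually found. A sketch (it assumes exactly one input column has been selected and that fullNameBufferIndex is a private int field on the component):

// PreExecute: locate the mapped "full name" column in the buffer instead of hard-coding its position.
IDTSInput90 input = ComponentMetaData.InputCollection[0];
IDTSInputColumn90 fullNameColumn = input.InputColumnCollection[0];
fullNameBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, fullNameColumn.LineageID);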

View 1 Replies View Related

Simple IF Or CASE Statement In Data Flow Transformation

May 11, 2007

Using SSIS to import from Excel to SQL Server.

In Excel they are showing Sale as -ve quantity and purchase as +ve quantity.

The database has quantity always as +ve figure and a separate column "isPurchase" set to true or false depending on whether purchase or sale.



So I need the Derived Column to return a boolean (or int) depending on whether the quantity is positive or negative. I tried each of the following in the Expression, but all of them were invalid expressions.

CASE WHEN [TotalQuantity] > 0 THEN 0 ELSE 1 END

If [TotalQuantity] > 0 THEN 1 ELSE 0 END

IIF ([TotalQuantity] > 0, 1,0)



Can anyone help me with the correct syntax, or with the correct Data Flow transformation if Derived Column is the wrong one?



Thanks



Richard
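The SSIS expression language has no CASE or IIF; its conditional operator is condition ? value_if_true : value_if_false. Either of these should work in the Derived Column (and ABS(TotalQuantity) gives the unsigned quantity if that is also needed):

TotalQuantity > 0 ? 0 : 1

or, for a boolean isPurchase column directly:

TotalQuantity > 0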

View 3 Replies View Related

Custom Data Flow Transformation Losing CustomProperties

May 12, 2006

I created a custom transform that has a custom interface and is a wizard that uses a web service. It creates custom properties and output columns on the fly. I set the dialog result to OK and close at the end of the steps. The transform then has the custom fields and output columns I created in the wizard. I've verified this by right-clicking on the transform and going to the advanced editor.

If I then immediately run the package, the custom fields don't exist in the CustomPropertiesCollection. If I close the package and reopen it, the properties are now gone. If I then go through the wizard again, thus recreating the properties, they stay and don't disappear. The quickest way to get a working transform is to add it to my data flow, then save, close and reopen the package, and then go through the wizard. Just saving after I add the transform does not help.

Does anyone know what might be causing this very strange problem?

View 7 Replies View Related

Simple Custom Data Flow Transformation Doesn't Produce Any Output

Aug 30, 2006

I've built a simple custom data flow transformation component following the Hands On Lab (http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en) and the Books Online (ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/adc70cc5-f79c-4bb6-8387-f0f2cdfaad11.htm and ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/b694d21f-9919-402d-9192-666c6449b0b7.htm).

All it is supposed to do is create an output column and set its value to the result of calling a web service method (the transformation is synchronous). Everything seems fine, but when I run the data flow task that contains it, it doesn't generate any output. The Visual Studio debugger displays it as yellow, with 1,385 rows going into it, but the data viewer attached to its output is empty. The output metadata looks just like I expect: all of my input columns plus the new column, correctly typed. No validation or run-time warnings or errors are reported.

I'll include the entire C# file below, which only overrrides the ProvideComponentProperties, Validate, PreExecute, ProcessInput, and PostExecute methods of the parent PipelineComponent class.

Since this is effectively a specialization of the DerivedColumn transformation, could I inherit from the class that implements the DC component instead of PipelineComponent? How do I even find out what that class is?

Thanks! Here's the code:
using System;
// using System.Collections.Generic;
// using System.Text;

using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CustomComponents
{
[DtsPipelineComponent(DisplayName = "GID", ComponentType = ComponentType.Transform)]
public class GidComponent : PipelineComponent
{
///
/// Column indexes for faster processing.
///
private int[] inputColumnBufferIndex;
private int outputColumnBufferIndex;

///
/// The GID web service.
///
private GID.WS_PDF.PDFProcessService gidService = null;

///
/// Called to initialize/reset the component.
///
public override void ProvideComponentProperties()
{
base.ProvideComponentProperties();
// Remove any existing metadata:
base.RemoveAllInputsOutputsAndCustomProperties();
// Create the input and the output:
IDTSInput90 input = this.ComponentMetaData.InputCollection.New();
input.Name = "Input";
IDTSOutput90 output = this.ComponentMetaData.OutputCollection.New();
output.Name = "Output";
// The output is synchronous with the input:
output.SynchronousInputID = input.ID;
// Create the GID output column (16-character Unicode string):
IDTSOutputColumn90 outputColumn = output.OutputColumnCollection.New();
outputColumn.Name = "GID";
outputColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 16, 0, 0, 0);
}

///
/// Only 1 input and 1 output with 1 column is supported.
///
///
public override DTSValidationStatus Validate()
{
bool cancel = false;
DTSValidationStatus status = base.Validate();
if (status == DTSValidationStatus.VS_ISVALID)
{
// The input and output are created above and should be exactly as specified
// (unless someone manually edited the persisted XML):
if (ComponentMetaData.InputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component accepts 1 Input.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component provides 1 Output.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection[0].OutputColumnCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output must be 1 column.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
// And the output column should be a Unicode string:
else if ((ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].DataType != DataType.DT_WSTR) ||
(ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].Length != 16))
{
ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output column data type must be (DT_WSTR, 16).",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISBROKEN;
}
}
return status;
}

///
/// Called before executing, to cache the buffer column indexes.
///
public override void PreExecute()
{
base.PreExecute();
// Get the index of each input column in the buffer:
IDTSInput90 input = ComponentMetaData.InputCollection[0];
inputColumnBufferIndex = new int[input.InputColumnCollection.Count];
for (int col = 0; col < input.InputColumnCollection.Count; col++)
{
inputColumnBufferIndex[col] = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[col].LineageID);
}
// Get the index of the output column in the buffer:
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
outputColumnBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
// Get the GID web service:
gidService = new GID.WS_PDF.PDFProcessService();
}

///
/// Called to process the buffer:
/// Get a new GID and save it in the output column.
///
///
///
public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
if (! buffer.EndOfRowset)
{
try
{
while (buffer.NextRow())
{
// Set the output column value to a new GID:
buffer.SetString(outputColumnBufferIndex, gidService.getGID());
}
}
catch (System.Exception ex)
{
bool cancel = false;
ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
throw new Exception("Could not process input buffer.");
}
}
}

///
/// Called after executing, to clean up.
///
public override void PostExecute()
{
base.PostExecute();
// Resign from the GID service:
gidService = null;
}
}
}

View 1 Replies View Related

Variables In Data Flow OLE DB

Feb 2, 2007

I'm storing a SQL command (could be an SP call or a simple select) that I would like to pass into the "OLE DB Source" component in my data flow task. If I choose Data Access Mode: SQL command from variable, I get an error saying there is a data source column with no name.



I've fought SSIS pretty well up to this point. I realize you need to "initialize" variables and expressions in the control flow ahead of time if you want to access them in the data flow. But I was kind of hoping that all I would need to do in this case is pass a SQL command into a variable and access it in the data flow without jumping through any hoops. Is this possible?



This is a great forum btw, thanks,

Phil
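One common cause of the "column with no name" error with "SQL command from variable" is that the statement in the variable (or its design-time default value) returns a column without an alias, so the source cannot build its output metadata. A sketch of the kind of statement that validates cleanly (the names are assumptions):

-- every computed column needs an explicit alias, and the variable needs a valid default at design time
select OrderID,
       Quantity * UnitPrice as LineTotal
from dbo.Orders;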



View 15 Replies View Related

Error: The Task With The Name Data Flow Task And The Creation Name DTS.Pipeline.1 Is Not Registered For Use On This Computer

May 4, 2006



Hi,

I am trying to create a simple BI Application for SSIS. In Visual Studio 2005 I just get a Data Flow Task from the toolbar and add it to the project. When I double click it I get the following error:

The task with the name "Data Flow Task" and the creation name "DTS.Pipeline.1" is not registered for use on this computer.

Then when I try to delete it, it gives this other error:

Cannot remove the specified item because it was not found in the specified Collection.

I am creating this application in an administrator account in this computer, so I doubt the problem is related to permissions. I am running SQL Server 2005 and Visual Studio 2005 in WinXP Tablet PC Edition.

Any suggestions why this is happening and how to fix it?

View 17 Replies View Related

Compare Performance (Execute SQL Task Insert And Data Flow Task)

Mar 12, 2008



I am using SQL 2005 SSIS. I am joining several large tables and then moving the result into another table in the same database.

I would like to know which method is faster:

Use an Execute SQL Task to insert the result set into the target table, or

use a Data Flow Task to insert the result set into the target table (use an OLE DB Source to execute the SQL command and then a SQL Server Destination).

Could you tell me why one is slower than the other?

Thanks.
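For a same-database move, the Execute SQL Task is normally the faster of the two: it runs one set-based INSERT ... SELECT entirely inside the engine, while the Data Flow Task pulls every row out through the pipeline and pushes it back into the same database. A sketch of the set-based form (the names are assumptions):

insert into dbo.TargetTable (Col1, Col2)
select a.Col1, b.Col2
from dbo.LargeTableA a
join dbo.LargeTableB b on b.KeyCol = a.KeyCol;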

View 7 Replies View Related

Can A Result Set From SQL Script Task Be Used As A Source For Data Flow Task?

Oct 2, 2007

I have a stored procedure that is executed via an Execute SQL Task and returns a full result set. I map this result set to an Object-type variable. Is there a way to use this variable as a data source in a subsequent data flow task?

A.
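One pattern that works is a Script Component configured as a source: OleDbDataAdapter.Fill accepts a classic ADO Recordset, which is what an Execute SQL Task "Full result set" stores in an Object variable. A sketch (the variable name objRecs, the output name Output0, and the column SomeColumn are assumptions):

' Script Component (Source), ReadOnlyVariables = "User::objRecs", output column SomeColumn defined on Output0
Public Overrides Sub CreateNewOutputRows()
    Dim table As New System.Data.DataTable()
    Dim adapter As New System.Data.OleDb.OleDbDataAdapter()
    adapter.Fill(table, Me.Variables.objRecs)  ' shred the ADO recordset into a DataTable
    For Each row As System.Data.DataRow In table.Rows
        Output0Buffer.AddRow()
        Output0Buffer.SomeColumn = row("SomeColumn").ToString()
    Next
End Sub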

View 14 Replies View Related






