How Can I Check a Variable Set in a Data Flow in Precedence Constraints and a Conditional Transformation?
May 1, 2008
Hi,
I have a variable, a data source, and a Conditional Split transformation that checks count(*). If the count == 0, the flow goes to a Script Component that changes the variable to False (initially it is True) and writes to a log file...
Then I check that variable in a precedence constraint in the control flow: if variable == True, then success. BUT
whenever I run the package, my data flow turns green even when the count == 0 condition is not met, so
my variable's value ends up "False". If the condition is not met, my script should not run at all.
Am I missing something???
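For reference, this is the shape of precedence constraint I would expect to gate on the variable (a sketch; the variable name is hypothetical): Evaluation operation = "Expression and Constraint", Value = Success, and Expression =

    @[User::MyFlag] == TRUE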
The logic I am trying to recreate via SSIS is the following SQL statement:
insert into db3.dbo.targettable1 (SiteC, Objecte, Attrib1) -- Target database table
select distinct ?, ? from ? -- Source database table
join dbo.targettable2 c1 -- Target database table
    on c1.Alias = ? and c1.CSetID = ?
    and c1.FacID = (select f.PFacID from dbo.Fac f where f.FacID = ?)
where not exists
    (select * from dbo.targettable2 c -- Target database table
     where c.Alias = ? and c.FacID = ? and c.CSetID = ?)
I have an OLE DB Source that contains an expression to approximate the following portion of the above SELECT statement:

Select ?, ? from ? -- Source database table
The package has 2 global variables, User::CSetID and User::FacID, whose scope is global to the package and whose values are set within a Foreach Loop Container outside of the Data Flow Task.
I was trying to reference the 2 global variables within the Lookup Transformation to recreate the following portion of the SQL statement, but I encounter errors:
join dbo.targettable2 c1 -- Target database table on c1.Alias = ? and c1.CSetID = ? and c1.FacID = (select f.PFacID from dbo.Fac f where f.FacID = ?)
In the Advanced Editor window of the Lookup Transformation:
select * from (select * from [dbo].[targettable2 ]) as refTable where [refTable].[Alias] = ? and [refTable].[FacID] = ? and [refTable].[CSetID] = ?
Is there a way to reference global variables in a Lookup Transformation that are set outside the Data Flow Task?
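One technique that is often suggested (a sketch, not verified here): the Lookup's SqlCommand is exposed as an expressionable property on the containing Data Flow Task, so you can select the Data Flow Task, open Properties > Expressions, choose [Lookup Transformation Name].[SqlCommand], and build the statement from the variables, e.g.:

    "select * from (select * from [dbo].[targettable2]) as refTable where [refTable].[CSetID] = "
    + (DT_WSTR, 10) @[User::CSetID] + " and [refTable].[FacID] = " + (DT_WSTR, 10) @[User::FacID]

The (DT_WSTR, 10) casts are needed because SSIS expressions will not concatenate numeric variables into strings implicitly.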
I have an Execute SQL Task that returns a Full Rowset from a SQL Server table and assigns it to a variable objRecs. I connect that to a foreach container with an ADO enumerator using objRecs variable and Rows in first table mode. I defined variables and mapped them to the columns.
I tested this by placing a Script task inside the foreach container and displaying the variables in a messagebox.
Now, for each row, I want to write a record to an MS Access table and then update a column back in the original SQL Server table from which I retrieved the data in the Execute SQL Task (I have the primary key). If I drop a Data Flow Task inside my foreach container, how do I pass the variables as input to an OLE DB Destination on the data flow?
Also, how would I update the original source table where source.id = objRecs.id?
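One pattern I believe works for getting the current row's variables into the data flow (a sketch; C# syntax assumes SSIS 2008's C# script support, and the column and variable names are hypothetical): use a Script Component configured as a Source inside the data flow, list the variables in its ReadOnlyVariables, define matching output columns, and emit a single row per execution:

    public override void CreateNewOutputRows()
    {
        // Emit one row per data-flow execution, built from the loop's current variables.
        Output0Buffer.AddRow();
        Output0Buffer.ID = this.Variables.ID;     // mapped from User::ID
        Output0Buffer.Name = this.Variables.Name; // mapped from User::Name
    }

For the update back to the source table, an Execute SQL Task after the data flow with "UPDATE source SET ... WHERE id = ?" and the id variable in its parameter mapping seems like the simpler route.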
Thank you for your assistance. I have spent the day trying to figure this out (and thought it would be simple), but I am just not getting SSIS. Sorry if this has been covered.
Hi, I'm trying to implement an incremental data pull (Oracle to SQL) based on Andy's blog: http://sqlblog.com/blogs/andy_leonard/archive/2007/07/09/ssis-design-pattern-incremental-loads.aspx
My development machine is decent: 1.86 GHz, Intel core 2 CPU, 3 GB of RAM. However it seems the data flow task gets hung whenever I test the package against the ~6 million row source, as can be seen from these screenshots. I have no memory limitations on the lookup transformation. After the rows have been cached nothing happens. Memory for the dtsdebug process hovers around 1.8 GB and it uses 1-6 percent of CPU resources continuously. I am not using fast load to insert new records into my sql target table. (I am right clicking Sequence Container 3 and executing this container NOT the entire package in the screenshots)
The same package works fine against a similar test table with 150k rows. http://i248.photobucket.com/albums/gg168/boston_sql92/7.jpg http://i248.photobucket.com/albums/gg168/boston_sql92/8.jpg
The weird thing is that a full refresh of the entire source table from Oracle to the SQL target table takes only 24 minutes. Any hints or advice would be appreciated.
I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow task, but not in any data flow item.
I have one data flow task. The source is SQL Server and the destination is a flat file destination, with one Derived Column placed between the two. This works fine. I would also like to sum one column and count the total number of rows, and put the results into global variables. How can I achieve this?
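A sketch of one way to do this with a Script Component transformation (C# syntax, which assumes SSIS 2008 or later; User::TotalAmount, User::RowCounter, and the input column Amount are all hypothetical names):

    private decimal runningSum = 0;
    private int rowCount = 0;

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Accumulate locally per row; package variables cannot be written here.
        if (!Row.Amount_IsNull)
            runningSum += Row.Amount;
        rowCount++;
    }

    public override void PostExecute()
    {
        base.PostExecute();
        // ReadWriteVariables are only writable inside PostExecute.
        this.Variables.TotalAmount = runningSum;
        this.Variables.RowCounter = rowCount;
    }

The Row Count transformation is the simpler built-in for the row total alone; the script is only needed for the sum.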
I'm creating a custom data flow transformation in c#.
I would like to use expressions within this component in the same way as in the derived column component: specifying the expression as a custom property of an output column, then evaluating this expression for each row of the buffer and using this evaluated expression to populate my output column values.
So I've added a custom expression on my output column and set its expression type to CPET_NOTIFY.
But in the ProcessInput method I can't get at the evaluated expressions: when I use exp.Value I get my expression definition, not its evaluation.
Is there a way to get these evaluated expressions?
I've created my own custom data flow transformation task (using C#) that will parse a fullname and output the various name parts. In the ProvideComponentProperties method, I create 5 output columns (prefix, first, middle, last, and suffix). In the ProcessInput method, I parse the input and add the name parts to the buffer. The bad thing is that I'm making an assumption on the position of the Full Name input column within the buffer.
I would like the "user" to be able to map their "full name" input column to a known Full Name column so I don't have to make any assumptions. This is the first SSIS task I've tried to create and I haven't been able to find very many examples online.
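A sketch of how the column could be located by name instead of by position (PreExecute only; assumes the 2005-era "90" interfaces used elsewhere in this thread, and that "FullName" is a naming convention you ask the user to follow — both are assumptions):

    private int fullNameBufferIndex = -1;

    public override void PreExecute()
    {
        base.PreExecute();
        IDTSInput90 input = ComponentMetaData.InputCollection[0];
        foreach (IDTSInputColumn90 col in input.InputColumnCollection)
        {
            // Match the mapped input column by name, not by ordinal position.
            if (string.Equals(col.Name, "FullName", StringComparison.OrdinalIgnoreCase))
            {
                fullNameBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, col.LineageID);
                break;
            }
        }
    }

A more robust variant is a custom property on the component that stores the chosen column's LineageID, set from a custom UI.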
In Excel they show a sale as a negative quantity and a purchase as a positive quantity.
The database always stores quantity as a positive figure, with a separate column "isPurchase" set to true or false depending on whether it is a purchase or a sale.
So I need a Derived Column to return a boolean (or int) depending on whether the quantity is positive or negative. I tried each of the following in the Expression, but all of them were invalid expressions.
CASE WHEN [TotalQuantity] > 0 THEN 0 ELSE 1 END
If [TotalQuantity] > 0 THEN 1 ELSE 0 END
IIF ([TotalQuantity] > 0, 1,0)
Can anyone help me with the correct syntax, or with the correct Data Flow transformation if Derived Column is the wrong one?
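For what it's worth, the SSIS expression language supports none of those three forms; it uses the C-style conditional operator (? :). The equivalent of the first attempt above would be:

    [TotalQuantity] > 0 ? 0 : 1

and since comparisons already return DT_BOOL, a plain [TotalQuantity] > 0 works wherever a boolean result is acceptable.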
I created a custom transform that has a custom interface and is a wizard that uses a web service. It creates custom properties and output columns on the fly. I set the dialog result to Ok and close at the end of the steps. The transform then has the custom fields and output columns I created in the wizard. I've verified this by right clicking on the transform and going to the advanced editor. If I then immediately run the package, the custom fields don't exist in the CustomPropertiesCollection. If I close the package and reopen it, the properties now are gone. If I then go through the wizard again, thus recreating the properties, they stay and don't disappear. The quickest way to get a working transform is to add it to my data flow then save, close and reopen the package and then go through the wizard. Just saving after I add the transform does not help.
Does anyone know what might be causing this very strange problem?
I need to know how to use my private function - created as a scalar-valued function in SQL Server 2005 - in a Script Component (used here as a transformation) in a data flow task, to transform a two-digit month into a three-character month:
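A sketch of calling such a UDF per row from the Script Component (the function name dbo.MonthToAbbrev, the connection string, and the column names are all hypothetical; note that a database call per row is slow, so a local lookup table is usually preferable):

    using System.Data.SqlClient;

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("select dbo.MonthToAbbrev(@m)", conn))
        {
            cmd.Parameters.AddWithValue("@m", Row.MonthNumber);
            conn.Open();
            // ExecuteScalar returns the single value produced by the scalar UDF.
            Row.MonthAbbrev = (string)cmd.ExecuteScalar();
        }
    }

(connectionString would be a private field, e.g. acquired in AcquireConnections from a package connection manager.)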
I am importing the values for field Atype from a .csv file as DT_STR, 13 and I need to fit them into a bit type CType field.
When I write the conditional split expression ((ISNULL(Atype)?"a":Atype)!=(ISNULL(CType)?"9":CType)) it says that the DT_WSTR and DT_I4 types are incompatible and that I need to explicitly cast with a cast operator. I haven't been able to make it work. How do I explicitly cast?
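The cast syntax in SSIS expressions is a parenthesized type placed before the value, e.g. (DT_WSTR, 1)CType to turn the numeric column into a one-character string so both sides of the comparison are strings. A sketch against the expression quoted above (whether length 1 is right depends on the actual CType values):

    (ISNULL(Atype) ? "a" : Atype) != (ISNULL(CType) ? "9" : (DT_WSTR, 1)CType)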
I've built a simple custom data flow transformation component following the Hands On Lab (http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en) and the Books Online (ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/adc70cc5-f79c-4bb6-8387-f0f2cdfaad11.htm and ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/b694d21f-9919-402d-9192-666c6449b0b7.htm).
All it is supposed to do is create an output column and set its value to the result of calling a web service method (the transformation is synchronous). Everything seems fine, but when I run the data flow task that contains it, it doesn't generate any output. The Visual Studio debugger displays it as yellow, with 1,385 rows going into it, but the data viewer attached to its output is empty. The output metadata looks just like I expect: all of my input columns plus the new column, correctly typed. No validation or run-time warnings or errors are reported.
I'll include the entire C# file below, which only overrides the ProvideComponentProperties, Validate, PreExecute, ProcessInput, and PostExecute methods of the parent PipelineComponent class.
Since this is effectively a specialization of the DerivedColumn transformation, could I inherit from the class that implements the DC component instead of PipelineComponent? How do I even find out what that class is?
Thanks! Here's the code:

using System;
// using System.Collections.Generic;
// using System.Text;
using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CustomComponents
{
    [DtsPipelineComponent(DisplayName = "GID", ComponentType = ComponentType.Transform)]
    public class GidComponent : PipelineComponent
    {
        /// <summary>
        /// Column indexes for faster processing.
        /// </summary>
        private int[] inputColumnBufferIndex;
        private int outputColumnBufferIndex;

        /// <summary>
        /// The GID web service.
        /// </summary>
        private GID.WS_PDF.PDFProcessService gidService = null;

        /// <summary>
        /// Called to initialize/reset the component.
        /// </summary>
        public override void ProvideComponentProperties()
        {
            base.ProvideComponentProperties();
            // Remove any existing metadata:
            base.RemoveAllInputsOutputsAndCustomProperties();
            // Create the input and the output:
            IDTSInput90 input = this.ComponentMetaData.InputCollection.New();
            input.Name = "Input";
            IDTSOutput90 output = this.ComponentMetaData.OutputCollection.New();
            output.Name = "Output";
            // The output is synchronous with the input:
            output.SynchronousInputID = input.ID;
            // Create the GID output column (16-character Unicode string):
            IDTSOutputColumn90 outputColumn = output.OutputColumnCollection.New();
            outputColumn.Name = "GID";
            outputColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 16, 0, 0, 0);
        }

        /// <summary>
        /// Only 1 input and 1 output with 1 column is supported.
        /// </summary>
        public override DTSValidationStatus Validate()
        {
            bool cancel = false;
            DTSValidationStatus status = base.Validate();
            if (status == DTSValidationStatus.VS_ISVALID)
            {
                // The input and output are created above and should be exactly as specified
                // (unless someone manually edited the persisted XML):
                if (ComponentMetaData.InputCollection.Count != 1)
                {
                    this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
                        "Invalid metadata: component accepts 1 Input.", string.Empty, 0, out cancel);
                    status = DTSValidationStatus.VS_ISCORRUPT;
                }
                else if (ComponentMetaData.OutputCollection.Count != 1)
                {
                    this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
                        "Invalid metadata: component provides 1 Output.", string.Empty, 0, out cancel);
                    status = DTSValidationStatus.VS_ISCORRUPT;
                }
                else if (ComponentMetaData.OutputCollection[0].OutputColumnCollection.Count != 1)
                {
                    this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
                        "Invalid metadata: component Output must be 1 column.", string.Empty, 0, out cancel);
                    status = DTSValidationStatus.VS_ISCORRUPT;
                }
                // And the output column should be a Unicode string:
                else if ((ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].DataType != DataType.DT_WSTR)
                    || (ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].Length != 16))
                {
                    ComponentMetaData.FireError(0, ComponentMetaData.Name,
                        "Invalid metadata: component Output column data type must be (DT_WSTR, 16).",
                        string.Empty, 0, out cancel);
                    status = DTSValidationStatus.VS_ISBROKEN;
                }
            }
            return status;
        }

        /// <summary>
        /// Called before executing, to cache the buffer column indexes.
        /// </summary>
        public override void PreExecute()
        {
            base.PreExecute();
            // Get the index of each input column in the buffer:
            IDTSInput90 input = ComponentMetaData.InputCollection[0];
            inputColumnBufferIndex = new int[input.InputColumnCollection.Count];
            for (int col = 0; col < input.InputColumnCollection.Count; col++)
            {
                inputColumnBufferIndex[col] = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[col].LineageID);
            }
            // Get the index of the output column in the buffer:
            IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
            outputColumnBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
            // Get the GID web service:
            gidService = new GID.WS_PDF.PDFProcessService();
        }

        /// <summary>
        /// Called to process the buffer:
        /// Get a new GID and save it in the output column.
        /// </summary>
        public override void ProcessInput(int inputID, PipelineBuffer buffer)
        {
            if (!buffer.EndOfRowset)
            {
                try
                {
                    while (buffer.NextRow())
                    {
                        // Set the output column value to a new GID:
                        buffer.SetString(outputColumnBufferIndex, gidService.getGID());
                    }
                }
                catch (System.Exception ex)
                {
                    bool cancel = false;
                    ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
                    throw new Exception("Could not process input buffer.");
                }
            }
        }

        /// <summary>
        /// Called after executing, to clean up.
        /// </summary>
        public override void PostExecute()
        {
            base.PostExecute();
            // Resign from the GID service:
            gidService = null;
        }
    }
}
I have developed some packages to load data into "Fact" tables in the data warehouse. Some packages are OK, others are not. The problem: the packages that load fact tables with lots of Lookup transformations in the data flow task (lookups against dimension tables) are very, very slow - far too slow to be chosen as a solution.
Do you have any solutions that avoid the Lookup transformation? Any other approach (SSIS, T-SQL, and so on...) is welcome to speed up the fact table loading process.
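One common alternative (a sketch; every table and column name here is hypothetical) is to land the source rows in a staging table and resolve the dimension keys set-based in T-SQL, where the server can use indexes and a single join plan instead of per-row lookups:

    -- Resolve surrogate keys with joins instead of Lookup transformations:
    insert into dbo.FactSales (DateKey, ProductKey, Amount)
    select d.DateKey, p.ProductKey, s.Amount
    from staging.Sales s
    join dbo.DimDate d on d.DateValue = s.SaleDate
    join dbo.DimProduct p on p.ProductCode = s.ProductCode;

Within SSIS itself, switching Lookups to full-cache mode and selecting only the join key and surrogate key columns in the lookup query is usually the first thing to try.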
I'm storing a SQL command (it could be a stored procedure call or a simple SELECT) in a variable that I would like to pass into the OLE DB Source component in my data flow task. If I choose Data Access Mode: SQL command from variable, I get an error saying there is a data source column with no name.
I've fought through SSIS pretty well up to this point. I realize you need to "initialize" variables and expressions in the control flow ahead of time if you want to access them in the data flow. But I was hoping that all I would need to do in this case is pass a SQL command into a variable and access it in the data flow without jumping through any hoops. Is this possible?
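If it helps, that particular error usually means the statement returns a column without a name; the OLE DB Source needs every output column aliased. A hypothetical illustration:

    select custid, firstname + ' ' + lastname from dbo.customer                 -- fails: second column has no name
    select custid, firstname + ' ' + lastname as fullname from dbo.customer     -- works

So the variable-based approach itself should work once every column in the stored command has a name.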
I have a package variable that I set via an Execute SQL Task. I want to reference it in a data flow Script Component. In the Script Component I enter the variable into the ReadOnlyVariables collection, then in the script I reference it as Me.Variables.var (e.g. counter = Me.Variables.var).
I'm getting errors when the data flow starts:
Error: 0xC0047062 at Provider, Set Surrogate Key [4261]: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> Microsoft.SqlServer.Dts.Pipeline.ReadWriteVariablesNotAvailableException: The collection of variables locked for read and write access is not available outside of PostExecute.
I have no problem referencing other variables in Derived Column transformations. I've tried putting the variable in the ReadWriteVariables collection instead, but I get the same error. I don't understand why this is so difficult. Please help.
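As the exception text says, variables locked for read/write are only guaranteed to be available in PostExecute. The workaround I've usually seen suggested is to copy the value into a private field once in PreExecute and use that field per row, rather than touching Variables inside ProcessInputRow. A sketch in C# terms (the VB equivalent uses Me instead of this; "var" as in the post):

    private int counter;

    public override void PreExecute()
    {
        base.PreExecute();
        // Read the package variable once, before rows start flowing.
        counter = this.Variables.var;
    }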
I've read the various posts and articles regarding this matter, but I still seem to have problems getting it to work:
In my control flow, I start by declaring a variable named "LastJobLedgerEntryID", to identify the records I need to add to the stage. From there I would like to use this variable in the source component in my dataflow, i.e.:
"SELECT [Entry No_],[Job No_],[Posting Date],[Document No_],[Type],[No_],[Description],[Quantity],[Direct Unit Cost],[Unit Cost],[Unit Price],[Chargeable],[Job Posting Group],[Global Dimension 1 Code],[Global Dimension 2 Code],[Work Type Code] FROM mytable WHERE [Entry No_] > " + @[User::LastJobLedgerEntryID]
But this fails. I should note that the variable LastJobLedgerEntryID is stored as an Int32, with a default value of 0.
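The likely culprit (stated as a guess, but this is the standard gotcha): SSIS expressions will not implicitly convert an Int32 for string concatenation, so the variable has to be cast, e.g.:

    "SELECT ... FROM mytable WHERE [Entry No_] > " + (DT_WSTR, 12) @[User::LastJobLedgerEntryID]

with the column list as in the original expression. Also make sure EvaluateAsExpression is set to True on whichever variable holds the whole statement.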
On my office PC I have a strange problem with Data Flow Tasks. When I check a package into Source Control (Team Foundation), I often end up with some empty Data Flow Tasks in the package... even if I haven't changed that Data Flow Task.
This is really frustrating, because when running the package everything appears to go well, but after a while you notice that some tables are empty.
Hey all, is there any explanation why just opening a data flow task causes a VSS check out?
The issue with this is that I have multiple developers, and when one wants to look at a data flow task in a package already checked out, it generates multiple VS errors.
Not big on the priority list, (not as big as the modal dialog issue) but still a pain.
In my SSIS Data Flow Task, I have a query that retrieves data based on a couple of date parameters. Is there a way to pass/use the variables defined in the SSIS package in the query?
(I am assigning values to those variables from C# code)
The query should look like this:
select ordernumber, customerid from salesorder
where statecode=3 and datefulfilled between @variable1 and @variable2
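This should be doable with the OLE DB Source's parameter support (a sketch; assumes an OLE DB connection, which uses ? placeholders):

    select ordernumber, customerid from salesorder
    where statecode = 3 and datefulfilled between ? and ?

Set Data access mode to "SQL command", then click Parameters and map Parameter0 and Parameter1 to User::variable1 and User::variable2. Since the values are assigned from C# code before execution, the mapping picks them up at run time.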
Using SSIS 2012 (within Visual Studio) on Windows 7.
Before allowing my Data Flow task to fire, I'd like to check the target table (OLE DB Destination) for a specific date value in a specific field. I've seen how the Lookup transformation is commonly used to check for dupes before inserting, but I'm not able to use that method because the date value I want to search the table for is contained in a global variable (let's say "MyVariableDate").
Is there any way to check for any records in the target table where Date1 = MyVariableDate (i.e. scanning the entire table for any occurrence of MyVariableDate in the Date1 field)?
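One way to sketch this (the variable User::MatchCount is hypothetical): put an Execute SQL Task before the data flow with a parameterized single-row query,

    select count(*) as MatchCount from TargetTable where Date1 = ?

map parameter 0 to User::MyVariableDate, set the Result Set to Single row and assign it to User::MatchCount, then gate the data flow with a precedence constraint whose expression is e.g. @[User::MatchCount] == 0 (or > 0, depending on which case should allow the load).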
I'm having trouble with a Script Component in a data flow task. I have code that does a SqlCommand.ExecuteReader() call that throws an 'Object reference not set to an instance of an object' error. Thing is, the SqlCommand.ExecuteReader() call is already inside a Try..Catch block. Essentially I have two questions regarding this error:
a) Why doesn't my Catch block catch the exception?
b) I've made sure that my SqlCommand object and the SqlConnection property that it uses are properly instantiated, and the query is correct. Any ideas on why it is throwing that exception?
I have a huge amount of data that I am loading from Excel into a database table. After loading about 80 percent of the data I get an error. My package failed; it has lots of transformations and takes around 6 hours to process completely, so I don't want it to reload from the start. If I run it again, it should start from the next record after the one where the error occurred.
I have three control flow tasks that are executing in consecutive order. Tasks 1 and 3 will always execute, but sometimes (based on an expression) task 2 will not. I would like to use precedence constraints in such a way that task 3 will execute regardless of whether task 2 executes, but in the event task 2 does execute, task 3 will only execute AFTER task 2 completes. Is there a way to accomplish this without setting the disabled property of task 2 at runtime?
I have set up a Conditional Split task which I want to use with a flat file data source. The flat file consists of multiple rows of data where the first column is an ID. The conditional split is based on the first column value.
What I'd like to know is: once the Conditional Split splits the data, can the output be transformed? E.g. if one of the values coming from the flat file needs to be either split into two values or passed into a stored procedure for manipulation, can this be done?
I have to load a flat file to a table, however there are some rows that I need to remove before loading to the table, see below the file structure.
I have to remove both the blank row and the "**** Federal Do Not Call Registrants ****" row. I was wondering if the Conditional Split transformation is the right tool to do this; if not, what would you guys recommend?
1112223344|111|222|00000
2223334455|222|333|00000

**** Federal Do Not Call Registrants ****
1112223344|111|222|99999
2223334455|222|333|99999
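Conditional Split can do it once the rows are in the pipeline. A sketch, assuming the file is read as a single column named [Column 0] so the banner and blank lines don't break the pipe-delimited parse; send matching rows to a discard output with an expression like:

    TRIM([Column 0]) == "" || FINDSTRING([Column 0], "Federal Do Not Call Registrants", 1) > 0

and split the remaining rows on | downstream (e.g. with SUBSTRING/FINDSTRING in a Derived Column, or a Script Component). If the file is parsed as delimited columns directly, the blank/banner rows may error out in the source before the split ever sees them, which is why the single-column read is part of the sketch.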
Hi, I am developing a database for my company in SQL Server 2000 and I have a few problems. Firstly:
I have a customers table and orders table in my DB:
Customers                                Orders
---------                                ------
CustID (primary key) ----------------<   CustID
.                          ^             ProductID
.                          |             Quantity
.                          |             .
.                          |             .
etc.                       |             etc.
                           |
              relationship (one to many)
What I want to do is:
1) To be able to delete a customer and have SQL Server automatically delete all the orders that this customer made from the Orders table.
2) If for some reason the CustID changes, SQL Server should be able to automatically update the necessary fields with the new values in the Orders table.
Finally, 3) I want to be able to insert a new customer that has an order and update both the Customers table and the Orders table automatically. E.g.:
CustID   Name   Address   ProductID   Quantity   etc.
------   ----   -------   ---------   --------   ----
10-003   John   London    33-25       2          ...
Such an insert should automatically add the following entries in the two tables:
Customers            Orders
---------            ------
CustID (10-003)      CustID (10-003)
Name (John)          ProductID (33-25)
Address (London)     Quantity (2)
.                    .
.                    .
etc.                 etc.
A friend of mine told me that this can be done using foreign key constraints in SQL Server, but I do not know what to do.
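For 1) and 2), a cascading foreign key is indeed the standard answer, and SQL Server 2000 supports both cascade options. A sketch using the table names above:

    ALTER TABLE Orders
    ADD CONSTRAINT FK_Orders_Customers
        FOREIGN KEY (CustID) REFERENCES Customers (CustID)
        ON DELETE CASCADE
        ON UPDATE CASCADE

For 3), a constraint cannot create rows for you: the usual pattern is to insert into Customers first and then into Orders (often wrapped in a stored procedure or a transaction), since the foreign key requires the parent row to exist.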