Can We Format The Data In Dataflow Tasks?

Dec 27, 2007



Hi all,
I have a question:
can we modify / format the data in a dataflow task?
Requirement

1. Excel file source
2. Convert data "Dec-07" (Mon-YY format) into "Dec-2007" (Mon-YYYY format)
3. OLEDB Destination

How do I do the second step?
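
A Derived Column or a Script Component placed between the Excel source and the OLE DB destination can handle step 2. As a rough sketch of the conversion logic only (the helper and the column handling below are illustrative, not from the post, and SSIS 2005 Script Components are written in VB.NET, so the C# is just to show the idea):

using System;
using System.Globalization;

// Hypothetical helper showing the "Dec-07" -> "Dec-2007" conversion.
// In a real Script Component this logic would live in ProcessInputRow,
// reading and writing the row's column instead of a plain string.
static class MonthYearFormatter
{
    public static string Expand(string monthYearYY)
    {
        // Parse "Dec-07" with the Mon-YY pattern (two-digit years 00-29 map to 2000-2029 by default)...
        DateTime parsed = DateTime.ParseExact(monthYearYY, "MMM-yy", CultureInfo.InvariantCulture);

        // ...then re-emit it as Mon-YYYY, e.g. "Dec-2007".
        return parsed.ToString("MMM-yyyy", CultureInfo.InvariantCulture);
    }
}

If the column is plain text and every year is 20xx, a Derived Column expression that splits the string and concatenates "20" into the year part would avoid scripting altogether.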

View 8 Replies



Big SSIS Package (200 Dataflow Tasks)

Mar 29, 2008

Thanks in advance for reading this thread.



I have developed a big SSIS package to extract data from flat files (200+ Data Flows).

The situation is the following: inside the SSIS package there are a lot of validations before extracting and loading the flat files. I'm running these validations in parallel, so that when a file arrives, it enters the "validation process" and extraction of the file starts.

When I run the SSIS package from BIDS it works the way I designed it, but when I run it on the server, the tables that are loaded through the process are only "available" when the SSIS package ends. It is imperative that, throughout the process, when a table receives new data it becomes ready, and not just become available when the SSIS package finishes.

I have attached a .jpeg.

It is important for the tables to be available so that the stored procedures (outside the SSIS package) that depend on some of the tables can start working before the SSIS package ends.


Thanks in Advance.

View 4 Replies View Related

Changing Dataflow Tasks Properties By Coding

Mar 27, 2007

Hi,
Is it possible to change a Dataflow task's properties with VB code? I've managed to change the properties of the Control Flow tasks with VB code, as I want to create a generic dataflow that will change every time I run it.
Cheers
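
The data flow's internals are reachable from the runtime API by casting the task's InnerObject to the pipeline wrapper (MainPipe). A rough sketch in C# (VB is analogous); the package path, task name, component name and query below are placeholders, not taken from the post:

using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

class ChangeDataFlowProperty
{
    static void Main()
    {
        Application app = new Application();
        // Placeholder path and names; substitute your own.
        Package pkg = app.LoadPackage(@"C:\Packages\MyPackage.dtsx", null);

        // A Data Flow task is a TaskHost whose InnerObject is the pipeline.
        TaskHost host = (TaskHost)pkg.Executables["Data Flow Task"];
        MainPipe pipe = (MainPipe)host.InnerObject;

        // Walk the pipeline components and change a property on one of them,
        // for example the source query of an OLE DB Source.
        foreach (IDTSComponentMetaData90 component in pipe.ComponentMetaDataCollection)
        {
            if (component.Name == "OLE DB Source")
            {
                CManagedComponentWrapper designTime = component.Instantiate();
                designTime.SetComponentProperty("SqlCommand", "SELECT * FROM dbo.SomeTable");
            }
        }

        app.SaveToXml(@"C:\Packages\MyPackage.dtsx", pkg, null);
    }
}

Depending on what you change, you may also need to re-acquire the component's metadata (AcquireConnections / ReinitializeMetaData on the design-time wrapper) so the downstream column metadata stays in sync.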

View 6 Replies View Related

SSIS Package + 200 DataFlow Tasks & T-SQL Validations

Mar 30, 2008

Thanks in advance for reading this thread.

I have developed a big SSIS package to extract data from flat files (200+ Data Flows).

The situation is the following: inside the SSIS package there are a lot of validations before extracting and loading the flat files. I'm running these validations in parallel, so that when a file arrives, it enters the "validation process" and extraction of the file starts.

When I run the SSIS package from BIDS it works the way I designed it, but when I run it on the server, the tables that are loaded through the process are only "available" when the SSIS package ends. It is imperative that, throughout the process, when a table receives new data it becomes ready, and not just become available when the SSIS package finishes.

I have attached a .jpeg.

It is important for the tables to be available so that the stored procedures (outside the SSIS package) that depend on some of the tables can start working before the SSIS package ends.


Thanks in Advance.

View 5 Replies View Related

Setup 5 Dataflow Tasks In Series Or Parallel?

Jan 19, 2008

I've created a package that currently uses 5 DataFlow tasks connected in series to get data from 5 different files and place that information into 5 different temp tables. Each Dataflow task contains only an OLE DB source, a row count and an OLE DB destination. My question is: is it normal practice to keep each of these separate, or should I put them all into a single DataFlow? The package should only continue if all five dataflow tasks complete successfully.

View 7 Replies View Related

Special Error Handling In Dataflow Transformation Tasks

Jan 22, 2008

Hello,


How would you log errors during a massive row load? I'm having problems because every row error (caused by casting, format, or lookup failures) in a transformation task is redirected to a text file as a log. This is OK when only one error exists per row, but when I have two errors in the same row, detected by different transformation tasks, only the first one is reported to the text file. I have to wait for the second load, after I correct the first error, to find the second one. I need to validate as many errors as exist per row in the same load.

Which component or which strategy can I use in an SSIS package to achieve this?

thanks

View 1 Replies View Related

Programmatically Creating Dataflow Tasks Requires Assembly References

Jan 17, 2007

Hi,

I am creating a dataflow task using the following:

Imports Microsoft.SqlServer.Dts.Runtime

Imports Microsoft.SqlServer.Dts.Pipeline

Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper

These refer to Microsoft.SqlServer.DTSPipelineWrap.dll and Microsoft.SQLServer.DTSRuntimeWrap.dll. While these assemblies were already there on my dev machine, I don't find these files in the production environment for SQL Server 2005. I am referencing these assemblies from the following path on my local machine: C:\Program Files\Microsoft SQL Server\90\SDK\Assemblies.

How do I install these assemblies in the production environment? Of course, one option is to copy them and then put them in the GAC through a script. Why don't they get installed during the installation of SQL Server 2005? Are these assemblies dependent on SP1?

Thanks

Mohit

View 4 Replies View Related

Execute A Query Inside Dataflow And Use The Fields Returned To Continue Dataflow... How?

Apr 17, 2007

Dear Friends,

I need to execute a SQL query inside a dataflow (not in the control flow) and need the records returned to continue the dataflow. In my case I can't use a Lookup or an OLE DB Command, or anything else...

I need to execute a query and need the records in the dataflow; with the OLE DB Command I can't see the fields returned. :-(

How can I do it? Using a script? Can I use a Script Component that receives 2 parameters as input and gives me the fields returned from the query as output?

Thanks!!
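
A Script Component can do this: configured as a source (or as a transformation with an asynchronous output), its script can open its own connection, run the query with the two parameters, and add the returned fields as rows on its output. A loose sketch of the idea; the connection string, query, buffer, column and variable names below are placeholders generated from whatever you name your output and variables, and SSIS 2005 scripts are VB.NET, so the C# is only to show the shape:

using System.Data.SqlClient;

public class ScriptMain : UserComponent
{
    public override void CreateNewOutputRows()
    {
        // The two input parameters arrive as package variables listed in ReadOnlyVariables.
        int keyA = Variables.ParamA;          // placeholder variable names
        string keyB = Variables.ParamB;

        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Col1, Col2 FROM dbo.SomeTable WHERE KeyA = @a AND KeyB = @b", conn))
        {
            cmd.Parameters.AddWithValue("@a", keyA);
            cmd.Parameters.AddWithValue("@b", keyB);
            conn.Open();

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Output0Buffer and its column setters are generated from the
                    // output columns you define on the component.
                    Output0Buffer.AddRow();
                    Output0Buffer.Col1 = reader.GetInt32(0);
                    Output0Buffer.Col2 = reader.GetString(1);
                }
            }
        }
    }
}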

View 38 Replies View Related

Make 2 Data Sources In Dataflow

Oct 14, 2007



Hi,
I want to create a package with two data sources.
The package should know to take data from the right DS: an IF check on getdate() decides; if it returns the end-of-week day it will take from DS1, else from DS2.

Any suggestion on how to do it?
Thanks.
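
One approach is to let a Script Task set a Boolean package variable from the current date, then put expressions on the two precedence constraints (or paths) so only the matching data source runs. A sketch of the script part; the variable name and the reading of "end weekday" as Friday are assumptions of mine, not from the post, and SSIS 2005 Script Tasks use VB.NET, so the C# is only illustrative:

// Inside the Script Task's entry point; "User::UseDS1" must be listed
// in the task's ReadWriteVariables.
public void Main()
{
    // Assumption: "end weekday" means Friday; adjust to your own rule,
    // or compute the flag from getdate() in an Execute SQL Task instead.
    bool isEndOfWeek = System.DateTime.Today.DayOfWeek == System.DayOfWeek.Friday;

    Dts.Variables["User::UseDS1"].Value = isEndOfWeek;
    Dts.TaskResult = (int)ScriptResults.Success;
}

The precedence constraint leading to the DS1 branch would then use an expression such as @[User::UseDS1] == TRUE, and the DS2 branch the negation.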

View 1 Replies View Related

I Want To Convert Xml Data Format Into The General Data Format

Jun 9, 2006

Hi sir, I have a table hh. It has two columns: one is hhno, which is integer datatype; the other is hhdoc, which is xml datatype, like:

hh table
hhno    hhdoc
-------------------------------------------
100     <suresh>sfjfjfjfjf</suresh> ...
101     <ramesh>hhfhfhf</ramesh> ...

How do I convert the xml data format into the general data format? Please help me with examples.

View 1 Replies View Related

Pass Data Between Two Data Flow Tasks

Jul 16, 2007

Hi Guys,



I have yet another question. How can I pass data between 2 data flow tasks? I'm trying to get some data from one SQL Server, and then I want to pass this whole bunch of data to another data flow task which is going to get some data from a second SQL Server. I would probably combine them using a Union All and then save that as a transaction to a third SQL Server?

Information regarding how to pass the data with some detailed discussion would be fine with me.



thanks

Gemma

View 5 Replies View Related

DataFlow Problem Moving Data From SQLServer To Oracle

Feb 26, 2008

Hi pals,

I have a small problem in a Dataflow task.

Basically I have two tables (similar to my current requirement), one in SQL Server 2005 and another in Oracle.

SQL Table
create table t1
( c1 numeric(6),
c2 numeric(6),
c3 varchar(50),
year numeric(4),
district varchar(10)
)


Oracle Table
create table app_t1
( c1 number(6),
c2 number(6),
c3 varchar2(50),
year number(4),
district varchar2(10),
constraint cpk primary key (c1,year,district)
)

I am using 2 global variables for passing values to the c4 and c5 columns in the Oracle table.
The variable names are g_year and g_district.
I am using a dataflow task to move data from SQL Server 2005 to an Oracle 9i database.
I am able to load the data from SQL Server to Oracle.
Up to here it is fine.
But the requirement is: if I get the same data, i.e. the same district and the same year, then we need to delete the data
in the Oracle table with
"delete from app_t1 where year = ? and district = ?"

Immediately after, I need to issue a commit statement.

Thereafter I need to load the data for that particular district and year.


To accomplish this requirement I am using:

An Execute SQL Task to delete the data from Oracle
An Execute SQL Task to COMMIT the transaction
Finally, a dataflow task to load the data from SQL Server to Oracle.

But I am getting an error:
"Value violated the integrity constraints for a column or table."

I think the deletion operation is not getting committed.

So what I did was connect to Oracle, delete the data using the above DELETE statement,
then come back to the SSIS package and run the package once again; this time it is successful.

Why is that? Any thoughts on this?

Any help would be greatly appreciated.
Thanks!

View 13 Replies View Related

Dynamic Excel Destination Depending On Dataflow Data

Jul 10, 2007

I created a data flow with complicated SQL. There is a "type" field in the output columns.

I would like to create an Excel file for each "type" value.

E.g. if there are 3 "type" values (A, B, C), I would like to create 3 Excel files to store type A, type B, and type C data respectively.

Since the number of possible values of the "type" field varies, how can I make the XLS destination dynamic and move the correct type to the corresponding Excel file?

A Conditional Split has fixed conditions, so it is not suitable for a dynamic number of values.

A For Loop is not a good choice because I would need to run the complicated SQL many times.



Thanks.

View 1 Replies View Related

Tasks Import Data

Oct 4, 2007

Hello All,

When I try to import the following file,
www.triooo.be/Screenshots/TestText_NL.xls

I get the following error:


Messages
* Error 0xc020901c: Data Flow Task: There was an error with output column "SectieTekst NL" (75) on output "Excel Source Output" (9). The column status returned was: "Text was truncated or one or more characters had no match in the target code page.".
(SQL Server Import and Export Wizard)

I thought it was a problem with field length,
but when I set the fields (nvarchar) from 250 to max it still doesn't work.

Can someone please try this and give me a possible answer?

Regards Bart

View 2 Replies View Related

DTS: Data Driven Query Tasks

Jul 1, 1999

I would like to use a Data Driven Query Task to conditionally update/insert some data from a source table into a dest. table, but I can't find any decent doc or examples on this. Is there any place that explains in gory detail how to use these DTS tasks?

Thanks,
Patrick

View 1 Replies View Related

Cannot Load Data Flow Tasks

Aug 3, 2007

When I copy an SSIS package I have been developing from my laptop to my desktop with Windows File Sharing (shared folders) across a home network, the moved package fails to load properly. I can see the Control Flow tasks but not the Data Flow tasks - they simply disappear! I have updated the Connection Managers to point to the new machine, and tested them (OK).

It's the same as this issue posted here


And Here


Creating an empty package and clicking to create a new data flow causes this error:

TITLE: Microsoft Visual Studio
------------------------------

Failed to create the task.

------------------------------
ADDITIONAL INFORMATION:

The designer could not be initialized. (Microsoft.DataTransformationServices.Design)



Seems the only way is to uninstall and reinstall Client Components.

How do I uninstall and reinstall Client Components? When I try to do that, SQL Server Setup tells me "None of the selected features can be installed or upgraded. Setup cannot proceed since there is no effective change being made to the machine...."

I had selected "Workstation components, Books online and Development tools". This is already installed, so how can I reinstall it if I get the message above, which does not let me proceed?


View 2 Replies View Related

Why Doesn't A Dataflow Component Appear In The List Of SSIS Data Flow Items?

Sep 5, 2007

Hi,
I developed an SSIS Data Flow component and placed the DLL file into the DTS\PipelineComponents folder. Then I registered the component in the GAC.

But when I try to add the required component to the toolbox, it is not in the list of SSIS Data Flow Items. What does that mean?

Thanks in advance.

View 3 Replies View Related

SSIS - Data Loading Tasks - Fallouts?

Oct 26, 2007



Hello -- I primarily use SSIS for data loading tasks from flat files into a SQL Server DB.

I usually have the loading fallouts routed to a FALLOUT table with SSIS error codes.

However, the package execution stops if there are a bunch of fallouts, like more than 20/30.

Is there any place in the package where I can specify a bigger number, rather than erroring out at more than 20/30?


Thank You!

View 1 Replies View Related

Re : Enabling / Disabling Data Flow Tasks

Apr 17, 2006

Hello,



I have created around 10 separate packages for our application data load. Now I am planning to create a master package (or wrapper package) which will execute all 10 packages (through Execute Package tasks). Then I have a job which executes the master package at a given date and time.

Question: How can I enable/disable execution of each package within the master package depending upon a flag variable? The reason I need this mechanism is that if flag = 0 then I don't want any of the 10 packages within the master package to execute, and if flag = 1 then the master package execution should begin and subsequently execute all packages within it.





Thank you

Jatin Shah

View 3 Replies View Related

Maximum Data Flow Tasks Execution

Sep 10, 2006

Hi guys,

I have a foreach loop that has about 20 data flow tasks (same database connections but different extractions), but I notice that when I execute the project it only runs 4 data flow tasks at a time.

I know there is an option on each data flow to set the "Engine Threads", but is there a way to set the threads in a foreach loop, or for the whole project, so it will execute all the data flow tasks in one go for each loop?

Please help.

View 3 Replies View Related

Asynchronous Data Flow Tasks: How To Run More Than 4 At A Time

Sep 25, 2006

Hi guys,

I have a for each loop and it has about 20 data flow tasks (simple data extractions). I notice when I run the package it only runs up to 4 data flow tasks at a time; the others have to wait till one of the first 4 flows finishes.

I was wondering if there's a way to change the limit on how many data flow tasks can run at a time. Is there a property somewhere?

I know this will be stressful for the server, but the server is well equipped with CPU power and memory, so performance will not be an issue.

Any thoughts?
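
The cap of four usually comes from the package-level MaxConcurrentExecutables property rather than EngineThreads: its default of -1 means "number of processors plus 2", which on a 2-processor box works out to 4 concurrent tasks. It can be raised in the package's Properties window, or in code if you build packages programmatically; a small sketch with a placeholder path:

using Microsoft.SqlServer.Dts.Runtime;

class BumpConcurrency
{
    static void Main()
    {
        Application app = new Application();
        Package pkg = app.LoadPackage(@"C:\Packages\Extract.dtsx", null);  // placeholder path

        // Default is -1 (processors + 2); raise it so more of the
        // data flow tasks are eligible to run at the same time.
        pkg.MaxConcurrentExecutables = 20;

        app.SaveToXml(@"C:\Packages\Extract.dtsx", pkg, null);
    }
}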

View 1 Replies View Related

Transfer Data From Access (ABC.mdb) Source To OLEDB Destination Using Dataflow Task

Feb 12, 2008



Hi,
I'm totally new to SSIS programming.
I want to transfer an ABC.mdb table to my oit_imp_temp table.
Can you send me a reference, with code that works for you, for the above requirements?
Thanks a lot!



View 3 Replies View Related

Adding Numeric Data To A Dataflow In A Script Component -> Invalid Number In OLE DB Command Transformation

Jun 9, 2006

Hi there,

This seems like a bug to me. Or does anyone have a logical explanation that escapes me?

When, in SSIS Designer version 9.00.1399.00, I add output columns (numeric 4,0) to a Script Component and fill them with valid numeric data in the script, I get a database error 'invalid number' when I use these columns in an OLE DB Command transformation. This error message disappears when I replace those columns with a data conversion to the datatype they originally have.

Derived Column Name: STATUS_DEF
Derived Column: Replace 'STATUS_DEF'
Expression: (DT_NUMERIC,4,0)STATUS_DEF

Maybe this info is useful for somebody else who can't figure out whatever he's doing wrong.



Paul Baudouin









View 1 Replies View Related

Integration Services :: Dataflow Task Read CSV File And Insert Data To Table

Apr 29, 2015

I have a dataflow task in a For Each Loop container in the control flow of an SSIS package. This For Each Loop container reads the CSV files from the specified location one by one, and populates a variable with the current file name. Note, the tables where I would like to push the data from each CSV file also have the same names as the CSV file names. On the dataflow task, I have a Flat File component as a source; this component uses the above variable to read the data of a particular file.

Now, here my question comes: how can I move the data to the destination, a SQL table, using the same variable name? I've tried to set up the OLE DB destination component dynamically but it executes well only the first time. It does not change the mappings as per the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns in it. These files need to be migrated to SQL tables in the optimum way.

Which is the best way to set up the Dataflow task for this requirement? Also, I cannot use a Bulk Insert task here as we would like to keep a log of corrupted rows.

View 10 Replies View Related


Opening A Data Flow Task Forces A VSS Check Out

Mar 9, 2007

Hey all, is there any explanation why just opening a data flow task causes a VSS check out?

The issue with this is that I have multiple developers, and when one wants to look at a data flow task in a package already checked out, it generates multiple VS errors.

Not big on the priority list (not as big as the modal dialog issue), but still a pain.



Thanks!

BobP

View 4 Replies View Related

Multiple Data Flow Tasks In One SSIS Package

Jul 25, 2007

What are the advantages and disadvantages of having multiple data flow tasks in one SSIS package?



Is this a good idea at all considering the workflow may be similar now but may change in the future? Should it be left as one data flow per package?

View 1 Replies View Related

Creating A Parameter File For The Data Flow Tasks - Very Urgent

Jan 2, 2008



Hi,

I need to parameterize some values in the data flow so that I can change the values directly in a parameter file and re-run the data flow with the new value passed in the parameter. This would make it easy for others who do not know the internals of the data flow task, or where to change the variable/parameter.

How can this be accomplished? I want the data flow task to refer to this file before it starts executing and pick the appropriate value from the file.

Or is there any better way to accomplish what I want to do here in SSIS?

Thanks for your help, folks!

View 2 Replies View Related

Integration Services :: Using Sensitive Project Parameters In Data Flow Tasks

Feb 11, 2014

I have a requirement to read an encrypted file as a data source. I am not allowed to save an unencrypted text file version on disc at any time, for any length of time, therefore I created a custom source component that reads an encrypted CSV file, decrypts it, and then passes each row of data to the pipeline and ultimately to an OLE DB destination. Basically it is just a text file reader with an added class that adds functionality to decrypt the file before the component sets columns or reads rows.

The custom component, “Encrypted File Source”, has a custom property “encryptionkey” with the encryption required flag set to true (code below) and is declared as eligible to be set in the expressions.

IDTSCustomProperty100 EncryptionKey = ComponentMetaData.CustomPropertyCollection.New();
EncryptionKey.Name = "EncryptionKey";
EncryptionKey.Description = "Secure String key value to decrypt the file";
EncryptionKey.Value = string.Empty;
EncryptionKey.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;
EncryptionKey.EncryptionRequired = true;

I want to be able to set the password for the encrypted file in the SQL Agent job that executes the SSIS project. This means I have an environment with a variable, "DataPassword", that is set to sensitive. It maps to a project parameter in the SQL Agent job that is also set to sensitive. And now I want to access that sensitive project parameter inside my data flow, specifically in the Encrypted File Source task that I created, and set my EncryptionKey to that project parameter.

The problem is that SSIS says:

"expression cannot be evaluated. ... The expression will not be evaluated because it contains sensitive parameter value "$Project::DataFilePassword". Verify that the expression is used properly and that it protects sensitive information."
(Microsoft.DataTransformationServices.Controls)

[Code] ....

I am using SQL Server 2012, on a Windows 7 box with VS2010 Premium.

View 4 Replies View Related

Programmatically Iterating Tasks/components In The Data Flow Portion Of A Package.

Mar 6, 2007

HI All,

In several threads there has been discussion regarding adding connection managers to a package's data flow, etc. My challenge is that I have a large solution that contains many packages, and I need to change the connection manager linked to the data flow in all of the packages. When the solution was initially designed, data sources were used, and it has become a tedious maintenance issue to keep those in sync. We want to use a standard OLE DB connection manager, but adding a connection manager to each package and editing the corresponding data flow tasks in each package to use that new connection manager is a daunting task. I've coded a .NET module to access the packages, remove the old connection manager (data source) and add the new OLE DB connection manager. However, as I traverse the objects in the package hierarchy, when I come to the data flow object, the InnerObject is not a DTS object but rather a COM object. I can't seem to find any documentation or examples showing how to iterate the tasks within a data flow and change the connection manager. If you have any information, that would be quite helpful. If you reply with a code sample, if you would be so kind as to relate it to one of the sample packages provided with SSIS so I can run it, that would be great.

Thank you.

Steve.
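
For what it's worth, the COM object you are hitting is the pipeline itself: casting the TaskHost's InnerObject to MainPipe (from Microsoft.SqlServer.Dts.Pipeline.Wrapper) exposes its components and their connections. A hedged sketch of rewiring every component's runtime connections to a new OLE DB connection manager; the path, the names and the assumption that one connection manager fits all components are mine, so adapt it to your own packages or to one of the samples shipped with SSIS:

using Microsoft.SqlServer.Dts.Pipeline;          // DtsConvert (namespace varies slightly by version)
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;  // MainPipe, IDTSComponentMetaData90
using Microsoft.SqlServer.Dts.Runtime;

class RewireDataFlowConnections
{
    static void Main()
    {
        Application app = new Application();
        Package pkg = app.LoadPackage(@"C:\Packages\Sample.dtsx", null);   // placeholder path

        // The replacement OLE DB connection manager, added at package scope.
        ConnectionManager newCm = pkg.Connections.Add("OLEDB");
        newCm.Name = "StandardOleDb";
        newCm.ConnectionString =
            "Provider=SQLNCLI;Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;";

        foreach (Executable exec in pkg.Executables)
        {
            TaskHost host = exec as TaskHost;
            MainPipe pipe = (host == null) ? null : host.InnerObject as MainPipe;
            if (pipe == null) continue;    // not a Data Flow task

            // Point each component's runtime connections at the new connection manager.
            foreach (IDTSComponentMetaData90 component in pipe.ComponentMetaDataCollection)
            {
                foreach (IDTSRuntimeConnection90 conn in component.RuntimeConnectionCollection)
                {
                    conn.ConnectionManagerID = newCm.ID;
                    // SSIS 2005 API; later versions renamed this to DtsConvert.GetExtendedInterface.
                    conn.ConnectionManager = DtsConvert.ToConnectionManager90(newCm);
                }
            }
        }

        app.SaveToXml(@"C:\Packages\Sample.dtsx", pkg, null);
    }
}

Removing the obsolete data-source connection manager, or matching components by the connection they currently reference rather than rewiring everything, would follow the same pattern.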

View 1 Replies View Related

What Data Mining Tasks Can Be Automated And Scheduled Via Integration Services Packages?

Jun 14, 2006

Hi, all here,

Would any expert here please give me guidance about which Data Mining tasks can be automated and scheduled via Integration Services packages? Also, if we automate the tasks, can we also automatically save the results of the tasks somewhere? For example, if we automate assessing the accuracy of a mining model, then we want to know the mining model accuracy later, so we need to save all the results from the automated actions. Is it possible to do this?

Thanks a lot in advance for any guidance and help for this.

With best regards,

Yours sincerely,

View 3 Replies View Related

Urgent. Output Columns Are Not Appearing When I Use OLEDB Data Source With An Oracle Stored Procedure In Dataflow Task

Nov 12, 2007

I am using an Execute SQL task to run a stored procedure in an Oracle database which returns a resultset. This works. Now I need to send the output to a destination table in a SQL database. Should I use a For Each loop to pick up the resultset and insert it into the destination one row at a time (which I don't think is a great idea), or is there a better way to accomplish this (in a data flow task)?

When I use a dataflow task instead of an Execute SQL task, the main issue is that I am not able to see the output columns when I execute an Oracle stored procedure, but when I look at the preview I can see the resultset. I can see the output columns for a SQL Server stored procedure, however.

View 9 Replies View Related

Integration Services :: Multiple Data Flow Tasks Within Foreach Loop Container

Nov 3, 2015

Suppose I have a "Foreach Loop Container" that iterates over a list. Is it possible to execute different data flow tasks based on the input?

Example: the list contains elements L1, L2 & L3.

The Foreach Loop Container checks the input. If it's L1 then it should execute DF Task1, if L2 then execute DF Task2, and similarly for L3.

Is it possible to achieve this?

View 4 Replies View Related






