DTS Error During Data Transformation

Nov 8, 2000

I tried transforming data from one server to another using DTS, and I got the error below:
-----------------------------------------------------------------------------
Details: Error Source: Pump Data Step
Details: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL server unless your destination server is per user licensing mode.

Facility: 4, Severity: 1, Code: 1176, HRESULT: 0x80040498

Desc: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL server unless your destination server is per user licensing mode.

Source : Microsoft Data Transformation Services(DTS) Package

Source : Microsoft Data Transformation Services(DTS) Package
Code: 0x80040428
Description: Package failed because step 'Pump data step' failed.
Error Message: IDispatch error #552
--------------------------------------------------------------------------

This is the full description of the error dialog...

Please suggest some solutions.

Thanks in advance,
Kamalesh D

View 2 Replies



Error Source : Microsoft Data Transformation Services (DTS) Package Error Description : Error Accessing Windows Event Log

Dec 13, 2007



Hi,

I am running a DTS package in SQL Server 2005 Management Studio, under Management > Legacy > Data Transformation Services.

Once the DTS package has run, I get this error message: "Error Source : Microsoft Data Transformation Services (DTS) Package Error Description : Error accessing Windows Event Log."

Please help me

thanks in advance

Srinivas



View 1 Replies View Related

Transfer Data To Excel 2007 By Using SQL Server Data Transformation Services

Jun 11, 2007

My vendor requires data to be sent in Excel format. Some of my tables have more than 65,536 rows, so I need to use Excel 2007 (which allows a maximum of 1,048,576). Right now my data sits in SQL 2000. I am using MS SQL Enterprise Manager 8.0 to prepare the data. Is there some kind of add-on or selection I am missing to use DTS to export from SQL to Excel 2007? Thanks in advance.

View 3 Replies View Related

Does A Synchronous Transformation Process All Rows In A Buffer Before Outputting To Next Transformation?

Jun 5, 2006

Hi,

If you have two synchronous transformation components and the input of the second is connected to the output of the first, does the first transformation process (loop through) all rows in the buffer before outputting those rows to the second transformation? Or does the first transformation output each individual row to the second transformation as soon as it has finished processing it?

Thanks in advance,
Lawrie.

View 5 Replies View Related

Integration Services :: Difference Between Audit Transformation And Row-count Transformation?

Apr 22, 2015

Can someone tell me the difference between the Audit transformation and the Row Count transformation?

Both the Audit and Row Count transformations can expose package environment information through variables.

The only difference I am finding is that Row Count returns the count of rows it processes.

Apart from this, is there any other difference?

Also, in what scenario would I need to use the Audit transformation?

View 3 Replies View Related

Audit Transformation Error

Jun 13, 2006

Hello all,

I am getting an error that I am stuck on. I dragged in the Audit transformation for error handling (Flat File Source); however, the Audit transformation is returning an error for one of its standard columns, "User name". I tried to change the length or take out this column, but the error does not go away. Below is the error message. Your help is much appreciated!

-Lawrence

"The length for "output column "User name" (195)" is not valid for the specified lineage item."

"component "Audit" (180)" failed validation and returned validation status "VS_ISBROKEN".

View 7 Replies View Related

Data Transformation

Sep 14, 2000

We are transferring data between AS/400 and SQL Server 7.0 using DTS. Some of these transfers may need to be very close to real time. It doesn't seem like a continuously running job is the best solution for that.

Do you know any tools or utilities that can help us to move the data?

Thank you,
Anastasia.

View 2 Replies View Related

Data Transformation

Feb 19, 2003

I have something like this:

select * from accounts

name type amount
==== ==== ======
mary saving 123.00
mary chequing 246.00
mary investment 135.00
john saving 678.00
john chequing 987.00
john investment 0.00

What should I do to present the data in the following format?

name saving cheq investment
==== ====== ==== ==========
mary 123.00 246.00 135.00
john 678.00 987.00 0.00
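
For reference, one minimal T-SQL sketch that would produce that shape, assuming the table really is called accounts and the type values are exactly 'saving', 'chequing', and 'investment':

SELECT name,
       SUM(CASE WHEN type = 'saving'     THEN amount ELSE 0 END) AS saving,
       SUM(CASE WHEN type = 'chequing'   THEN amount ELSE 0 END) AS chequing,
       SUM(CASE WHEN type = 'investment' THEN amount ELSE 0 END) AS investment
FROM   accounts
GROUP BY name

Each CASE picks out one account type per row, and the GROUP BY collapses the rows for each name into a single output row.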


Thanks.

View 3 Replies View Related

Data Transformation

Jan 12, 2008

Hi, newbie here with a simple (maybe?) question.

I have an Access database that I imported into SQL Server 2000, and that worked great, but now I have to get it into 2005. My question is: how can I get the tables and all of the data in them into a SQL script that I can run on the 2005 server?

SQL 2000 is on my dev server and I have all the tools (Enterprise Manager, Query Analyzer, etc.), but the 2005 server is GoDaddy's and they only have the basic web interface. I can run SQL files and create databases and tables, but that's about it.

View 2 Replies View Related

Error Output For A Destination Transformation

Jun 16, 2006

I am developing a custom destination component and I have encountered a few areas where there seems to be a lack of helpful documentation and examples.

1. I have not been able to find any information on or examples of creating custom destinations with an error output. The OLE DB Destination has an error output so I investigated the input and error output properties in the advanced editor and found that the OLE DB Destination error output is synchronous with the input (its SynchronousInputID matches the input's ID) and has its ExclusionGroup value set to 1. Using this information, I modeled my error output after the OLE DB Destination.

ProvideComponentProperties:
AddErrorOutput(ERROR_OUTPUT_NAME, input.ID, 1);

ProcessInput:
int errorOutputID = -1;
int errorOutputIndex = -1;
GetErrorOutputInfo(ref errorOutputID, ref errorOutputIndex);
...
buffer.DirectErrorRow(errorOutputID, 0, errorOutputIndex);

Checking the input and error output properties in the advanced editor for my custom destination component I find the following:
Input
-----
ID: 3515

Error Output
------------
ExclusionGroup: 1
ID: 3516
IsErrorOut: True
SynchronousInputID: 3515

Shortly after I start my SSIS package and it encounters an error row, I get the following exception:
[My Destination Adapter 1 [3512]] Error: System.ArgumentException: Value does not fall within the expected range. at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSBuffer90.DirectErrorRow(Int32 hRow, Int32 lOutputID, Int32 lErrorCode, Int32 lErrorColumn) at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.DirectErrorRow(Int32 outputID, Int32 errorCode, Int32 errorColumn) at MyDestination.ProcessInput(Int32 inputID, PipelineBuffer buffer) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)


2. My custom destination component is used for writing a file with a fixed schema. I followed the means by which source component examples add their output columns, but applied this to my external metadata columns. In my Validate() I check if the ExternalMetadataColumnCollection.Count == 0 and return DTSValidationStatus.VS_NEEDSNEWMETADATA; to force a call to ReinitializeMetaData(). In ReinitializeMetaData() I call a method that creates the input's external metadata columns that reflect my external data source.

This works fine except every time I add my custom destination component to a SSIS Package and go to edit the component I am greeted with a dialog box that states: "The component is not in a valid state. ... Do you want the component to fix these errors automatically?" Pressing the Yes button, I assume, makes the call to ReinitializeMetaData() and I have my external metadata columns. Where is the correct place to add the external metadata columns so the user does not have to take this extra step every time they add my component to their package?

View 5 Replies View Related

Derived Column Transformation - Error

Aug 24, 2007

Can I call one FUNCTION within another FUNCTION,

like SUBSTRING(CHECK_NO, 2, LEN(CHECK_NO) - 1)?


I am reading the Check_No "1234321" from a flat file. The file holds every value within double quotes, and the values are separated by commas.

Objective: I am trying to eliminate the double quotes using a Derived Column.

Strangely, the above FUNCTION works fine when I construct it as a SQL query.

Please help me. Thank you.

View 5 Replies View Related

Lookup Transformation Validation Error

May 13, 2006

below is the error message:

TITLE: Package Validation Error
------------------------------

Package Validation Error

------------------------------
ADDITIONAL INFORMATION:

Error at Data Flow ACH Validate File and Header Info [Lookup ACH Batch Number [506]]: input column "ID" (571) and reference column named "BANKBATCHNBR" have incompatible data types.

"Lookup ACH Batch Number" is the name of the lookup transformation. input column "ID" has a string data type. reference column "BANKBATCHNBR" has a varchar(50) data type. can someone please tell me how i should go about debugging this validation error? thanks.

View 1 Replies View Related

Error On Derived Column Transformation

May 20, 2008


Hi all. Given a date field called [Reading Date] and a time field called [Reading Time], I am attempting to use the following transformation on the fields to test for nulls and combine the data into a single field called [Reading Date/Time]:

(DT_WSTR)([Reading Date]== "") || ISNULL([Reading Date]) || (DT_WSTR)([Reading Time]== "") || ISNULL([Reading Time])? (DT_DBDATE)GETDATE() : (DT_DATE)(SUBSTRING((DT_WSTR,8)[Reading Date],5,2) + "/" + SUBSTRING((DT_WSTR,8)[Reading Date],7,2) + "/" + SUBSTRING((DT_WSTR,8)[Reading Date],1,4) + " " + SUBSTRING((DT_WSTR,8)[Reading Date],1,8))

I get the following error:

Error at Data Flow Task [Derived Column1[177]]: Attempt to parse the expression "(DT_WSTR)([Reading Date]== "") || ISNULL([Reading Date]) || (DT_WSTR)([Reading Time]== "") || ISNULL([Reading Time])? (DT_DBDATE)GETDATE() : (DT_DATE)(SUBSTRING((DT_WSTR,8)[Reading Date],5,2) + "/" + SUBSTRING((DT_WSTR,8)[Reading Date],7,2) + "/" + SUBSTRING((DT_WSTR,8)[Reading Date],1,4) + " " + SUBSTRING((DT_WSTR,8)[Reading Date],1,8))" failed. The expression might contain an invalid token, an incomplete token, or an invalid element. It might not be well-formed or might be missing part of a required element such as a parenthesis.

Here is a sample [Reading Date]:
05/07/08

Here is a sample [Reading Time]:
19:45:48

I need to be able to handle exceptions if one or both contain NULL or no data at all. Any suggestions on how to fix this?
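
Not a fix for the SSIS expression itself, but the intended null/blank handling is perhaps easier to see written as T-SQL first (a sketch only; dbo.Readings is a hypothetical staging table, and the CAST assumes the server's regional settings accept mm/dd/yy strings):

SELECT CASE
           WHEN [Reading Date] IS NULL OR [Reading Date] = ''
             OR [Reading Time] IS NULL OR [Reading Time] = ''
           THEN GETDATE()                                              -- default when either part is missing
           ELSE CAST([Reading Date] + ' ' + [Reading Time] AS datetime)
       END AS [Reading Date/Time]
FROM   dbo.Readings

The same structure (test both fields for NULL or empty, then concatenate and cast) is what the Derived Column expression needs to express.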

Thanks,
Jon

View 5 Replies View Related

Data Transformation Services

Jul 18, 2007

Hi. I was told that using DTS would allow me to schedule stored procedures to keep a SQL database up to date. For example, if a user registers but does not activate the registration, his details will be removed by a stored procedure which is scheduled to run every 24 hours. I used to use the global.asax file to fire the update: a file held the date of the last update, 24 hours was added to it, and then an SP was executed to delete the unwanted data.

I have tried to install DTS with no success. I am running the following:

Visual Web Studio Express
SQL 2005 Express (from SQLEXPR.exe), told to install all the extra components
SQLEXPR_Toolkit.exe, with all its options
SQLServer2005_DTS.MSI

When I go into the server using MS SQL Server Management Studio Express, I cannot see the Data Transformation Services node. I have also just installed server reports, which I had no problems installing. Can somebody please help me?
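
For the cleanup itself, no DTS is strictly required; a stored procedure along these lines (table and column names are made up for illustration) could do the delete, and since SQL Server 2005 Express has no SQL Agent, it would need to be kicked off by the application or by Windows Task Scheduler running sqlcmd:

CREATE PROCEDURE dbo.RemoveStaleRegistrations
AS
BEGIN
    -- Hypothetical table/columns: adjust to the real registration schema.
    DELETE FROM dbo.Registrations
    WHERE  IsActivated = 0
      AND  CreatedDate < DATEADD(hour, -24, GETDATE());
END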

View 2 Replies View Related

SSIS Data Transformation

Jan 31, 2008

I have begun using SSIS, and I am a little taken aback by its complexity, especially since I just want to do a simple data transformation as in DTS.
Are there any tutorials on data transformation in SSIS on the web or this forum? And what if I want to do a simple transformation from Access to SQL Server?

View 1 Replies View Related

Data Transformation - Hanging

Jul 20, 2005

I'm running a DTS package on SQL Server. The source is MS Access and the target is Oracle. On a "Drop Table" command the process just hangs. There are no foreign keys on the table. Several tables have already been processed successfully by this time. I think I've ruled out corruption by dropping and recreating the target database on Oracle. Any ideas?

M Man

View 1 Replies View Related

Lookup Transformation (Can It Be Using The Old Data?)

Mar 31, 2008



I have a lookup transformation that retrieves a key for a certain column of values, in this case, a name. So, I go into the lookup table with a name and come out with its key. I had it working, and then I added new entries to the lookup table for a bunch of new names. Now, for some reason, I am not getting matches for the new names. But I am still getting matches for the names that existed before I added the new ones.

I'm wondering if the lookup transformation is using the old set of data and some how not picking up the new names. Do I have to trigger something in the lookup transformation to let it know that the lookup table data has changed?

View 4 Replies View Related

Data Transformation Question

Sep 6, 2006

I have a student table that needs some cleanup. My first task is to remove all the periods (.) from the middle name column. Some people have two-character middle initials with two periods. Can this be done via an SSIS package (I'm sure it can, but I don't know how)? I can't think of a simple update statement that would accomplish the same thing.

Any direction would be appreciated.
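
If the goal is simply to strip every period, a plain T-SQL UPDATE may be enough, no SSIS package required; a sketch with hypothetical table and column names:

UPDATE dbo.Student
SET    MiddleName = REPLACE(MiddleName, '.', '')
WHERE  MiddleName LIKE '%.%'

The WHERE clause just limits the update to rows that actually contain a period.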

View 3 Replies View Related

Error While Using A Slowly Changing Dimension Transformation

Jun 6, 2007

I have an SSIS package which contains a number of Slowly Changing Dimension transformations. While the majority work, I have one which gives me the following error: 'Error: The variable "System::LocaleID" is already on the read list. A variable may only be added once to either the read lock list or the write lock list.' This error only occurs if the destination table holds data. If I truncate the table and reload the data, then the package completes successfully. The only difference I can see between this dimension transformation and the other dimension transformations is that the one in question has 2 business keys while the rest have 1.



Can anyone shed light on this?



Thanks



View 3 Replies View Related

Error Using Union All With Row Number Transformation From SQLIS(.com)

May 14, 2008

Has anyone else experienced similar problems? I've been trying to use the "Row Number" component from the sqlis.com site with "Union All", but no luck.

[DTS.Pipeline] Error: component "Union All" (1840) failed the pre-execute phase and returned error code 0x80070057.


thanks

View 6 Replies View Related

Data Transformation Services Question.

Oct 23, 2002

I am creating a DTS package to import data from MS Access 97 to SQL Server 2000. I am quite new to DTS, so please bear with me.

Everything was fairly simple until I got stumped by the following problem:

In the source database there was a table with multiple fields (let's say A, B, C) for each record containing their values (let's say 1,2,3).

Now, in the destination database the table is built differently: it is a table containing the field name and the field value.

So basically I need to turn this

ID A B C
0 1 2 3
1 4 5 6


into this

ID FieldName FieldVal
0 A 1
0 B 2
0 C 3
1 A 4
1 B 5
1 C 6


I am not sure how to go about that... any thoughts? :confused:
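
One way to get from the first shape to the second, sketched as T-SQL against a staging table (SourceTable is a hypothetical name), is a UNION ALL that turns each source column into its own row:

SELECT ID, 'A' AS FieldName, A AS FieldVal FROM dbo.SourceTable
UNION ALL
SELECT ID, 'B', B FROM dbo.SourceTable
UNION ALL
SELECT ID, 'C', C FROM dbo.SourceTable
ORDER BY ID, FieldName

In a DTS package this could serve as the source query of the transformation, so the data already arrives in the name/value layout.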

View 5 Replies View Related

Transformation Component Data Store

Jun 8, 2006

I am developing a custom transfer component, where I am building a custom object and want it to be transferred from the component UI to the component. I looked into this and learned that we can make use of the SaveToXML and LoadXML methods of the IDTSPersist90 interface. The problem is that I was not able to make use of this interface. If anybody has faced the same issue and found a solution, please let me know.

Thanks in advance

Karun

View 1 Replies View Related

Transformation Components And Moving Data

Apr 10, 2007

Hi,

I am having a simple difficulty regarding my transformation component that I have created.

I followed all of the relevant documentation to create the component, the UI, and such, and when I run a package that transfers data through my component (without it doing anything to the data) to a flat file destination, all I get in the flat file is ,,,,,,,,,,

Meaning that the data is not actually being passed through my component to the destination, but it does recognize the correct number of columns. I get this warning for each column: [DTS.Pipeline] Warning: The output column "PurchaseOrderDetailID" (20) on output "OLE DB Source Output" (11) and component "Source - PurchaseOrderDetail" (1) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.

To fix this issue, all I need to do is "check" the boxes in the advanced editor for my component. However, I want these boxes to be "checked" automatically.

My question is: when you "check" a box next to a column name in the advanced editor, what exactly does that do that allows the data to be transferred? What do I need to program in order to replicate that? I want it to happen automatically, so that by default all data in all columns is transferred through my component out the other side, so I just need to know how to do this in code, not how to replicate the UI.

My component, I thought, did that automatically, as I specified everything that I thought was required based on all of the documentation I read. Obviously, I am missing something. Here are the methods that I believe would all be involved.

Please let me know what I am missing.

int[] inputColumnBufferIndexes; // ...
int[] outputColumnBufferIndexes; // ... used in PreExecute
PipelineBuffer outputBuffer; // used in ProcessInput

public override void OnInputPathAttached(int inputID)
{
IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();

foreach (IDTSVirtualInputColumn90 vCol in vInput.VirtualInputColumnCollection)
{
IDTSOutputColumn90 outCol = output.OutputColumnCollection.New();
outCol.Name = vCol.Name;

outCol.SetDataTypeProperties(vCol.DataType, vCol.Length, vCol.Precision, vCol.Scale, vCol.CodePage);
}
}

public override void PreExecute()
{
//base.PreExecute();

IDTSInput90 input = ComponentMetaData.InputCollection[0];
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];

inputColumnBufferIndexes = new int[input.InputColumnCollection.Count];
outputColumnBufferIndexes = new int[output.OutputColumnCollection.Count];

for (int x = 0; x < input.InputColumnCollection.Count; x++)
{
IDTSInputColumn90 column = input.InputColumnCollection[x];
inputColumnBufferIndexes[x] = BufferManager.FindColumnByLineageID(input.Buffer, column.LineageID);
}

for (int x = 0; x < output.OutputColumnCollection.Count; x++)
{
IDTSOutputColumn90 column = output.OutputColumnCollection[x];
outputColumnBufferIndexes[x] = BufferManager.FindColumnByLineageID(output.Buffer, column.LineageID);
}
}


public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)
{
//base.PrimeOutput(outputs, outputIDs, buffers);
if (buffers.Length != 0)
{
outputBuffer = buffers[0];
}
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
//base.ProcessInput(inputID, buffer);

if (!buffer.EndOfRowset)
{
IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
while (buffer.NextRow())
{
// TODO: Examine the columns in the current row.
// Add a row to the output buffer.
outputBuffer.AddRow();
for (int x = 0; x < inputColumnBufferIndexes.Length; x++)
{
// Copy the data from the input buffer column to the output buffer column.
outputBuffer[outputColumnBufferIndexes[x]] = buffer[inputColumnBufferIndexes[x]];
}
}
}
else
{
// EndOfRowset on the input buffer is true.
// Set EndOfRowset on the output buffer.
outputBuffer.SetEndOfRowset();
}
}

View 6 Replies View Related

CTE In OLE DB Command Data Flow Transformation

Dec 20, 2006

I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow task, but not in any data flow item.

View 7 Replies View Related

Data Transformation Project Template

Aug 8, 2006

Does anyone know how to install or make available the Data Transformation Project template in SQL Server 2005? I cannot find it under Integration Services.

View 1 Replies View Related

SSIS Data Transformation Using Look Up Or Scripting???

Mar 6, 2008

Hi all,

I've got to change values in my source database as follows:

Source: Target:

X 1
Y 1
Z 2

Can I create a lookup table and use a Lookup task in SSIS to do this, or do I need to script it?
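
A lookup table should be enough; no scripting needed. A sketch of the mapping table a Lookup transformation could reference, plus a set-based alternative, with hypothetical table and column names:

CREATE TABLE dbo.CodeMap
(
    SourceValue varchar(10) PRIMARY KEY,
    TargetValue varchar(10) NOT NULL
)
INSERT INTO dbo.CodeMap VALUES ('X', '1')
INSERT INTO dbo.CodeMap VALUES ('Y', '1')
INSERT INTO dbo.CodeMap VALUES ('Z', '2')

-- Set-based alternative if the data already sits in a staging table:
UPDATE s
SET    s.SomeColumn = m.TargetValue
FROM   dbo.StagingTable AS s
       JOIN dbo.CodeMap AS m ON m.SourceValue = s.SomeColumn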

Thanks

F

View 1 Replies View Related

Missing Data Transformation Project

Jul 19, 2006

If I understand correctly, SSIS is kinda the "new DTS." And to use it, if I get the gist, I am supposed to open "SQL Server Business Intelligence Development Studio," select a new project, highlight the "Business Intelligence Projects" heading, and then select the "Data Transformation Project" template. But there is just one problem.

That template is missing. It isn't there.

What am I doing wrong?

View 1 Replies View Related

Special Error Handle In A Dataflow Transformation Tasks

Jan 22, 2008

Hello,


How would you handle logging during a massive row load? I'm having problems because every row error (caused by casting, format, or lookup failures) in a transformation task is redirected to a text file as a log. This is fine when only one error exists per row, but when a row has two errors detected by different transformation tasks, only the first one is reported to the text file. I then have to wait for the next load, after I correct the first error, to find the second one. I need to capture every error that exists in a row within the same load...

Which component or which strategy can I use in an SSIS package to achieve this?

thanks

View 1 Replies View Related

Help Needed To Troublshoot A Merge Transformation Task Error

Apr 23, 2008

I am trying to use a Merge transformation task and am receiving an error that I don't know how to troubleshoot further. Could I please have some advice on what else to look at to try to resolve the problem?

The error message text is: Error at Data Flow Task [Merge [1245]]: The metadata for "input column "LOCATION" (5451)" does not match the metadata for the associated output column

I have looked at the metadata and cannot see any differences: the following is output from the data flow path.
Name        Data Type  Precision  Scale  Length  Code Page  Sort Key Position  Source Component
ACCOUNT     DT_STR     0          0      6       1252       1                  Sort - FinSysData
PROGRAM     DT_STR     0          0      6       1252       2                  Sort - FinSysData
LOCATION    DT_STR     0          0      6       1252       3                  Sort - FinSysData
PROJECT     DT_STR     0          0      6       1252       4                  Sort - FinSysData
SUBPROJECT  DT_STR     0          0      2       1252       5                  Sort - FinSysData
ACTIVITY    DT_STR     0          0      6       1252       6                  Sort - FinSysData
FUNDING     DT_STR     0          0      3       1252       7                  Sort - FinSysData
CLIENT      DT_STR     0          0      6       1252       8                  Sort - FinSysData
NTWAGE      DT_STR     0          0      3       1252       9                  Sort - FinSysData
TYPE        DT_STR     0          0      1       1252       10                 Sort - FinSysData
PERIOD      DT_STR     0          0      6       1252       11                 Sort - FinSysData
CO          DT_STR     0          0      2       1252       12                 Sort - FinSysData
FIN_YEAR    DT_I4      0          0      0       0          13                 Sort - FinSysData
BALANCES    DT_R8      0          0      0       0          14                 Sort - FinSysData

Name        Data Type  Precision  Scale  Length  Code Page  Sort Key Position  Source Component
ACCOUNT     DT_STR     0          0      6       1252       1                  Sort - DataWarehouse
PROGRAM     DT_STR     0          0      6       1252       2                  Sort - DataWarehouse
LOCATION    DT_STR     0          0      6       1252       3                  Sort - DataWarehouse
Project     DT_STR     0          0      6       1252       4                  Sort - DataWarehouse
SubProject  DT_STR     0          0      2       1252       5                  Sort - DataWarehouse
Activity    DT_STR     0          0      6       1252       6                  Sort - DataWarehouse
Funding     DT_STR     0          0      3       1252       7                  Sort - DataWarehouse
Client      DT_STR     0          0      6       1252       8                  Sort - DataWarehouse
NTWage      DT_STR     0          0      3       1252       9                  Sort - DataWarehouse
TYPE        DT_STR     0          0      1       1252       10                 Sort - DataWarehouse
Period      DT_STR     0          0      6       1252       11                 Sort - DataWarehouse
CO          DT_STR     0          0      2       1252       12                 Sort - DataWarehouse
Fin_Year    DT_I4      0          0      0       0          13                 Sort - DataWarehouse
Balance     DT_R8      0          0      0       0          14                 Sort - DataWarehouse

View 7 Replies View Related

Xml Data Transformation Into Single Data Row

May 22, 2008

I'm attempting to put data from an XML file (formatted as if it were several different tables) into a database table as a single row. I've attempted to use the 'Union All' and 'Merge' transformations, but these insert the data as multiple rows. Is this possible?

View 8 Replies View Related

DTS Transformation Of Multiple Data In Single Fields

Jun 3, 2000

I have a very complex (for me, anyway) data transformation problem.

I've been given a flat file of physician data from another system, which must be loaded into SQL Server automatically on a regular basis.

This was no problem until we discovered that several fields (all of which we wanted to use) had multiple pieces of information in them, separated by semicolons.

Well, this didn't seem to be too big of a problem, so I wrote a DTS ActiveX script to handle it. This is what I originally wrote:

'*******************
Function Transform()
'Declare variables
Dim strOffice
Dim strOfficeNew
Dim cChar
Dim x
Dim y
Dim z

'Scrub values into new rows
strOffice = DTSSource("Col050")
x = 1
y = len(strOffice)
z = 1

While x <= y
    cChar = Mid(strOffice, x, 1)
    If cChar <> ";" Then
        strOfficeNew = strOfficeNew & cChar
    Else
        DTSDestination("Phys_No") = DTSSource("Col001")
        DTSDestination("Addr_No") = z
        DTSDestination("Addr_Office") = strOfficeNew
        strOfficeNew = ""
        z = z + 1
    End If
    x = x + 1
Wend

'Insert final record after last semicolon
If strOffice <> "" Then
    DTSDestination("Phys_No") = DTSSource("Col001")
    DTSDestination("Addr_No") = z
    DTSDestination("Addr_Office") = strOfficeNew
End If

Transform = DTSTransformStat_OK
End Function
'*********************

This, of course, didn't work. What I got was only the last part of the parsed data, which for the first record was the second address in the field.

I searched around and found the following script, which is supposed to allow multiple output rows from a single source row, but I can't seem to merge the two and still get the data out cleanly.

'**********
Dim nCounter
nCounter = 4

Function Main()

if nCounter > 0 then
    Main = DTSTransformStat_SkipFetch
    DTSDestination("PatientNumber") = DTSSource("PatientNumber")
    Select Case nCounter
        Case 1
            DTSDestination("PhysicianType") = "Admitting"
            DTSDestination("PhysicianId") = DTSSource("AdmittingPhysician")
        Case 2
            DTSDestination("PhysicianType") = "Attending"
            DTSDestination("PhysicianId") = DTSSource("AttendingPhysician")
        Case 3
            DTSDestination("PhysicianType") = "Referring"
            DTSDestination("PhysicianId") = DTSSource("ReferringPhysician")
        Case 4
            DTSDestination("PhysicianType") = "Consulting"
            DTSDestination("PhysicianId") = DTSSource("ConsultingPhysician")
    End Select
    nCounter = nCounter - 1
else
    nCounter = 4
    Main = DTSTransformStat_SkipInsert
end if
End Function
'**************

I'm not a VBScript expert, so there's probably something very simple that I'm missing here... if someone could point it out, I'd greatly appreciate it.

Jaysen
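
For what it's worth, the same split can also be done after landing the raw column in a staging table, using plain T-SQL rather than an ActiveX transform. A rough sketch (SQL 2000 era, so a cursor and a CHARINDEX loop; every table and column name here is hypothetical):

DECLARE @PhysNo  varchar(20),
        @Offices varchar(4000),
        @Office  varchar(255),
        @Pos     int,
        @AddrNo  int

DECLARE office_cursor CURSOR FOR
    SELECT Phys_No, OfficeList FROM dbo.StagingPhysician

OPEN office_cursor
FETCH NEXT FROM office_cursor INTO @PhysNo, @Offices

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @AddrNo = 1
    WHILE LEN(@Offices) > 0
    BEGIN
        -- Take everything up to the next semicolon (or the remainder of the string).
        SET @Pos = CHARINDEX(';', @Offices)
        IF @Pos = 0
        BEGIN
            SET @Office  = @Offices
            SET @Offices = ''
        END
        ELSE
        BEGIN
            SET @Office  = LEFT(@Offices, @Pos - 1)
            SET @Offices = SUBSTRING(@Offices, @Pos + 1, LEN(@Offices))
        END

        INSERT INTO dbo.PhysicianAddress (Phys_No, Addr_No, Addr_Office)
        VALUES (@PhysNo, @AddrNo, @Office)

        SET @AddrNo = @AddrNo + 1
    END

    FETCH NEXT FROM office_cursor INTO @PhysNo, @Offices
END

CLOSE office_cursor
DEALLOCATE office_cursor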

View 4 Replies View Related

Archiving Data Transformation Service Packages

Feb 29, 2000

Is there a way, for example, to script a DTS package so that it can be deleted and recreated at a later date if necessary? I have quite a lot of these, but few are used regularly. The msdb database is now up to 80 MB. However, I don't want to delete them and have no way to recreate them. If I took a backup of msdb and then deleted the packages, would restoring msdb at a later date restore the packages?
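
Restoring msdb would indeed bring the packages back, since DTS packages saved to SQL Server live in msdb.dbo.sysdtspackages; the usual alternative is to save each package as a structured storage (.dts) file from Enterprise Manager. A rough T-SQL sketch of archiving the stored packages before deleting them (DTSArchive and the package name are hypothetical):

-- Copy every stored package (all versions) into an archive database:
SELECT *
INTO   DTSArchive.dbo.sysdtspackages_copy
FROM   msdb.dbo.sysdtspackages

-- To bring one package back later, insert its rows into msdb again:
INSERT INTO msdb.dbo.sysdtspackages
SELECT *
FROM   DTSArchive.dbo.sysdtspackages_copy
WHERE  name = 'MyOldPackage'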

View 1 Replies View Related

Data Transformation Services In SQL Server 2005

May 16, 2008

In SQL Server 2000 I'm using Data Transformation Services. What is the equivalent DTS tool in SQL Server 2005?

View 4 Replies View Related






