Changing Partial Data In Rows

May 21, 2008

I moved a database from one server to another. In this database there are references to a UNC path on the server, and now I need to find a way to modify the server name stored in each row to reflect the new server name and path, e.g.:

\\server1\path1\data to now say
\\server2\path2\data

How do I selectively modify just some of the data? Or do I have to basically rewrite the entire column for each row?
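One way to do this in T-SQL is a single UPDATE with REPLACE, which lets you describe only the changed fragment rather than the full path (the table and column names below are hypothetical):

UPDATE dbo.MyTable
SET PathColumn = REPLACE(PathColumn, '\\server1\path1', '\\server2\path2')
WHERE PathColumn LIKE '\\server1\path1%';

Technically the whole column value is rewritten for each affected row, but REPLACE keeps the rest of each path intact, and the WHERE clause limits the update to rows that actually contain the old prefix.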




SQL: Removing Partial Duplicate Rows

Dec 24, 2007

Hi,
I have results from a survey in a table; every entry is assigned a unique ID. I want to remove duplicate entries based on the survey data, but not on the unique ID (obviously).
So far I have:
SELECT DISTINCT RespondantID, Q1, Q2, Q3, Q4, Q5, Q6, Q7, Q8, Q9, Q10, Q11, Q12, Q13
FROM Results
But that gives...

RespondantID     Q1  Q2  Q3  Q4  Q5  Q6  Q7  Q8  Q9  Q10  Q11  Q12  Q13
1 - Anonymous     1   1   1   1   1   1   1   1   1    1    1    1    1
2 - Anonymous     2   2   2   2   2   2   2   2   2    2    2    2    2
3 - Martin        2   2   2   2   2   2   2   2   2    2    2    2    2
I.e. in the above example, it would seem that 'Martin' submitted his data once and then submitted it again with his name.
How can I remove the '2 - Anonymous' row from the data set?
 Thanks!
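One possible approach on SQL Server 2005 or later: rank each group of identical answer sets with ROW_NUMBER(), preferring named respondents, and delete the rest. Column names follow the query above; the "prefer non-Anonymous" ordering is an assumption about the data:

;WITH Ranked AS (
    SELECT RespondantID,
           ROW_NUMBER() OVER (
               PARTITION BY Q1, Q2, Q3, Q4, Q5, Q6, Q7, Q8, Q9, Q10, Q11, Q12, Q13
               ORDER BY CASE WHEN RespondantID LIKE '%Anonymous%' THEN 1 ELSE 0 END
           ) AS rn
    FROM Results
)
DELETE FROM Ranked
WHERE rn > 1;

Deleting through the CTE removes the underlying rows in Results, keeping one row per distinct set of answers.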


Replace Partial Data On A Column!!

Aug 29, 2006

Hello,

I have a table in which I would like to replace all the data from one column following a criterion.

The criteria is:

where 1010222222 should become 0020222222

So I want to make a script that substitutes the leading "101" with "002" in all the data in that column that begins with "101".

Any help?

Thanks
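A minimal sketch in T-SQL, assuming hypothetical table and column names: STUFF replaces a fixed number of characters at a stated position, so it can swap the '101' prefix for '002' without touching the rest of the value:

UPDATE dbo.MyTable
SET CodeColumn = STUFF(CodeColumn, 1, 3, '002')
WHERE CodeColumn LIKE '101%';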


Exclude Data From Partial String In Predicate

May 8, 2015

I have to write a query that excludes rows matching partial strings in the predicate.

SELECT * FROM table WHERE col NOT IN ('%abc%', '%xyz%')

So only the rows where the col value contains neither the 'abc' nor the 'xyz' string should be selected, but I am also getting 'xyz' data in the result.
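NOT IN compares whole values and does no wildcard matching, which is likely why the query above misbehaves. A sketch of the usual pattern, with one NOT LIKE per substring:

SELECT *
FROM MyTable
WHERE col NOT LIKE '%abc%'
  AND col NOT LIKE '%xyz%';

Note that if col can be NULL, those rows drop out of the result (the comparison evaluates to UNKNOWN); add OR col IS NULL if they should be kept.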


Data Warehousing :: How To Increase Partial Cache Size On Lookup Stuck

Apr 20, 2015

After converting from SSIS 2008 to SSIS 2012, I am facing a major performance slowdown while loading fact data. With 2008, one file used to take around 2 hours on average; after converting to 2012, it took 17 hours to load one file. This is the current scenario: we load data into staging, select everything from staging (28 million rows), and use a lookup for each dimension. I believe it is taking a very long time due to one dimension table which has 89 million rows.

With the lookup, we are currently using partial cache because full cache caused the system to run out of memory. In the Lookup Transformation Editor, how do I increase the 64-bit partial cache size? I am stuck at 4096 MB and cannot increase it. In 2008, I had a 200,000 MB partial cache size.


Slowly Changing Dimension With 600,000 Rows

Jan 17, 2006

Hi,

We have been using tasks generated from the SCD wizard. We have smaller dimensions (< 30,000 rows) that work well. Our Product dimension package is giving us performance problems: it takes 7 hours to process 600,000 rows when 80,000 records are updated and the rest are new inserts. It is similar to the smaller dimensions. Several columns are Type 1 and are doing update statements; several are Type 2, doing updates and inserts.

The package had a complicated view as the initial task, but we have since modified it to use a SQL command with a variable, and now the initial read appears quick, but it is chunking in 10,000-record increments and taking the 7 hours (previously it never finished). So the package is pretty basic now: reading a source, a small derive and data conversion, a small lookup (30,000 records, cached) for a description, then the SCD.

Before I start replacing what the SCD generates with stored procedures, does anyone have any suggestions as to what might be the issue? We believe we have increased the number of Type 2 columns, and the SCD definitely has more to do than just an insert or update, but 7 hours for 600,000 records seems excessive. Interestingly, the source task never turns green. Previously, when we had a Merge Join, it completed the read and bottlenecked at a Sort and a Merge Join. Now that has been removed and simplified, and all tasks remain yellow, with the 10,000-row (actually 9,990, I think) chunks appearing at the source, then the SCD, before the next chunk appears to be read. On the general release (not the beta). Thanks in advance!
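For reference, a set-based Type 1 pattern of the kind a replacement stored procedure might use; the table, key, and column names are hypothetical, and Type 2 rows would additionally need effective-date handling:

-- Type 1: overwrite changed attributes in place
UPDATE d
SET d.ProductDescription = s.ProductDescription
FROM dbo.DimProduct AS d
JOIN staging.Product AS s
  ON s.ProductBusinessKey = d.ProductBusinessKey
WHERE d.ProductDescription <> s.ProductDescription;

-- New members: insert rows whose business key is not yet in the dimension
INSERT INTO dbo.DimProduct (ProductBusinessKey, ProductDescription)
SELECT s.ProductBusinessKey, s.ProductDescription
FROM staging.Product AS s
WHERE NOT EXISTS (
    SELECT 1 FROM dbo.DimProduct AS d
    WHERE d.ProductBusinessKey = s.ProductBusinessKey
);

Two set-based statements like these typically run far faster than the SCD output, because the wizard's updates go through an OLE DB Command that executes row by row.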


Inserting Values Into SQL CE 3.5 Ntext Field Is Changing All Rows...

Apr 25, 2008



Has anyone seen this issue before? We are running a SQL CE 3.5 database on a Windows desktop. A couple of our tables have ntext fields. When we do an insert, the statement updates the value for all rows, not just the one that was added. I can easily repro this with some of the online samples too. Try the following:


SqlCeConnection conn = new SqlCeConnection(_sConn);
conn.Open();

SqlCeCommand cmd = conn.CreateCommand();
cmd.CommandText = "CREATE TABLE BlobTable(name nvarchar(128), blob ntext);";
cmd.ExecuteNonQuery();

// First insert.
cmd.CommandText = "INSERT INTO BlobTable(name, blob) VALUES (@name, @blob);";
SqlCeParameter paramName = cmd.Parameters.Add("name", SqlDbType.NVarChar, 128);
SqlCeParameter paramBlob = cmd.Parameters.Add("blob", SqlDbType.NText);
paramName.Value = "Name1";
paramBlob.Value = "Name1 Memo";
cmd.ExecuteNonQuery();

// Second insert: reuse the same command and parameters with new values.
// (Redeclaring paramName/paramBlob and re-adding the parameters, as
// originally posted, would not compile; this is the equivalent repro.)
paramName.Value = "Name2";
paramBlob.Value = "Name2 Memo";
cmd.ExecuteNonQuery();



After the second execution the blob column in both rows will have the value 'Name2 Memo'.

This is obviously a huge problem for us, and we would appreciate it if someone could explain what is happening. It seems like a bug, but I would like to be certain before I go the support route.



Slowly Changing Dimension - Adding New Rows When No Change

Apr 16, 2007

Hi, I am using the Integration Services slowly changing dimension to move data from a SQL Server 2000 database table to a SQL Server 2005 table.

The problem is that the package is not tracking changes. It is spending a lot of time doing lookups (I'm guessing this is the cost; it's really slow), but it ends up creating new records when there has not been a change.

I'm quite sure the business key is set up correctly (I'm using the PK from the source table).

The database I am transferring from has non-Unicode data types (i.e. varchar and char) and the destination database has Unicode data types (i.e. nvarchar).

Also, some of the fields in the DB are NULL. Does this have an effect (i.e. one NULL doesn't equal another NULL)? Or shouldn't that matter?
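On the NULL question: in SQL semantics one NULL does not equal another NULL, and a plain comparison between two NULL columns evaluates to UNKNOWN, which change-detection logic can read as "different". A small T-SQL illustration (the table and column names in the second query are hypothetical):

-- NULL = NULL evaluates to UNKNOWN, so the ELSE branch fires:
SELECT CASE WHEN NULL = NULL THEN 'equal' ELSE 'not equal' END;  -- 'not equal'

-- One NULL-safe comparison pattern, normalizing NULLs to a sentinel:
SELECT s.BusinessKey
FROM SourceTable AS s
JOIN DestTable AS d ON d.BusinessKey = s.BusinessKey
WHERE ISNULL(s.SomeCol, '') <> ISNULL(d.SomeCol, '');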


Slowly Changing Dimension - Always Shows Rows As Changed?!

Aug 29, 2006

I created a simple Type 1 slowly changing dimension, setting all the columns to "Changing Attribute". The first time I run the package, it sees all rows as new and imports them into the dim as it should. The next time I run it, it puts 100% of the rows into the "Changing Attribute Updates" output and runs an update on all 90,000 rows, updating the rows to exactly what they were before.

If I take the DIM and the Source in SQL and join on every row the join succeeds (meaning the rows match perfectly).

Shouldn't the SCD object just ignore the rows if they match? Or does it assume that ALL incoming rows are either new or changed? (if so why is there an output called "Unchanged Output"?). Is there some "gotcha" I am missing??

Thanks

Chris



SSIS - Slowly Changing Dimension - Detect Deleted Rows From The Source

Nov 22, 2007



Hi all,

Can you help me resolve my problem? I have to build a simple daily backup system. Source: flat file; destination: SQL Server. I want to use the Slowly Changing Dimension component to back up only the new and updated rows from my source (flat file) and put them into SQL Server.

But how can I detect deleted rows in my source?

Any suggestions ?

If it's not clear enough, please ask for more details !

GO
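The SCD component itself cannot see deletions, since it only processes the rows that arrive from the source. One workable sketch: land the flat file in a staging table first, then anti-join against the destination (the table and key names below are hypothetical):

-- Rows present in the destination but absent from today's file:
SELECT d.BusinessKey
FROM dbo.Destination AS d
WHERE NOT EXISTS (
    SELECT 1
    FROM staging.TodaysFile AS s
    WHERE s.BusinessKey = d.BusinessKey
);

Those rows can then be deleted or flagged inactive, whichever the backup scheme calls for.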


Changing Header Rows To Skip Property In Flat File Connection During Runtime

Dec 21, 2006

Hi all

I have a flat file. I am trying to set the value of the "HeaderRowsToSkip" property at runtime. I have set an expression for this in my flat file connection manager, but this is not working; the connection manager is not able to take the value at runtime.

My expression is as follows:

DataRowsToSkip : @[user:: Var]

where "Var" is my variable which gets the value from the rowcount component and trying to set it back to the "HeaderRowsToskip" property.

I've even tried setting the value of the "HeaderRowsToSkip" property in the expression builder.

It's not working...

Can anyone help me out in solving this????

Thanks in advance

Regards

Suganya


Transform To Remove Rows From Data Set A That Match Rows In Data Set B On A Given Key?

Jun 28, 2006

Hi,

I have a common requirement in numerous SSIS processes to take my main input data set and to remove all rows from it that match a second input data set on a given key and output this as the main output. I also want to output (as a second output) all the rows from the main input data set that did match on the given key. However, I don't want to merge in data from the second input, nor am I interested in rows from the second input data set that have no match in the main input.

E.g. If I have the following data:

Main input:
Key Name
--- ----
1 Steve
2 Jamie
3 Donald

Second Input
Key DontCareAboutThisField1
--- -----------------------
1 ...
3 ...
4 ...

Then I would like the following output:

Main Output
Key Name
--- ----
2 Jamie

Second Output
Key Name
--- ----
1 Steve
3 Donald

Can I do this with a standard transform, or will I have to write my own? Any help on this would be greatly appreciated!

Thanks in advance,

Lawrie
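In SSIS a Lookup transform can split the flow this way: look up the key from the second input, send matched and unmatched rows to separate outputs (in 2005, unmatched rows go out via the error output configured to redirect), and select no columns from the lookup so nothing is merged in. For reference, the set-based equivalent, using the table names from the example above:

-- Main output: rows with no match in the second input
SELECT m.[Key], m.Name
FROM MainInput AS m
WHERE NOT EXISTS (SELECT 1 FROM SecondInput AS s WHERE s.[Key] = m.[Key]);

-- Second output: rows that did match
SELECT m.[Key], m.Name
FROM MainInput AS m
WHERE EXISTS (SELECT 1 FROM SecondInput AS s WHERE s.[Key] = m.[Key]);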


Do We Have To Always Use Slowly Changing Dimensions (SCD) Component In The Data Flow For The Loading Of Table Data?

Feb 28, 2008

Hi, all experts here,
Do we always have to use the SCD component when loading data into a data warehouse in order to handle changes to rows?
I am looking forward to hearing from you and thank you very much in advance for your help.
With best regards,


T-SQL (SS2K8) :: Load Data From Flat File Source Into OleDB Destination By Changing Data Types In SSIS

Apr 16, 2014

I have a source file, and I have to load it into the database while changing the data types of the columns in SSIS.


Arranging Data On Multiple Rows Into A Single Row (Converting Rows Into Columns)

Dec 25, 2005

Hello,
I have a survey (30 questions) application in a SQL Server db. The application uses several relational tables. The results are arranged so that each answer is on a separate row:

user1   answer1
user1   answer2
user1   answer3
user2   answer1
user2   answer2
user2   answer3

For statistical analysis I need to transfer the results to an Excel spreadsheet (for later use in SPSS). In the spreadsheet I need the results to appear so that each user is on a single row, with all of that user's answers on that row (a column for each answer):

user1   answer1   answer2   answer3
user2   answer1   answer2   answer3
How can this be done? How can all answers of a user appear on a single row?

Thanx,
Danny
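If the answers live in one normalized table, conditional aggregation can flatten each user onto a single row before the Excel export. The table and column names below are assumptions about the schema:

SELECT UserId,
       MAX(CASE WHEN QuestionNo = 1 THEN Answer END) AS Answer1,
       MAX(CASE WHEN QuestionNo = 2 THEN Answer END) AS Answer2,
       MAX(CASE WHEN QuestionNo = 3 THEN Answer END) AS Answer3
       -- ...repeat up to QuestionNo = 30
FROM dbo.SurveyAnswers
GROUP BY UserId
ORDER BY UserId;

The result set can then be exported to Excel as-is.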


Using SSIS 2005 To Strip Out Bad Rows In Excel And Then Insert Detailed Rows Into OLE DB Data Source

Apr 6, 2006

Environment:
 
Running this code on my PC via VS 2005
.Net version 2.0.50727 on the server (shown in IIS)
Code is in ASP.NET 2.0 and is a VB.NET Console application
SSIS 2005
 
Problem & Info:
 
I am bringing in an Excel file. I need to first strip out any non-detail rows, such as the breaks you see with totals and whatnot. I should in the end have only detail rows left before I start moving them into my SQL table. I'm not sure how to strip this information out in SSIS: specifically, which component is the right one and how to actually configure it to do this, based on my Excel file here: http://www.webfound.net/excelfile.xls

Then, I assume I just use a Flat File Source component or something to actually take the columns in the Excel file and feed an OLE DB destination, to shove each column into a corresponding column in my SQL Server table. I have used a Flat File Source in the past to do this with a comma-delimited txt file, but I have never tried it with an Excel file.
 
Desired Help:

 
How to perform
 
1)       stripping out all undesired rows
2)       importing each column into sql table


Integration Services :: Data Flow Task Failed After Loading 29000 Rows Out Of 234567 Rows

Oct 13, 2015

I am facing an issue where the Data Flow task fails after loading 29,000 rows out of 234,567 rows.

I am loading data from a .csv file to an OLE DB destination.

This data flow task is placed inside a Foreach Loop container.

Is this issue caused by a performance setting in the SSIS package, such as buffer size?

The error is below:

DFT Load Data from FlatFile:Error: The conditional operation failed.
DFT Load Data from FlatFile:Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. 

The "DER Add Calc Columns" failed because error code 0xC0049063 occurred, and the error row disposition on "DER Add Calc Columns.Outputs[Derived Column Output].Columns[M_VALUE_NUM]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.

DFT Load Data from FlatFile:Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "DER Add Calc Columns" (48) failed with error code 0xC0209029 while processing input "Derived Column Input" (49). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

[code]....


Data Unexpectedly Changing In Data Flow Task

Oct 3, 2007

I have a fairly simple data flow task that loads data from one table (OLE DB source) into another table (OLE DB destination). The data type for one of the pairs of columns is nvarchar(120), and it contains version information that looks like a decimal. When I run the export, the destination has a trailing zero added after the decimal point, as if it were a numeric column, which invalidates our comparisons (string 1.0 is not the same as string 1.00). There is no cast or convert done to this column; it is a straight copy. Any ideas what could be causing this or how to fix it?


Changing Data Source Programmatically Does Not Transfer Data

Aug 29, 2006

I have created an SSIS package that transfers data from a FoxPro database to an instance of SQL Server 2005 Express. I used the wizard to create the package, but I load and execute the package within a custom application that I have written in C#.

The way the custom application is intended to work is that the user can have the database in any location on the computer; all he has to do is specify the location, and the application programmatically changes the location of the source in the package it has loaded and then executes it. When I initially run the package the first time (using the original path), it works fine and transfers the data. However, every subsequent time I run the application and specify a different path, the database on the SQL Server side gets created as expected but the data is not transferred!

Where am I going wrong? Do I need to save the package after I modify the source, then reload and run it again, or do I need to change something else in the Data Flow to make this work?


Changing Data In SQL

Jul 20, 2005

First of all, I am not familiar with SQL, so bear with me. I have a field called stock_code; the data in that field looks like this: 000000851. I need to replace only the first two characters, '00', with 'DR'. If I use the REPLACE function, it will of course replace every occurrence of '00', which I don't want. Is there a way to do this, apart from exporting the data out to Excel and changing it there?

Thanks, Alan
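A sketch using STUFF, which replaces characters only at a stated position, so the other zeroes are untouched (the table name is hypothetical):

UPDATE dbo.StockTable
SET stock_code = STUFF(stock_code, 1, 2, 'DR')
WHERE stock_code LIKE '00%';

STUFF(stock_code, 1, 2, 'DR') deletes the first two characters and inserts 'DR' in their place, turning 000000851 into DR0000851.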


How To Modify Some Data Without Changing Everything

Oct 17, 2002

Please HELP!

I need to UPDATE a column by removing only the first occurrence of $$sp;. I use the following to get an idea of what I have:

SELECT Reporting_Title_Html FROM Lab_Test Where RTRIM(Reporting_Title_Html)='$$sp;'
Reporting_Title_Html
------------------------------
$$sp;
$$sp;
$$sp;
$$sp;
$$sp;Thyroid maintenance required;$$sp..........

I get 5 records. One record has multiple occurrences of $$sp; throughout the Reporting_Title_Html column.

I thought I could do:
Update Lab_Test Set Reporting_Title_Html=' ' Where RTRIM(Reporting_Title_Html)='$$sp;'

But I can't lose the trailing data from record #5. This is just a small sampling of what I'm trying to fix.

Any Ideas???

I've thought about REPLACE, but that would replace all the $$sp;'s, and I only want to change the very first one from $$sp; to ' ' (blank).
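Combining STUFF with CHARINDEX targets only the first occurrence: CHARINDEX finds where '$$sp;' first appears, and STUFF replaces exactly those five characters, leaving later occurrences alone. A sketch against the table above:

UPDATE Lab_Test
SET Reporting_Title_Html =
    STUFF(Reporting_Title_Html,
          CHARINDEX('$$sp;', Reporting_Title_Html),
          LEN('$$sp;'),
          ' ')
WHERE CHARINDEX('$$sp;', Reporting_Title_Html) > 0;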


Changing Data Types----Help!

Dec 27, 2006

How do I go from a text data type to a numeric data type without having to drop the column and add it back? I tried to change the data type, but it gave me an error saying that I couldn't go from text to int. Help.
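text cannot be converted directly to int, but on SQL Server 2005 and later a two-step ALTER via varchar usually works, assuming every value is in fact numeric and short enough (the table and column names are hypothetical; test on a copy first):

ALTER TABLE dbo.MyTable ALTER COLUMN MyCol varchar(50);
ALTER TABLE dbo.MyTable ALTER COLUMN MyCol int;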


Changing Data-types

Sep 4, 2007

Hi All,

I have a varchar(255) column for storing text in English and in Hebrew. It was a stupid mistake to set the data type to varchar, because now I need to store text in German and French too.

Question: Is it safe to change the data type from varchar to nvarchar, without damaging the data that's already present? (I have more than 1,000,000 records stored...)

Thanks,
Danny
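A minimal sketch of the in-place change, assuming a hypothetical table name. varchar to nvarchar is generally a safe widening conversion when the existing data already displays correctly under the column's collation, though the column's storage roughly doubles, so test on a restored backup first:

ALTER TABLE dbo.MyTable
ALTER COLUMN MyTextColumn nvarchar(255) NULL;

With over a million rows this rewrites every row, so expect the statement to take a while and to log heavily.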


Visualization Of Changing Data

Nov 13, 2007

Hello,
I'm working on a distributed application which requires data from a central database to be shown to various users. In particular users should be able to open "views" on the database, specifying the data they are interested in, and being shown the result.

This can be easily accomplished modeling the view specifications as SQL select queries, sending those queries to the server, and visualizing the result. This approach is very flexible, supported, fast and definitely the way we would like to implement it.

The problem is that we have an additional requirement.
In general the data the users might be interested in is not static, and when that data changes, those changes should be propagated to the client views.

There are lots of solutions and technologies aimed to solve very similar problems, but no one seems to support this scenario well.

1. You can re-run the queries every x time (polling)... but this is hardly an option when you have hundreds of client-views that require data to be updated in real-time (so "x time" should be, say, 1 second).

2. You can use query notification... but the documentation explicitly says it doesn't scale well and shouldn't be used with a large number of clients (which is exactly what is needed in this case).

3. You can use replication services... but this technology doesn't really seem to be created to support this scenario.

4. You can use notification services... but even this technology seems to be focused on different scenarios. I haven't looked at it extensively anyway.

5. You can use DML triggers to track data changes... but you still don't have the publish-subscribe mechanism to register client views and the mechanism to know which client views are affected by the changes.

6. You can implement a business logic tier that relates modification requests to client views... and basically do everything from scratch without using existing services.

What are your opinions?


Thank you,

Lorenzo Castelli


Changing Data Type

May 21, 2006

I use databases a bit, but I am new to SQL Server. We just want to change a column of existing SQL Server 2005 data from a string data type to one of the Unicode data types, such as DT_WSTR or DT_NTEXT (such as one can use for various data mining tasks, etc.). It seems that to do this one needs the Data Conversion transformation editor, and to use that one has to have a package and a project?

Does anyone have a full script or set of steps for what should be a simple task? This would be a great example for BOL, but each atomistic bit of BOL refers to another, and one gets lost in the circle when a complete example is needed for fundamental housekeeping tasks.


What Happens With Two Transactions Changing Same Row Data

Sep 23, 2015

If I have an ACID-based SQL engine and I run a long-running transaction that changes a record at some point during the transaction, and while it is running another transaction changes a record that the long-running transaction has changed (or will change), will they both complete or will one transaction fail? If they both complete, who wins the final state of the record, the long-running one or the other one? (Is it based on the point in time when the transaction actually started, or on the point in time when the record was changed during the transaction?)


Return Two Rows From One Row's Data

Jul 20, 2005

I know this table is designed wrong for what I am doing, but I hope I can do it. I have a table like this:

Prod_A_Jan, Prod_A_Feb, Prod_B_Jan, Prod_B_Feb

I want a query that returns data like this (two rows of data):

"ProdA", Prod_A_Jan, Prod_A_Feb
"ProdB", Prod_B_Jan, Prod_B_Feb

I know two queries can get it, but I want one. Any help would be great!!!

Sheila T.
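One query can return both rows by stacking two SELECTs with UNION ALL; the table name below is hypothetical:

SELECT 'ProdA' AS Product, Prod_A_Jan AS Jan, Prod_A_Feb AS Feb
FROM dbo.ProductTotals
UNION ALL
SELECT 'ProdB', Prod_B_Jan, Prod_B_Feb
FROM dbo.ProductTotals;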


Changing Column Data Type

Nov 23, 2007

I want to change a column's data type from bit to int. There is data in the table already. I'm wondering if it is a safe/correct way to issue the following command to change the data type for that column:

ALTER TABLE database_table
ALTER COLUMN my_bit_column INT;

Thanks.


Changing Format Of Data In Gridview Using VB

Mar 24, 2008

Hello all,
I am trying to modify the output of a SELECT statement in a VB asp.net page that pulls data from a SQL DB. I only need to modify one column in the gridview, but I'm having some issues.
Here is the situation:

SELECT TOP 24 ID, RecID, Timestamp, Answered, Holds, Dropped, Waits, Voicemail, Busied, LiveWaits
FROM mTraffic
ORDER BY Timestamp DESC

The Timestamp column returns a number value (e.g. 564566). In terms of a date, 564566 doesn't mean anything to the user, so I need to convert this number into a recognizable and accurate date.

The formula I need to implement is Timestamp/1440 - 1.

The following SELECT statement returns the number as a date, but makes all records the same date:

SELECT TOP 24 ID, RecID, CONVERT(datetime, Timestamp/1440 + 1) AS Timestamp, Answered, Holds, Dropped, Waits, Voicemail, Busied, LiveWaits
FROM mTraffic
ORDER BY Timestamp DESC

What do I need to do to implement this function dynamically in my gridview?
Thanks in advance!
 
Chris
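One likely culprit is integer division: Timestamp/1440 truncates to a whole number before CONVERT sees it, which would collapse nearby values to the same date. A sketch of the fix, dividing by a decimal instead (this uses the stated formula; whether the offset should be +1 or -1 depends on the epoch of the source data):

SELECT TOP 24 ID, RecID,
       CONVERT(datetime, [Timestamp] / 1440.0 - 1) AS [Timestamp],
       Answered, Holds, Dropped, Waits, Voicemail, Busied, LiveWaits
FROM mTraffic
ORDER BY [Timestamp] DESC;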


Changing Path Of Data && Log Files On The Fly

Dec 4, 2004

We have a SQL Server set up as a publisher to 15 subscribers. We need to change the path of the data & log files to a new drive (we added a new hard disk). We plan to take a cold backup of the database and shift the data & log files to the new drive. Then we just attach the data & log files from the new path.

Will this disturb my existing replication setup?
Is this the correct procedure for changing the path of the existing data & log files?
What is the appropriate method for shifting the data & log files of a live database to a different location (directory/drive)?

thanks in advance
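A sketch of the detach/move/attach sequence (the database name and paths are hypothetical). One caution: sp_detach_db will refuse to detach a database that is published for replication, so publishing has to be dealt with first:

USE master;
EXEC sp_detach_db @dbname = N'MyDatabase';
-- Move MyDatabase.mdf and MyDatabase_log.ldf to the new drive, then:
CREATE DATABASE MyDatabase
ON (FILENAME = N'E:\SQLData\MyDatabase.mdf'),
   (FILENAME = N'E:\SQLLogs\MyDatabase_log.ldf')
FOR ATTACH;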


Changing The Structure Of Data In A Table

Feb 20, 2004

I have a table that looks like this:

ID Type
123 Phone
123 Meeting
123 Phone


and I would like the data to look like this

ID phone Meeting
123 2 1


How do I do this?
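A conditional-aggregation sketch over the table shown (the table name is assumed; ID and Type follow the example):

SELECT ID,
       SUM(CASE WHEN [Type] = 'Phone'   THEN 1 ELSE 0 END) AS Phone,
       SUM(CASE WHEN [Type] = 'Meeting' THEN 1 ELSE 0 END) AS Meeting
FROM dbo.Activity
GROUP BY ID;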


Historical Reporting On Changing Data

Apr 27, 2008

I've got a customer who wants reproducible/historical reporting. The problem is that the underlying data changes.

I tried to explain that this can't be done (can it?), but he doesn't
understand.

To illustrate the situation - Let's say a teacher wants to track
spelling test scores for her students.
The below are scores for students A, B, and C (for January, February, March)

A: {70,80,85}
B: {70,65, 80}
C: {100,90,100}

So, I can generate a historical report that charts the class average
and student trend - that's pretty easy.

Now, in April, we find that the school board has mandated that the
British spelling of words is ok, so now the cumulative scores (for
January, February, March, April)

A: {90,80,85,100}
B: {80,65, 80,80}
C: {100,90,100,75}

He wants a report showing the January average as (70+70+100)/3 = 80,
when really it is (90+80+100)/3 = 90.

Now imagine that there are actually thousands of data points changing like this...
Now also imagine that we add and remove students on a regular basis...

He and his office manager get frustrated when I explain that the reports are not simple; in their minds it is simple. They have determined the solution is to get a report writer and buy Crystal Reports... I've tried to explain that the problem is that the report specification is unclear (basically, they don't understand what they want). The situation is OK for now; I'm just trying to plan for when they figure out that buying Crystal Reports won't change their situation (except they'll be down several thousand dollars)...

Any tips?







