Help On Updating 1.3 Million Rows On The Production Server

May 4, 2000

I need to update about 1.3 million rows in one of my tables.
I am getting the data from one of the columns of the same table and
updating a new column.
I am doing this using a cursor, which I have put in a stored procedure.
This is a production table that users might be accessing; it is a
web-based application and I can't slow the system down,
so I am planning to run the stored procedure during off-peak hours.
However, do I need to put this in a transaction?
If I did put it in a transaction, what isolation level should I
opt for?
Data integrity is very important to me and I don't mind compromising
on performance.
I am doing this because the column that holds the "short description"
entry has become too small for business purposes, and we want to increase its
length from varchar(100) to varchar(150).
As this is SQL 6.5, I can't increase the length of the column,
so I added a new column and will run the stored proc.
What precautions should be taken?
This is high priority and very important too.

Thanks in advance...

Stored procedure code:

USE DB_Registration_Dev
GO
IF EXISTS (SELECT NAME FROM SYSOBJECTS WHERE NAME = 'usp_update_product' AND TYPE = 'P')
    DROP PROCEDURE usp_update_product
GO
CREATE PROC usp_update_product
AS
DECLARE @short_desc varchar(100)
DECLARE @prod_id int

-- Walk every product row and copy short_description into the new, wider column
DECLARE sdesc_curs CURSOR
FOR
SELECT [Product].[product_id], [Product].[short_description]
FROM Product

OPEN sdesc_curs

FETCH NEXT FROM sdesc_curs
INTO @prod_id, @short_desc

WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE Product
    SET [Product].[sdesc] = @short_desc
    WHERE product_id = @prod_id

    FETCH NEXT FROM sdesc_curs
    INTO @prod_id, @short_desc

    IF @@FETCH_STATUS <> 0
        PRINT ' Finished updating the table... go ahead and have fun...! '
END

CLOSE sdesc_curs       -- was missing: close the cursor before deallocating it
DEALLOCATE sdesc_curs
GO
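For comparison, here is a set-based alternative that avoids the row-by-row cursor entirely. This is a sketch only: it assumes the new column is named sdesc as in the procedure above and starts out NULL, and it uses SQL 6.5's SET ROWCOUNT so each batch commits as its own short transaction - which also answers the transaction question, since no long-running transaction or special isolation level is needed when each small batch is atomic on its own.

DECLARE @rows int
SELECT @rows = 1

SET ROWCOUNT 10000
WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION
    UPDATE Product
    SET sdesc = short_description
    WHERE sdesc IS NULL
      AND short_description IS NOT NULL   -- skip unfillable rows so the loop terminates
    SELECT @rows = @@ROWCOUNT
    COMMIT TRANSACTION
END
SET ROWCOUNT 0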




Updating A Column In A Table That Contains 50 Million Rows

Feb 27, 2008

I'm looking for some performance assistance on updating a column value in a table that contains approximately 50 million rows. I have a permanent table in another database that has the key column and the value to be set. My query is listed below, but I'm afraid it will run quite a while. Any suggestions would be appreciated.

update a   -- target the alias; naming mytable here would add a second, unjoined reference to the table
set column2 = b.column2
from mytable as a
join mytable1 as b
on a.column1 = b.column1



There is a one to one relationship between the two tables.
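A batched sketch of the same update, under stated assumptions: column2 in mytable starts out NULL and should only be filled where the source value is non-NULL, so the WHERE clause shrinks with every pass (UPDATE TOP needs SQL Server 2005 or later):

WHILE 1 = 1
BEGIN
    UPDATE TOP (50000) a
    SET column2 = b.column2
    FROM mytable AS a
    JOIN mytable1 AS b ON a.column1 = b.column1
    WHERE a.column2 IS NULL
      AND b.column2 IS NOT NULL   -- skip unfillable rows so the loop terminates

    IF @@ROWCOUNT = 0 BREAK
END

An index on mytable1 covering column1 and column2 keeps each batch seeking instead of scanning.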


SQL Server 2012 :: Updating 25 Million Records In Batches

Nov 10, 2014

I have 2 tables with this schema

CREATE TABLE tableValues(
[LASTENCRYPTIONDT] [datetime] NULL,
[ENCRYPTIONID] [int] NULL,
[NAME] [varchar](50) NULL

[Code] ....

I want to update tableToUpdate in batches of 5000 rows per batch, setting LastEncryptionDT to NULL based on a join to tableValues on the ENCRYPTIONID column, and also output the updated rows into another table in case I need to do a rollback.
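A sketch of that shape: UPDATE TOP with an OUTPUT clause that captures the old values. The audit table name is illustrative; the columns come from the schema above:

CREATE TABLE dbo.UpdatedRowsAudit (
    ENCRYPTIONID        int      NULL,
    OldLastEncryptionDT datetime NULL,
    UpdatedAt           datetime NOT NULL DEFAULT GETDATE()
)

WHILE 1 = 1
BEGIN
    UPDATE TOP (5000) t
    SET LASTENCRYPTIONDT = NULL
    OUTPUT deleted.ENCRYPTIONID, deleted.LASTENCRYPTIONDT, GETDATE()
        INTO dbo.UpdatedRowsAudit (ENCRYPTIONID, OldLastEncryptionDT, UpdatedAt)
    FROM tableToUpdate AS t
    JOIN tableValues AS v ON v.ENCRYPTIONID = t.ENCRYPTIONID
    WHERE t.LASTENCRYPTIONDT IS NOT NULL

    IF @@ROWCOUNT = 0 BREAK
END

A rollback is then an UPDATE joining tableToUpdate back to dbo.UpdatedRowsAudit on ENCRYPTIONID.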


End 35 Million Conversations Quickly - Production Issue!

Dec 15, 2006

We have a system that has 35 million conversations piled up. We didn't know to explicitly end each conversation once its processing completed. Oops. Now our production box has 35 million of them sitting in the table, and we have run into the problem where the contents of sys.conversation_endpoints have exceeded memory and are spilling into tempdb, which is killing our disk space and bringing the box down. We have fixed the code to end conversations going forward, but we now have to end the existing ones in a hurry. If we select them one by one out of the table and end each via END CONVERSATION, it is slow. Very slow. It will finish in a few months. :(

Does anyone know how to get rid of these conversations in a hurry? All of the messages have been applied to our system, so killing the conversations will (should) have no effect on the processed data. Is there something like a TRUNCATE statement for this?
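There is no TRUNCATE equivalent for conversations, but two options are worth knowing; the sketch below is hedged, not tested against this system. END CONVERSATION ... WITH CLEANUP removes both endpoints without going through the normal end-dialog handshake, which is legitimate here since every message has already been processed. The last resort is ALTER DATABASE ... SET NEW_BROKER, which drops every conversation in the database at once.

DECLARE @handle uniqueidentifier

DECLARE handles CURSOR LOCAL FAST_FORWARD FOR
    SELECT conversation_handle FROM sys.conversation_endpoints

OPEN handles
FETCH NEXT FROM handles INTO @handle
WHILE @@FETCH_STATUS = 0
BEGIN
    END CONVERSATION @handle WITH CLEANUP   -- skips the end-dialog protocol
    FETCH NEXT FROM handles INTO @handle
END
CLOSE handles
DEALLOCATE handles

-- Nuclear option: drops ALL conversations and assigns a new broker id,
-- which breaks any routing that references the old broker id:
-- ALTER DATABASE YourDb SET NEW_BROKER WITH ROLLBACK IMMEDIATE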

Thank you so much in advance,

John Hennesey




SQL Server 2012 :: Update Statement Will Not Update Data Beyond 7 Million Plus Rows Out Of 38 Millions Rows

Dec 12, 2014

I ran the following statement and it will not update beyond 7 million plus rows, out of the roughly 38 million I have to complete. I keep checking the updated row counts, and after half a day they are still the same, so I know something is wrong, because it was rolling through with no problem when I initiated it. I need to complete this ASAP, so it's adding to my frustration. (The 'Acct_Num_CH' field is an encrypted field, FYI.)

SET rowcount 10000
UPDATE [dbo].[CC_Info_T]
SET [Acct_Num_CH] = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'
WHERE [Acct_Num_CH] IS NOT NULL
WHILE @@ROWCOUNT > 0
BEGIN
SET rowcount 10000
UPDATE [dbo].[CC_Info_T]
SET [Acct_Num_CH] = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'
WHERE [Acct_Num_CH] IS NOT NULL
END
SET rowcount 0
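For what it's worth, that WHERE clause never shrinks: rows already set to the constant still satisfy [Acct_Num_CH] IS NOT NULL, so each pass can revisit rows it has already touched and @@ROWCOUNT never reaches 0. A converging sketch (assuming every non-null value is meant to end up as that constant):

WHILE 1 = 1
BEGIN
    UPDATE TOP (10000) [dbo].[CC_Info_T]
    SET [Acct_Num_CH] = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'
    WHERE [Acct_Num_CH] IS NOT NULL
      AND [Acct_Num_CH] <> 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'   -- exclude rows already done

    IF @@ROWCOUNT = 0 BREAK
END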


Updating An ASPNETDB On The Production Server

Jan 29, 2007

Group,
 I have an ASP.NET web application using the default ASPNETDB that I built in VWD 2005 and deployed to a production server (IIS).
 I need to make a table addition to the production database (ASPNETDB). I'm trying to use SQL Management Studio to attach the database, but it keeps saying the database is locked by a process. I shut down all the services that would likely be using it, but it's still showing locked. What do I need to shut down so I can attach the database and make the change?
 Or is there a better way to do it?
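One guess, offered as a sketch: ASPNETDB.mdf is typically held open by a SQL Server Express user instance spawned on behalf of the ASP.NET worker process, so stopping IIS (or the site's app pool) and the 'SQL Server (SQLEXPRESS)' service normally releases the file. After that, attaching by script works; the path below is illustrative:

CREATE DATABASE ASPNETDB
ON (FILENAME = 'C:\Inetpub\wwwroot\MySite\App_Data\ASPNETDB.MDF')
FOR ATTACH_REBUILD_LOG   -- rebuilds the log if the .ldf is missing or still locked

Remember to detach again (or work on a copy) before the web application needs the file back.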


SQL Server 2014 :: Insert 500 Million Rows Into In-memory Table

Jul 29, 2014

I am doing performance testing of the In-Memory OLTP option in SQL Server 2014. As part of this, I want to insert 500 million rows into an in-memory enabled test table I have created.

I need a sample script to insert 500 million records into a table ....
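A sketch of one way to generate that volume: a cross-join row generator inserting one million rows per batch, five hundred times. Table and column names are illustrative; point it at your memory-optimized table:

DECLARE @batch int = 0

WHILE @batch < 500
BEGIN
    ;WITH ten      AS (SELECT n FROM (VALUES (1),(2),(3),(4),(5),(6),(7),(8),(9),(10)) v(n)),
          thousand AS (SELECT 1 AS n FROM ten a, ten b, ten c),        -- 1,000 rows
          million  AS (SELECT 1 AS n FROM thousand a, thousand b),     -- 1,000,000 rows
          nums     AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n FROM million)
    INSERT INTO dbo.InMemTest (Id, Payload)
    SELECT n + @batch * 1000000,
           'row ' + CAST(n + @batch * 1000000 AS varchar(12))
    FROM nums

    SET @batch += 1
END

Each batch is a single one-million-row transaction; with a SCHEMA_AND_DATA table, keep an eye on memory consumption as the row count climbs.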


Updating 4 Million Records

Aug 30, 2006

Meg writes "Hi,

I have a table that has 4+ million records. I need to update those records. I am facing some performance issues. Can someone please advise?

update stage
set batch_status = 1
where update_status = 0

update [transaction]   -- transaction is a reserved word, so bracket it
set aId = s.aId,
    b = s.b
from stage s
where s.aId = [transaction].aId
  and s.batch_status = 1

update stage
set update_status = 1,
    batch_status = 2
where batch_status = 1
When I run the above with "set rowcount 1000", it runs in one minute. When I run it with "set rowcount 10000", it runs in 1 hour 56 minutes. Can someone help me optimize it?

Thanks.
Meg"


Transact SQL :: Updating A Table With 45 Million Records

Jul 21, 2015

I am trying to update a large table of 45 million records, and the update is taking more than 2 days. Below is my approach:

1. The table has only one clustered index and no other indexes.
2. I am updating in batches of 20000 records.
3. I changed the recovery model to bulk-logged, the auto-growth increment is set to 300MB, and there is enough space on my disk for the transaction log.

But the query is still running slowly.
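A common cause is that each batch re-scans the table to find its next 20,000 rows. A sketch that walks the clustered key in ranges instead, so every batch is a single seek (assumes an int clustered key named Id; the SET clause is a placeholder):

DECLARE @from int = 0, @step int = 20000, @max int
SELECT @max = MAX(Id) FROM dbo.BigTable

WHILE @from <= @max
BEGIN
    UPDATE dbo.BigTable
    SET SomeColumn = 0                          -- the real update goes here
    WHERE Id > @from AND Id <= @from + @step

    SET @from += @step
END

Note that bulk-logged recovery does not reduce logging for UPDATE statements; its benefit is limited to bulk operations such as index rebuilds and bulk loads.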


500 Million Rows Of Data?

Apr 9, 2008

I'm new to using a DB and have a few questions about what I'm trying to do. I have some historical options data and want to place it into a SQL Express database. (I understand I might need to use a non-Express version once the DB gets too big.) A month's worth of data is over 5.5 million rows, so six years' worth is ~400 million rows. Is it possible to put this into a SQL DB and be able to search it very fast? I have a month's worth in a DB now and it is pretty slow. Should I use a new table for each month, and thus have 6 years * 12 months = 72 tables, to increase the search speed? I search by date and stock symbol, and the data looks like this:
 Date, Stock_Symbol, Option_Symbol, Strike, BidPrice, AskPrice, Volume, OpenInterest, (and a few others)
The select statement is simple: SELECT * FROM Options WHERE Date = @Date AND StockSymbol = @Symbol
Thanks
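For that exact query shape, a composite clustered index usually beats splitting the data into 72 monthly tables, because every lookup becomes a single range seek. A sketch using the names from the query above:

CREATE CLUSTERED INDEX CIX_Options_Date_Symbol
ON Options ([Date], StockSymbol)

With that in place, one table holding 400 million rows is routine; per-month tables just recreate partitioning by hand and complicate every query that spans months.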


Copying 4 Million Rows Every Day

Mar 21, 2000

In our database we have a very large table that gets updated every morning. The start of the day involves copying 4 million rows from the previous date to today's date within the same fact table, and then some other processing. It takes 1.5 to 2 hours to do this. There is a DTS package that copies these rows into a temp table and then into the fact table.

This table has more than 200 million rows.

Any ideas on how to accomplish this without copying the data twice and without running into locking problems?
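A single-step sketch that skips the temp-table hop entirely, assuming the fact table is keyed by a date column (all names and dates below are illustrative):

DECLARE @Yesterday datetime, @Today datetime
SELECT @Yesterday = '20000320', @Today = '20000321'

-- Copy yesterday's rows directly to today's date in one pass
INSERT INTO FactTable (FactDate, Col1, Col2 /* , ... */)
SELECT @Today, Col1, Col2 /* , ... */
FROM FactTable
WHERE FactDate = @Yesterday

Whether this avoids blocking depends on indexing and isolation level, but reading yesterday's range while inserting today's rows normally touches disjoint parts of the table.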

Thanks for any suggestions.


Deleting 8 Million Rows From Database

Jul 16, 2013

I am deleting 8 million rows from my database, and I am wondering how to control the T-log. I have also heard something about row locks and table locks.
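A sketch of the standard pattern: small batches keep each transaction (and its lock count) small, so escalation to a table lock is unlikely and the log space can be reused between batches (table name and filter are illustrative):

WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.MyTable
    WHERE SomeDate < '20120101'     -- the real purge condition goes here

    IF @@ROWCOUNT = 0 BREAK

    -- Under SIMPLE recovery a checkpoint lets the log space be reused;
    -- under FULL recovery, take log backups while this runs instead.
    CHECKPOINT
END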


Query Optimization Having 10 Million Rows

Feb 27, 2015

I have the following table:

table name : emp_master

empid efname emname elamane efathername emothername deptno edob edoj createdby updateby lastupdatedatetime lastactionperformed

empid is the primary key.

This table contains 20 million records, and I want to run the following query against it to get all data for employees with more than 10 years in the company:

select empid, efname, emname, elamane, efathername, emothername, deptno, edob, edoj, createdby, updateby, lastupdatedatetime, lastactionperformed
from emp_master
where year(edoj) + 10 > year(getdate())

This returns approx 10 million rows and takes 18 minutes. How should I tune this query, and what approaches would reduce the execution time?
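Two things stand out, sketched below: the year(edoj) wrapper is non-sargable (no index on edoj can be used for a seek), and returning 10 million wide rows will dominate the runtime regardless. Rewriting the filter against the bare column at least allows a seek; this assumes the comparison really is meant against the date of joining:

CREATE INDEX IX_emp_master_edoj ON emp_master (edoj)

SELECT empid, efname, emname, elamane, efathername, emothername,
       deptno, edob, edoj, createdby, updateby,
       lastupdatedatetime, lastactionperformed
FROM emp_master
WHERE edoj > DATEADD(YEAR, -10, GETDATE())   -- matches the year() predicate above, modulo year boundaries

When half the table qualifies, though, a scan may still win; trimming the column list or paging the results will buy more than any index.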


Which Task Is Best For Loading 1 Million Rows

Mar 24, 2008

Hi,

I have to load 1 million rows from a database (or flat file) into a database (or flat file).
Which task is the best solution for this?

Appreciate any assistance in this regard.

Thanks,
Das
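For the flat-file-to-table direction without SSIS, BULK INSERT is usually the fastest single statement; inside SSIS, the equivalents are a Data Flow with a fast-load OLE DB destination, or the Bulk Insert Task. A sketch with an illustrative path and table:

BULK INSERT dbo.TargetTable
FROM 'C:\data\rows.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK,                  -- enables minimal logging into a heap
    BATCHSIZE       = 100000
)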


SQL Server 2012 :: Copy A Table With 200 Million Rows To Another Table On Same Server

Aug 11, 2014

I need to use a BULK INSERT statement to copy a table with 200 million rows to another table on the same server. The table has no primary key or identity column. I need a script for the BULK INSERT ...
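One note, with a sketch: BULK INSERT reads from a file, not from another table, so for a same-server table-to-table copy the usual minimally logged route is INSERT...SELECT with a TABLOCK hint (assumes SIMPLE or BULK_LOGGED recovery and a heap target):

INSERT INTO dbo.TargetTable WITH (TABLOCK)
SELECT *
FROM dbo.SourceTable

The route that actually uses BULK INSERT is to bcp the table out to a file and BULK INSERT it back in, which mainly pays off when the copy crosses servers.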


Missing 1 Million Rows During BCP PROCESS!! Urgent Help

Mar 12, 1999

Hi, I used /e in my bcp code, yet I did not get all the rows from the mainframe into the SQL tables. Here is the case: I have 11 million rows in a file on an FTP server, and I use the code below to bcp it into SQL Server. Can anyone check whether this code is good for the process? I am missing one million rows in the bcp process and do not know why. I put in /e to see if there was any error, but could not find any error file on my hard drive.
Please check it out and let me know

regards
Ali


Exec master..xp_cmdshell "bcp dbname..tablename in c:\ftproot\Nbtorder.txt /fd:\ftproot\formatfile\tablename.fmt /Servername /Usa /Password /b250000 /a8000 /eerrfile\ORD"
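Two bcp behaviors that drop rows quietly, for reference: by default bcp aborts after 10 errors (/m 10), and with /b each committed batch survives even if a later batch aborts, so a partial load can look successful. A sketch of the same call with the error cap raised and an output log added (paths illustrative):

Exec master..xp_cmdshell "bcp dbname..tablename in c:\ftproot\Nbtorder.txt /fd:\ftproot\formatfile\tablename.fmt /Servername /Usa /Password /b250000 /a8000 /m1000000 /ec:\errfile\ORD.err /oc:\errfile\ORD.out"

The .err file then lists every rejected row, and the .out file records bcp's own messages, including the batch in which an abort happened.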


DataWarehouse - 140 Million Rows - Load Performance Issue

Mar 10, 2000

Need help. I am managing a data warehouse (80 GB database size). I purge data older than 6 months from a table of more than 140 million rows on a daily basis. The daily data load performance is degrading. The table has no clustered index (only non-clustered indexes).

I tried dropping and rebuilding the non-clustered indexes; it didn't work.

One way to solve the problem is to drop the non-clustered indexes, bcp out the data, truncate the table, bcp the data back in, and rebuild the non-clustered indexes. This is too risky, and it takes 14 hours just to bcp out the data.

This was not an issue in SQL Server 6.5, because 6.5 always inserted new records at the end of the heap (heap = table with non-clustered indexes but no clustered index). In contrast, SQL Server 7.0 first checks for available space in existing pages using its free-space information, and this is what is killing the performance.
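One structural fix, sketched with an assumed date column name: a clustered index on the load/purge date makes the daily load an append at the end of the table and turns the 6-month purge into a contiguous range delete at the front.

CREATE CLUSTERED INDEX CIX_Fact_LoadDate ON FactTable (LoadDate)

-- The purge then walks one contiguous range of the clustered index
DELETE FROM FactTable
WHERE LoadDate < DATEADD(mm, -6, GETDATE())

Building the clustered index on 140 million rows is itself a large one-time operation, so it belongs in a maintenance window.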

Thanks for your help!


Checking To See If A Records Exists Before Inserting - 3 Million + Rows

Aug 21, 2007

I have 1+ CSV files (using a foreach loop) which I'm doing a lot of transform work on and then inserting into a SQL database table.
Each CSV file usually contains about 2 days' worth of data (it contains date stamps) - somewhere in the region of 60k records per day.
The destination table currently contains 3 million+ rows and will get bigger.
I need to make sure that before inserting into the destination table, the data doesn't already exist.

I've read the following article: http://www.sqlis.com/311.aspx
While the lookup method works, it takes ages and eats up memory as it caches the 3m+ records before running for each CSV. Obviously this will only get worse as the table grows in size.

To make things a little more efficient what I'd like to do, is first derive the dates I'm dealing with in the current file - essentially storing the max(date) and min(date) in variables. Then in the lookup SQL use those vars, to reduce the amount of data that needs to be brought into the transformation to check against before inserting into the destination table.
Lookup SQL eg. SELECT * FROM MyTable WHERE Date BETWEEN varMinDate AND varMaxDate.

Ideally I'd use an aggregate transformation and then use the subsequent output from that either in the lookup query or store the output in vars, but I don't think you can do that and I get the feeling I'm approaching this with the wrong mindset.
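A staging-table variant of that idea, as a sketch (table and column names are illustrative): land each CSV in a staging table, compute the date window in T-SQL, and let a range-limited anti-join do the existence check.

DECLARE @minDate datetime, @maxDate datetime
SELECT @minDate = MIN([Date]), @maxDate = MAX([Date]) FROM dbo.StagingCsv

INSERT INTO dbo.Destination (KeyCol, [Date], Payload)
SELECT s.KeyCol, s.[Date], s.Payload
FROM dbo.StagingCsv AS s
WHERE s.[Date] BETWEEN @minDate AND @maxDate
  AND NOT EXISTS (SELECT 1
                  FROM dbo.Destination AS d
                  WHERE d.KeyCol = s.KeyCol
                    AND d.[Date] = s.[Date])

An index on Destination ([Date], KeyCol) keeps the NOT EXISTS probe a seek as the table grows, without caching 3m+ rows in the package.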

Any thoughts would be great!


How Do You Handle Diagrams And Updating To Production?

Apr 29, 2008

I am using SQL Server 2005 Express, and the table diagrams and everything work great locally. However, pretty soon I'll have to create the same database structure on the production server (hosted remotely at DiscountASP.NET), and I'm not quite sure of the best way to go about that. I only need to copy over the data for a few tables (ones with enumerated values, for instance); for the rest I only need to create the table structure (what's the point in copying over test users, for instance).

Also, any future modifications to the database structure will need to be deployed to the server along with code updates. How do you all handle this? Do you keep an Excel file of database modifications that you run when you perform an update? Or is there an easier way?

Thanks!
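The common alternative to a spreadsheet of changes is a folder of numbered, re-runnable change scripts applied in order on each deploy. A minimal sketch (object names illustrative):

-- 004_add_status_table.sql: safe to run more than once
IF NOT EXISTS (SELECT 1 FROM sys.tables WHERE name = 'OrderStatus')
BEGIN
    CREATE TABLE dbo.OrderStatus (
        StatusId   int          NOT NULL PRIMARY KEY,
        StatusName nvarchar(50) NOT NULL
    )

    INSERT INTO dbo.OrderStatus (StatusId, StatusName)
    VALUES (1, N'Open')
END

Because each script is idempotent, the same folder can be replayed against a fresh production database or an out-of-date one and both end up at the current schema.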


DB Engine :: Deleting 1 Million Records From Transaction Table Of 10 Million Data On 24/7 Environment

Jun 12, 2015

I have a requirement to delete 1 million records from a table holding 10 million, and the table is queried on a 24/7 basis (we don't have a downtime window). How can I achieve that?
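A sketch of the usual pattern for live tables: delete in small batches with a short pause between them, so concurrent readers and writers can acquire locks in the gaps, and each batch stays well below the lock-escalation threshold (names and filter are illustrative):

DECLARE @cutoff datetime
SELECT @cutoff = '20140101'            -- illustrative purge boundary

WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.Txn
    WHERE CreatedDate < @cutoff

    IF @@ROWCOUNT = 0 BREAK

    WAITFOR DELAY '00:00:01'           -- let waiting sessions through
END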


Compare Int To Varchar, Import 1.5 Million Rows (was 2 Simple Problem)

Feb 1, 2005

1) I wrote a select statement:

select a.columname1, b.columname1
from table1 a, table2 b
where a.columname2 = b.columname3

How do I compare
a.columname2 <--> an int column
with b.columname3 <--> a varchar column?

How should I use the convert function?

2) What's the best way to import 1.5 million rows?
How about a text file?
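For (1), a sketch converting the int side so the comparison is type-consistent:

select a.columname1, b.columname1
from table1 a
join table2 b
  on convert(varchar(12), a.columname2) = b.columname3

Converting the varchar side to int instead would let an index on the int column be used, but it fails if any value in columname3 isn't numeric. For (2), a text file plus BULK INSERT (as sketched earlier on this page) is the standard route for 1.5 million rows.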


Transact SQL :: Query To Update A Table With More Than 150 Million Rows Of Data?

Sep 17, 2015

I have been tasked with writing an update query to update a table with more than 150 million rows of data. Here are the table structures:

Source Tables :

OC
CREATE TABLE [dbo].[OC](
[OC] [nvarchar](255) NULL,
[DATE DEBUT] [date] NULL,
[DATE FIN] [date] NULL,
[Code Article] [nvarchar](255) NULL,
[INSERTION] [nvarchar](255) NULL,

[Code] ....

The update requirement is as follows:

DECLARE @Counter INT=0 --This causes the @@rowcount to be > 0
while @@rowcount>0
BEGIN
    SET rowcount 10000
    update r
    set Comp=t.Comp

[Code] ....

The update took more than 48 hours and didn't terminate. How can I accelerate it?
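When most rows of a 150-million-row table need the new value, rebuilding can beat updating in place: SELECT...INTO is minimally logged, after which the tables are swapped. A sketch under assumed names (r is the target, t supplies Comp as in the loop above); it needs a brief maintenance window for the rename:

SELECT r.KeyCol,
       t.Comp,                  -- the new value replaces the old column
       r.OtherCol1, r.OtherCol2
INTO dbo.R_new
FROM dbo.R AS r
JOIN dbo.T AS t ON t.KeyCol = r.KeyCol

EXEC sp_rename 'dbo.R', 'R_old'
EXEC sp_rename 'dbo.R_new', 'R'
-- recreate indexes and constraints on dbo.R, then drop dbo.R_old

If an in-place batch loop is kept instead, make sure its WHERE clause excludes rows already updated, or the loop will revisit the same rows and never finish.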


AFTER INSERT Trigger Takes Forever On A Large Table (20 Million Rows)

Aug 30, 2007

I have a table that is being used to log track plays on our website.

Here's the table:


CREATE TABLE [dbo].[Music_BandTrackPlays](
[ListenDate] [datetime] NOT NULL DEFAULT (getdate()),
[TrackId] [int] NOT NULL,
[IPAddress] [varchar](20)
) ON [PRIMARY]


There's a CLUSTERED INDEX on ListenDate ASC and a NON CLUSTERED INDEX on the TrackId.

I have a TRIGGER on the Music_BandTrackPlays table that looks like the following:


CREATE TRIGGER [trig_Increment_Music_BandTrackPlays_PlayCount]
ON [dbo].[Music_BandTrackPlays] AFTER INSERT
AS
UPDATE
Music_BandTracks
SET
Music_BandTracks.PlayCount = Music_BandTracks.PlayCount + TP.PlayCount
FROM
(SELECT TrackId, COUNT(*) AS PlayCount
FROM inserted
GROUP BY TrackId) AS TP
WHERE
Music_BandTracks.TrackId = TP.TrackId


When a simple INSERT statement is done on the Music_BandTrackPlays table, it can take quite a long time. When I remove the TRIGGER, the INSERTs are immediate. The execution plan for the TRIGGER shows that an 'Inserted Scan' is taking up most of the resources.

How exactly is the pseudo 'inserted' table formed?

For now, I think the easiest thing to do is update my logging page so it performs 2 queries: one to UPDATE the Music_BandTracks table and increment the counter, and a separate INSERT into the Music_BandTrackPlays table.

I'm OK with that solution, but I would really like to understand why the TRIGGER is taking so long. The 'inserted' pseudo table will be 1 row 99% of the time. Does SQL Server perform a table scan on all 20 million rows in order to determine what's new and put it in the inserted pseudo table?
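For what it's worth, the inserted pseudo table is materialized from the rows the statement itself just inserted (in SQL Server 2005 it comes from the row version store), not by scanning the base table, so the cost is more likely on the UPDATE side: if Music_BandTracks.TrackId isn't indexed, the trigger's join scans Music_BandTracks on every insert. A sketch, assuming no such index exists yet:

CREATE UNIQUE INDEX IX_Music_BandTracks_TrackId
ON Music_BandTracks (TrackId)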

Thanks!


Transact SQL :: Adding A Column To A Large (100 Million Rows) Table With Default Constraint?

Apr 24, 2013

IF NOT EXISTS (SELECT TOP 1 1 FROM dbo.syscolumns WHERE id = OBJECT_ID(N'dbo.Employee') AND name = 'DoNotCall')
BEGIN
    ALTER TABLE [dbo].[Employee] ADD [DoNotCall] bit NOT NULL CONSTRAINT DoNot_Call_Default DEFAULT 0
    IF ( @@ERROR <> 0 )
        GOTO QuitWithRollback
END

It just takes a LOT of time in SQL Server Management Studio. I have to cancel the query, and cancelling takes a whole lot of time too. I am using SQL Server 2008.
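On SQL Server 2008, adding a NOT NULL column with a default physically updates all 100 million rows inside one transaction, which is exactly the long, hard-to-cancel operation described (the metadata-only version of this ALTER arrived later, in SQL Server 2012 Enterprise Edition). A sketch of the usual workaround: add the column as nullable, backfill in batches, then tighten it:

ALTER TABLE dbo.Employee ADD DoNotCall bit NULL

WHILE 1 = 1
BEGIN
    UPDATE TOP (50000) dbo.Employee
    SET DoNotCall = 0
    WHERE DoNotCall IS NULL

    IF @@ROWCOUNT = 0 BREAK
END

ALTER TABLE dbo.Employee ADD CONSTRAINT DoNot_Call_Default DEFAULT 0 FOR DoNotCall
ALTER TABLE dbo.Employee ALTER COLUMN DoNotCall bit NOT NULL

The final ALTER COLUMN still scans to validate that no NULLs remain, but it does not rewrite every row, so each step stays short and cancellable.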


Add Rows To A DataSet Without Updating The MS SQL Server?

Jan 9, 2006

I am using ASP.NET 2.0 WebForms, and I was trying to use a DataSet to add rows programmatically without adding the actual records to the MS SQL Server database. Is this possible, or should I be doing this another way?
DataSet myDS = new DataSet();
DataTable myTable = new DataTable("table1");
myTable.Columns.Add("col1", typeof(string));
myDS.Tables.Add(myTable);
myTable.Rows.Add("MyValue");
Thanks.


Updating A Production Database With Back Up Of Development Database

May 11, 2007

Can someone provide a step-by-step tutorial for this? I'd like to safely update a database that is used for a website, with little or no downtime.
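Not a full tutorial, but a sketch of one low-downtime shape: restore the development backup alongside production under a new name, verify it, then repoint the site's connection string. Names and paths are illustrative:

RESTORE DATABASE MySite_Staging
FROM DISK = 'C:\backups\MySite_dev.bak'
WITH MOVE 'MySite_Data' TO 'C:\data\MySite_Staging.mdf',
     MOVE 'MySite_Log'  TO 'C:\data\MySite_Staging_log.ldf'

One caution: restoring a dev backup replaces data wholesale, so this shape fits schema-plus-reference-data databases; where production holds live user data, script the schema changes instead of restoring over them.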


Updating Multiple Rows

Mar 17, 2008

How would I update a table where id = a list of ids? Do I need to parse the string idList? I am being passed a comma-separated string of int values from a Flex application, for example: 1,4,7,8. Any help much appreciated. Barry

[WebMethod]
public int updateFirstName(String toUpdate, String idList)
{
    SqlConnection con = new SqlConnection(connString);

    try
    {
        con.Open();
        SqlCommand cmd = new SqlCommand();
        cmd.Connection = con;
        cmd.CommandText = "UPDATE tb_staff SET firstName = @firstName WHERE id = @listOfIDs";

        SqlParameter firstName = new SqlParameter("@firstName", toUpdate);
        SqlParameter listOfIDs = new SqlParameter("@listOfIDs", idList);

        cmd.Parameters.Add(firstName);
        cmd.Parameters.Add(listOfIDs);

        int i = cmd.ExecuteNonQuery();
        con.Close();
        return 1;
    }
    catch
    {
        return 0;
    }
}
  

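Yes - a single parameter can't serve as an IN list, so the list must be parsed somewhere. One server-side sketch for SQL Server 2005/2008 that splits the CSV with XML (procedure name is illustrative; table and columns are taken from the code above):

CREATE PROCEDURE dbo.UpdateFirstNameForIds
    @firstName nvarchar(50),
    @idList    varchar(max)          -- e.g. '1,4,7,8'
AS
BEGIN
    DECLARE @x xml
    SET @x = CAST('<i>' + REPLACE(@idList, ',', '</i><i>') + '</i>' AS xml)

    UPDATE s
    SET firstName = @firstName
    FROM tb_staff AS s
    JOIN (SELECT n.value('.', 'int') AS id
          FROM @x.nodes('/i') AS t(n)) AS ids
      ON ids.id = s.id
END

The C# side then passes the raw '1,4,7,8' string as @idList, and the set-based UPDATE handles all the ids in one statement.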

Updating 2 Or More Rows At The Same Time

Jul 23, 2004

Hi,

I am working on a SQL Server table designed by a partner company and cannot change the data structures. There is a table with a list of people available for calling out to a security system (keyholders).

From a web form, I need to allow my users to change the telephone calling order of the keyholders in the table.

The two important fields are AccountCode nVarChar and CallOrder nVarChar - where AccountCode + CallOrder must be unique.

As an example, the table may contain records with the following data..

1234, 1, Fred
1234, 2, Bert
1234, 3, Bob

If the user wants to make Bob the number 1 keyholder, Fred number 2, and Bert number 3 - what is the best practice for me to approach the update?

Is this a job for ADO.Net or T-SQL ?
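A T-SQL sketch of one way around the unique (AccountCode, CallOrder) pair during a reshuffle: move the account's rows to temporary, non-colliding order values first, then assign the final ones, all inside one transaction. The table and name column are assumed; CallOrder is treated as the nvarchar it is:

BEGIN TRANSACTION

-- Step 1: park every row on a temporary value so no final value collides
UPDATE Keyholders SET CallOrder = 'T' + CallOrder WHERE AccountCode = '1234'

-- Step 2: assign the new calling order
UPDATE Keyholders SET CallOrder = '1' WHERE AccountCode = '1234' AND KeyholderName = 'Bob'
UPDATE Keyholders SET CallOrder = '2' WHERE AccountCode = '1234' AND KeyholderName = 'Fred'
UPDATE Keyholders SET CallOrder = '3' WHERE AccountCode = '1234' AND KeyholderName = 'Bert'

COMMIT TRANSACTION

Either ADO.NET or T-SQL can drive this; the part that matters is that the whole reshuffle commits atomically.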

Thanks in advance.

Steve.


Updating/Inserting Rows

Jul 16, 2002

Does anybody have a sample SQL script that will compare table A to table B, such that if a row exists in both table A and table B, it updates the columns in table B with the columns from table A, and if the row does not exist in table B, it inserts a row into table B using the row from table A? Is this possible?
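A sketch of the classic two-step upsert (this is 2002-era SQL Server, well before MERGE existed). It assumes a key column named id and a single data column; extend the column lists as needed:

-- Update rows that exist in both tables
UPDATE b
SET b.col1 = a.col1
FROM TableB AS b
JOIN TableA AS a ON a.id = b.id

-- Insert rows that exist only in table A
INSERT INTO TableB (id, col1)
SELECT a.id, a.col1
FROM TableA AS a
WHERE NOT EXISTS (SELECT 1 FROM TableB AS b WHERE b.id = a.id)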

Thanks Mitch


Updating Grouped Rows...

Jul 9, 2004

I am new here, and I am sure this is a simple query, but I'm being force-fed database chores at my job, so I have to teach this stuff to myself and get help from places like this.

I need help with a query. Let's say that there are columns a, b, c, d, e, f, g. If columns c, d, e are the same, then I want the info in column g changed to the info in column b from the first record of that group.

The reason I am doing this: I have like items (SKUs) grouped in my database, and I want to create a blanket part number for SKUs that have matching descriptions (the information in columns c, d, e). I want to link them to the part number of the first product with that description, and add that part number in a new column at the end of each grouped SKU's record.

This is what I start out with:

a b c d e f g
2 4 5 6 9 8
2 5 5 6 9 9
2 7 5 6 0 5
1 2 3 4 5 6
1 3 3 4 5 7
1 4 3 4 5 8
1 5 3 4 5 9

I want to end up with:
a b c d e f g
2 4 5 6 9 8 4
2 5 5 6 9 9 4
2 7 5 6 0 5 7
1 2 3 4 5 6 2
1 3 3 4 5 7 2
1 4 3 4 5 8 2
1 5 3 4 5 9 2
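A sketch of a single-statement version, assuming 'the first record of that group' means the row with the lowest b within each (c, d, e) group (which matches the desired output above) and that the table is named skus:

UPDATE s
SET g = grp.first_b
FROM skus AS s
JOIN (SELECT c, d, e, MIN(b) AS first_b
      FROM skus
      GROUP BY c, d, e) AS grp
  ON grp.c = s.c AND grp.d = s.d AND grp.e = s.e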


Updating N Rows At Once - Best Practice (again)

Aug 14, 2006

I have put this thread on as a follow-up discussion to some concepts that surfaced here: http://www.dbforums.com/showthread.php?t=1208316, here: http://www.dbforums.com/showthread.php?t=1604936, and also here: http://www.dbforums.com/showthread.php?t=1606555

The gist is: is it ever correct excusable advisable to pass a non-normalised csv list of data to a stored procedure? When I first asked this I thought it was simply an implementation issue. Pat thinks it is, but he also holds an ideological objection too. If I seem a little obsessed with this topic it is simply that a) I don't want to hijack yet another thread, b) I am.

Anyway - I PMed Pat about this and he (quite rightly) suggested I post publicly. So here we go.

I think I have struggled a little getting my head round the concepts you are using, and here (I think) is why: normalisation starts with a logical model and ends with the physical implementation of the database (as I see it...). Relational theory pervades these two but extends further, to SQL and database objects (views, sprocs etc).

I work exclusively (with the exception of some legacy stuff I simply must update) with the sproc database-API methodology. As such, my databases are abstracted from the other tiers. Now - I think this is the crux of it for me - I typically think of the sproc API as another tier (perhaps tier 1.5). Although the database itself and the objects that manipulate it are housed by a single application, I don't actually think of the sprocs as part of the database. They are dependent upon it and inextricably intertwined with it, but they are not a part of it in my eyes. As such, I don't worry about producing denormalised output, and I have never (up until you got me thinking about it) worried about accepting non-normalised input. The sproc is not the database; the sproc is just a code procedure that acts upon the database. So if it accepts a typeless, denormalised input but subsequently types (validates) it and normalises it before it makes any changes to the data within the database, then I struggle to see the scale of the problem.

I am writing as something of a stickler for normalisation myself - I think that rather than querying how closely one should adhere to normalisation principles, I am pondering at what point these principles come into play.

As a follow-up question: if SQL Server sprocs were to accept (for example) a table variable as an argument, would this meet your requirement (since the data is verified and normalised before it gets to the sproc), or would you require a solution like this to be an ANSI standard too?

So - that's the nub of it. Comments welcome on:
a) the practicality of passing csv lists to sprocs and subsequently parsing them, and
b) how "correct" or "incorrect" this is.

Blindman and Pat both object on both counts (I think). Thrasmachus didn't have a problem with it (however, to be fair, that was a very early response to a specific question I asked). I don't hold a particularly strong opinion either way - in fact the "loop through the changes at the client and fire the sproc n times" route is probably much easier to code in any case - I'm just interested in the idea that it violates relational theory in some way.
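A historical footnote with a sketch: the thing the follow-up question wishes for arrived as table-valued parameters in SQL Server 2008, after this post was written, so typed, pre-normalised input can reach the sproc without any CSV parsing (names illustrative):

CREATE TYPE dbo.IdList AS TABLE (id int NOT NULL PRIMARY KEY)
GO
CREATE PROCEDURE dbo.FlagRows
    @ids dbo.IdList READONLY        -- table-valued parameters must be READONLY
AS
    UPDATE t
    SET flag = 1
    FROM dbo.SomeTable AS t
    JOIN @ids AS i ON i.id = t.id
GO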


Updating Rows Of A Column

Apr 10, 2008

Hi, I need to update column week14 in table PastWeeks with the data from Eng_Goal and with the result of a calculation over table AverageEngTime, in the rows Goal and Used respectively. I used the following and was not successful. Please advise. Thank you.

Update Pastweeks
set
week14 = (SELECT Eng_Goal,((Mon_Day + Mon_Night + Tue_Day + Tue_Night + Wed_Day + Wed_Night + Thu_Day + Thu_Night + Fri_Day + Fri_Night + Sat_Day + Sat_Night + Sun_Day + Sun_night)* 100/168) FROM AverageEngTime where Shifts = 'Average')
where Weeks = ('Goal','Used' )
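That fails for two reasons: the subquery returns two values for a single assignment, and the row filter needs IN (or separate statements) rather than = ('Goal','Used'). A sketch of the split version, using the names from the post:

UPDATE PastWeeks
SET week14 = (SELECT Eng_Goal FROM AverageEngTime WHERE Shifts = 'Average')
WHERE Weeks = 'Goal'

UPDATE PastWeeks
SET week14 = (SELECT (Mon_Day + Mon_Night + Tue_Day + Tue_Night
                    + Wed_Day + Wed_Night + Thu_Day + Thu_Night
                    + Fri_Day + Fri_Night + Sat_Day + Sat_Night
                    + Sun_Day + Sun_night) * 100 / 168
              FROM AverageEngTime
              WHERE Shifts = 'Average')
WHERE Weeks = 'Used'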


Need Help In Updating Multiple Rows

Jul 23, 2005

Hi, new to writing SQL scripts. I get this error in my SQL script:

Server: Msg 512, Level 16, State 1, Line 1
Subquery returned more than 1 value. This is not permitted when the subquery follows =, !=, <, <=, >, >= or when the subquery is used as an expression.
The statement has been terminated.

I want to write a single SQL script which will update column 1 (FK) in table A with column 1 (PK) in table B.

Here's an example of what I need to do:

Table DATA_A
------------
column C1 (Foreign Key)
11
11
11
22
22
22
33
33
33
33
44
55

Table DATA_B
------------
column C1 (Primary Key)   Column C2   Column C3
11                        ABC         NULL
12                        ABC         2004-12-12
22                        EFG         NULL
23                        EFG         2003-12-12
33                        HIJ         NULL
34                        HIJ         2003-12-12
44                        KLM         2005-02-02
55                        JJJ         NULL

I need to update Table DATA_A so that column C1 rows holding 11 point to Table DATA_B column C1 value 12. Currently, the problem is that Table DATA_A column C1 is pointing to the wrong primary key, which has NULL data in column C3. I need to point to the correct primary key, the one with the date filled in column C3. The two primary keys are tied together by column C3.

Here's my SQL script:

UPDATE DATA_A SET C1 =
    (SELECT C1 FROM DATA_B
     WHERE C2 IN
         (SELECT B1.C2 FROM DATA_B B1
          WHERE EXISTS (SELECT * FROM TABLE_B B2 WHERE B2.C3 IS NOT NULL)
            AND EXISTS (SELECT * FROM TABLE_B B2 WHERE B2.C3 IS NULL)
            AND B2.C2 = B1.C2
          GROUP BY B1.C2
          HAVING COUNT(B1.C2) = 2)
       AND C3 IS NOT NULL)
WHERE
    (SELECT C1 FROM DATA_B
     WHERE C2 IN
         (SELECT B1.C2 FROM DATA_B B1
          WHERE EXISTS (SELECT * FROM TABLE_B B2 WHERE B2.C3 IS NOT NULL)
            AND EXISTS (SELECT * FROM TABLE_B B2 WHERE B2.C3 IS NULL)
            AND B2.C2 = B1.C2
          GROUP BY B1.C2
          HAVING COUNT(B1.C2) = 2)
       AND C3 IS NULL)

Thanks - been struggling with this for a while.
MLR
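For reference, a join-based sketch of the same intent, which sidesteps the Msg 512 problem by pairing each NULL-dated row with its dated partner through C2 (table names as in the post; note the script above also mixes DATA_B and TABLE_B):

UPDATE a
SET C1 = good.C1
FROM DATA_A AS a
JOIN DATA_B AS bad  ON bad.C1 = a.C1   AND bad.C3 IS NULL
JOIN DATA_B AS good ON good.C2 = bad.C2 AND good.C3 IS NOT NULL

Against the sample data this moves 11 to 12, 22 to 23, and 33 to 34, while 44 (already dated) and 55 (no dated partner) are left untouched.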







