Transact SQL :: Updating A Table With 45 Million Records

Jul 21, 2015

I am trying to update a large table which consists of 45 million records, and it is taking more than 2 days to do the update. Below is my approach:

1. The table has only one clustered index and no other indexes on the table.
2. I am updating in batches of, say, 20,000 records at a time.
3. Changed the recovery model to bulk-logged; auto-growth is set to 300 MB and there is enough disk space for the transaction log.

But the query is still running slowly.
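A minimal sketch of the batching pattern, assuming a hypothetical table dbo.BigTable and column SomeColumn (all names here are placeholders, not the poster's schema). Each batch commits separately, so under bulk-logged or simple recovery the log can clear between batches:

DECLARE @Rows INT = 1;
WHILE @Rows > 0
BEGIN
    BEGIN TRANSACTION;

    UPDATE TOP (20000) dbo.BigTable
    SET    SomeColumn = 'new value'
    WHERE  SomeColumn <> 'new value';   -- predicate must exclude already-updated rows

    SET @Rows = @@ROWCOUNT;
    COMMIT TRANSACTION;
END

It is also worth checking sys.dm_exec_requests while the update runs, to see whether it is actually working or waiting (for example on log growth or locks).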

View 10 Replies



DB Engine :: Deleting 1 Million Records From Transaction Table Of 10 Million Data On 24/7 Environment

Jun 12, 2015

I have a requirement to delete 1 million records from a table holding 10 million rows, and the table is queried on a 24/7 basis (we don't have a downtime window). How can I achieve that?
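One hedged approach is to trickle the delete in small batches with a pause between them, so each batch holds locks only briefly. A sketch, assuming the rows to remove are identified by a hypothetical CreatedDate cutoff:

DECLARE @Rows INT = 1;
WHILE @Rows > 0
BEGIN
    DELETE TOP (2000)
    FROM   dbo.Tran10M
    WHERE  CreatedDate < '20140101';

    SET @Rows = @@ROWCOUNT;
    WAITFOR DELAY '00:00:01';   -- let concurrent readers and writers through
END

Keeping each batch well under about 5,000 rows also avoids lock escalation to a table lock.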

View 13 Replies View Related

Updating 4 Million Records

Aug 30, 2006

Meg writes "Hi,

I have a table that has 4+ million records. I need to update those records. I am facing some performance issues. Can someone please advise?

update stage
set batch_status = 1
where update_status = 0

update [transaction]
set aId = s.aId,
    b = s.b
from stage s
where s.aId = [transaction].aId
and s.batch_status = 1

update stage
set update_status = 1,
    batch_status = 2
where batch_status = 1
When I run the above with "set rowcount 1000", it runs in one minute. When I run it with "set rowcount 10000", it runs in 1 hour 56 minutes. Can someone help me optimize it?

Thanks.
Meg"

View 4 Replies View Related

SQL Server 2012 :: Updating 25 Million Records In Batches

Nov 10, 2014

I have 2 tables with this schema

CREATE TABLE tableValues(
[LASTENCRYPTIONDT] [datetime] NULL,
[ENCRYPTIONID] [int] NULL,
[NAME] [varchar](50) NULL

[Code] ....

I want to update tableToUpdate in batches of 5000 per batch, setting lastEncryptionDT to NULL based on the join to tableValues on the ENCRYPTIONID column, and also output the updated rows into another table, in case I need to do a rollback.
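A hedged sketch of the batch loop with the OUTPUT audit, assuming tableToUpdate also carries ENCRYPTIONID and that the audit table mirrors the captured columns (the audit table name is a placeholder):

DECLARE @Rows INT = 1;
WHILE @Rows > 0
BEGIN
    UPDATE TOP (5000) u
    SET    u.LASTENCRYPTIONDT = NULL
    OUTPUT deleted.ENCRYPTIONID, deleted.LASTENCRYPTIONDT   -- pre-update values, for rollback
    INTO   dbo.tableToUpdate_Audit (ENCRYPTIONID, LASTENCRYPTIONDT)
    FROM   dbo.tableToUpdate AS u
    JOIN   dbo.tableValues   AS v ON v.ENCRYPTIONID = u.ENCRYPTIONID
    WHERE  u.LASTENCRYPTIONDT IS NOT NULL;   -- finished rows drop out of later batches

    SET @Rows = @@ROWCOUNT;
END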

View 3 Replies View Related

Updating A Column In A Table That Contains 50 Million Rows

Feb 27, 2008

I'm looking for some performance assistance on updating a column value in a table that contains approximately 50 million rows. I have a permanent table in another database that has the key column and the value to be set. My query is listed below, but I'm afraid it will run quite a while. Any suggestions would be appreciated.

update a
set column2 = b.column2
from mytable as a
join mytable1 as b
on a.column1 = b.column1



There is a one to one relationship between the two tables.

View 8 Replies View Related

Indexing A Table With 80 Million Records

Mar 26, 2004

I have a directory database with approx. 80 million records. I am feeding the database with BULK INSERT. Indexing one of the fields took about 8 hrs. After indexing, when I run queries on the indexed field, the response time is under 1 sec. However, if I run SELECT queries with LIKE on non-indexed fields, it takes more than 2 mins. So I decided to index 4 other fields in the database, and it looks like the indexing process is going to run for 2 days.
I am a novice in SQL database design and I am not sure if this is the best way to index the table; I am just using CREATE INDEX. Any suggestions/advice welcome.
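Two things worth knowing before committing to 2 days of builds. First, LIKE patterns with a leading wildcard ('%term%') cannot seek a B-tree index at all, so indexing those fields may not help those queries. Second, for fields that can benefit, sorting in tempdb can speed up large builds by moving the sort work off the data files. A hedged sketch in SQL Server 2000 syntax (field and index names are placeholders; on 2005+ the option is written WITH (SORT_IN_TEMPDB = ON)):

CREATE NONCLUSTERED INDEX IX_Directory_City
ON dbo.Directory (City)
WITH SORT_IN_TEMPDB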

View 5 Replies View Related

SQL 2012 :: 1.5 Million Records Into Temp Table

Sep 23, 2014

I come from a web-based world where loading 1.5 million records into a temp table is suicide. I'm doing more data warehouse stuff now, and I was looking into optimizing a buddy's proc when I noticed he was loading 1.5 million records into a temp table. We had a discussion about it because, coming from the web world, I was drastically against it. He, on the other hand, didn't feel it was an issue, since it gets called once, maybe twice a day. The tempdb is set to autogrow and it is on a different drive than all the other databases on the box. It has one ldf and one mdf. He's creating an index on the table after the load. Why shouldn't we be loading 1.5 million recs into a temp table?

View 5 Replies View Related

Need Suggestion On Loading A 50 Million Records Table From Oracle

Feb 16, 2006

All,

I need to load a 50-million-record table monthly. Any suggestions about the best/fastest way to do it?

Thanks a lot

View 2 Replies View Related

T-SQL (SS2K8) :: Table With 3 Million Plus Records Taking Half A Minute?

Aug 6, 2015

I have a table that I need to do some computations on, but first I need to remove the duplicate records and insert the results into a destination table. Here's the example below. My table has 3.1 million rows. I have tried using both DISTINCT and GROUP BY, but either way selecting the data takes about half a minute to run. I'm wondering if there is a way to increase performance. Users are OK with this time, since the process runs overnight, but improving it wouldn't hurt. I do have a clustered index on these fields, but that doesn't seem to help.

SELECT DateYear,
       DateMonth,
       Nbr,
       Nbr1,
       Nbr2,
       Datafield1,
       Datafield2,

[code].....

View 7 Replies View Related

Transact SQL :: Query To Update A Table With More Than 150 Million Rows Of Data?

Sep 17, 2015

I have been tasked with writing an update query to update a table with more than 150 million rows of data. Here are the table structures:

Source Tables :

OC
CREATE TABLE [dbo].[OC](
[OC] [nvarchar](255) NULL,
[DATE DEBUT] [date] NULL,
[DATE FIN] [date] NULL,
[Code Article] [nvarchar](255) NULL,
[INSERTION] [nvarchar](255) NULL,

[Code] ....

The update requirement is as follows:

DECLARE @Counter INT = 0  -- makes @@ROWCOUNT > 0 so the loop is entered
WHILE @@ROWCOUNT > 0
BEGIN
    SET ROWCOUNT 10000
    UPDATE r
    SET Comp = t.Comp

[Code] ....

The update ran for more than 48 hours and didn't terminate. How can I accelerate it?
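SET ROWCOUNT for data modification statements is deprecated; the supported form of the same batching is UPDATE TOP. A hedged sketch (the join is truncated in the post, so the table and key names here are assumptions):

DECLARE @Rows INT = 1;
WHILE @Rows > 0
BEGIN
    UPDATE TOP (10000) r
    SET    r.Comp = t.Comp
    FROM   dbo.R AS r
    JOIN   dbo.T AS t ON t.KeyCol = r.KeyCol
    WHERE  r.Comp <> t.Comp
       OR (r.Comp IS NULL AND t.Comp IS NOT NULL);   -- only rows still needing the change

    SET @Rows = @@ROWCOUNT;
END

It is also worth confirming the join columns are indexed on both sides; without that, every batch rescans 150 million rows.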

View 6 Replies View Related

SQL 2012 :: Snapshot Getting Corrupted After Insert Update Few Million Records Into A Table

Mar 12, 2015

We are facing a weird scenario in which the snapshot is getting corrupted after inserting/updating a few million records into a table.

SQL Server 2012
windows server 2008 R2
service pack 1
64-bit OS

View 1 Replies View Related

Transact SQL :: Adding A Column To A Large (100 Million Rows) Table With Default Constraint?

Apr 24, 2013

IF NOT EXISTS (SELECT TOP 1 1 FROM dbo.syscolumns WHERE id = OBJECT_ID(N'dbo.Employee') AND name = 'DoNotCall')
BEGIN
    ALTER TABLE [dbo].[Employee] ADD [DoNotCall] bit NOT NULL CONSTRAINT DoNot_Call_Default DEFAULT 0
    IF ( @@ERROR <> 0 )
        GOTO QuitWithRollback
END

It just takes a LOT of time in SQL Server Management Studio. I have to cancel the query, and cancelling takes a whole lot of time too. I am using SQL Server 2008.
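On SQL Server 2008, adding a NOT NULL column with a default has to physically write every one of the 100 million rows inside a single transaction, which is why it runs so long (SQL Server 2012 Enterprise later made this a metadata-only change). A hedged workaround sketch: add the column as NULL (instant), backfill in batches, then tighten the constraint:

ALTER TABLE dbo.Employee ADD DoNotCall bit NULL
    CONSTRAINT DoNot_Call_Default DEFAULT 0
GO
DECLARE @Rows INT = 1
WHILE @Rows > 0
BEGIN
    UPDATE TOP (50000) dbo.Employee
    SET    DoNotCall = 0
    WHERE  DoNotCall IS NULL
    SET @Rows = @@ROWCOUNT
END
GO
ALTER TABLE dbo.Employee ALTER COLUMN DoNotCall bit NOT NULL
GO

The final ALTER COLUMN still has to validate every row, but the batched backfill keeps each transaction small.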

View 4 Replies View Related

Transact SQL :: Paging Of Records Which Are Keep Updating

May 7, 2015

I would like to use the following code for querying summary records with paging.

DECLARE @PageNumber AS INT, @RowspPage AS INT
SET @PageNumber = 1
SET @RowspPage = 10
SELECT * FROM (
SELECT ROW_NUMBER() OVER(ORDER BY create_date) AS NUMBER,
* FROM summary
) AS TBL
WHERE NUMBER BETWEEN ((@PageNumber - 1) * @RowspPage + 1) AND (@PageNumber * @RowspPage)
ORDER BY create_date

Paging is implemented for fast response, since the data pool is very large (up to 10,000,000 rows).

The above query works fine in testing. However, in production, since new records keep being inserted into the table, I have a concern about the accuracy of viewing the next page of results.

E.g. at 12:00pm, the result of page 1 (5 per page) is
R20, R19, R18, R17, R16

After 2 mins, at 12:02pm, the user presses the next-page button.
Since records R21, R22, R23, R24, R25, R26 have been inserted,
the page 2 result would be R21, R20, R19, R18, R17.

So page 2 shows many records the user has already seen on page 1. Could this situation be improved?
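One hedged option is keyset (seek) paging: instead of a page number, the client passes back the sort key of the last row it saw, so newly inserted rows cannot shift the window. A sketch, assuming create_date plus a unique id column as tiebreaker (the id column name is an assumption):

DECLARE @LastCreateDate DATETIME = '2015-05-07 12:00:00',  -- from the last row of the previous page
        @LastId         INT      = 16;

SELECT TOP (10) *
FROM   summary
WHERE  create_date < @LastCreateDate
   OR (create_date = @LastCreateDate AND id < @LastId)
ORDER BY create_date DESC, id DESC;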

View 2 Replies View Related

Deleting Old Records Is Blocking Updating Latest Records On Highly Transactional Table

Mar 18, 2014

I have a situation where deleting old records is blocking updates of the latest records on a highly transactional table, and we are getting timeout errors from the application.

In detail: I have one table called Tran_table1 in an OLTP database. Tran_table1 is highly transactional; it receives data for insert/update continuously.

While archiving 2-year-old records from Tran_table1 into Tran_table1_archive in batches (using DELETE ... OUTPUT INTO), any UPDATEs on Tran_table1 get blocked, and the result is timeout errors in the application.

Are there any SQL Server hints to avoid the blocking?
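The usual culprit is lock escalation: once a delete batch acquires roughly 5,000 locks, SQL Server may escalate to a table lock, which then blocks every concurrent UPDATE. Two hedged options (the date predicate is assumed, and the archive table is assumed to mirror Tran_table1's columns):

-- 1) keep each archive batch under the escalation threshold
DELETE TOP (2000)
FROM   dbo.Tran_table1
OUTPUT deleted.* INTO dbo.Tran_table1_archive
WHERE  CreatedDate < DATEADD(YEAR, -2, GETDATE());

-- 2) on SQL Server 2008+, stop this table from escalating at all
ALTER TABLE dbo.Tran_table1 SET (LOCK_ESCALATION = DISABLE);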

View 3 Replies View Related

Transact SQL :: Retrieve All Records From Parent Table And Any Records From Child Table

Oct 21, 2015

I am trying to write a query that will retrieve all students of a particular class and also any rows in HomeworkLogLine if they exist (returning NULL when there is no row). I thought this would be a relatively simple LEFT join, but I've tried every possible combination of joins and it's not working.

SELECT
Student.StudentSurname + ', ' + Student.StudentForename AS Fullname,
HomeworkLogLine.HomeworkLogLineTimestamp,
HomeworkLog.HomeworkLogDescription,
ROW_NUMBER() OVER (PARTITION BY HomeworkLogLine.HomeworkLogLineStudentID ORDER BY

[Code] ...

It's only returning two rows (the students who have a row in the HomeworkLogLine table).
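The usual cause is a WHERE filter on the LEFT-joined table, which silently turns the outer join into an inner join because the NULL rows fail the filter. A hedged sketch (table and column names follow the snippet; the join keys, @ClassID, and the timestamp filter are assumptions):

DECLARE @ClassID INT = 1, @WeekStart DATETIME = '20151019';

SELECT s.StudentSurname + ', ' + s.StudentForename AS Fullname,
       hll.HomeworkLogLineTimestamp,
       hl.HomeworkLogDescription
FROM   Student AS s
LEFT JOIN HomeworkLogLine AS hll
       ON hll.HomeworkLogLineStudentID = s.StudentID
      AND hll.HomeworkLogLineTimestamp >= @WeekStart   -- filters on hll belong in ON, not WHERE
LEFT JOIN HomeworkLog AS hl
       ON hl.HomeworkLogID = hll.HomeworkLogID
WHERE  s.ClassID = @ClassID;                           -- filters on the preserved side can stay in WHERE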

View 3 Replies View Related

Help On Updating 1.3 Million Rows On The Production Server

May 4, 2000

I need to update about 1.3 million rows in a table of mine.
I am getting the data from one of the columns of the same table and
updating a new column.
I am doing this using a cursor, which I have put in a stored procedure.
As this is a production table which users might be accessing, and it is a
web-based application, I can't slow the system down.
So I am willing to run the stored procedure during off-peak hours.
However, do I need to put this in a transaction?
If I did put it in a transaction, what type of isolation level should I
opt for?
Data integrity is very important to me and I don't mind compromising
on the performance.
I am doing this because the column holding the "short description"
entry has become too small for business purposes and we want to increase its
length from varchar(100) to varchar(150).
As this is SQL 6.5, I can't increase the length of the column,
so I added a new column and will run the stored proc.
What precautions should be taken?
This is on a high-priority basis and very important too.

Thanks in advance...

Stored procedure code:

USE DB_Registration_Dev
GO
IF EXISTS (SELECT NAME FROM SYSOBJECTS WHERE NAME='usp_update_product' AND TYPE='P')
DROP PROCEDURE usp_update_product
GO
CREATE PROC usp_update_product
AS
DECLARE @short_desc varchar(100)
DECLARE @prod_id int

DECLARE sdesc_curs CURSOR
FOR
SELECT [Product].[product_id] , [Product].[short_description]
FROM Product

OPEN sdesc_curs

FETCH NEXT FROM sdesc_curs
INTO @prod_id, @short_desc

WHILE @@FETCH_STATUS = 0
BEGIN
UPDATE Product
SET [Product].[sdesc] = @short_desc
WHERE Product_id=@prod_id
FETCH NEXT FROM sdesc_curs
INTO @prod_id, @short_desc
IF @@FETCH_STATUS <> 0
PRINT ' Finished Updating the table...go ahead and have fun ...! '
END
CLOSE sdesc_curs
DEALLOCATE sdesc_curs
GO
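For what it's worth, this row-by-row cursor can be replaced by a single set-based statement, which also answers the transaction question, since one UPDATE is atomic on its own. A minimal sketch:

UPDATE Product
SET sdesc = short_description

On SQL 6.5 the main precaution is transaction log space, since the whole 1.3-million-row update then runs as one transaction; the cursor version, by contrast, commits row by row.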

View 1 Replies View Related

Updating Records From Another Table

Jun 9, 2005

I'm having a bit of a problem putting together a query that will update records in one table from another table. I've got 2 tables, let's call them tblA and tblB. In tblA there are EID (int), QID (int) and OID (int), and in tblB, OID (int) and QID (int). Also there is an input parameter @EID. What I want to happen is: when someone inputs @EID, tblB gets updated from tblA. To give you a heads up, there are no PKs or FKs in either of the tables. If there is an OID in tblA, it takes the QID from tblA and places it in tblB where the OIDs from tblA and tblB match. Hopefully this makes sense. I thought that I could do something like this:

CREATE PROCEDURE test3 (@EID int)
AS
UPDATE tblUsed
SET QuestionID = (select QuestionID from tblExamQuestions where ExamID = @EID)
WHERE OrderID = (select OrderID from tblExamQuestions where ExamID = @EID)
GO

But I get an error that says there are too many records being returned by the subqueries.

Winston
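The error arises because each scalar subquery can return more than one row when @EID matches several rows in tblExamQuestions. A hedged sketch of the usual fix: correlate the two tables in a joined UPDATE so each row gets the QuestionID for its own OrderID (names follow the posted proc):

CREATE PROCEDURE test3 (@EID int)
AS
UPDATE u
SET    u.QuestionID = eq.QuestionID
FROM   tblUsed AS u
JOIN   tblExamQuestions AS eq ON eq.OrderID = u.OrderID
WHERE  eq.ExamID = @EID
GO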

View 11 Replies View Related

Updating Records In A Table

Nov 29, 2006

Hi there!!

I have a table, say Emp, with an index on (Empid), which is an auto-increment field.

How do I update the names of already existing rows with a known Empid?

The complexity lies in the fact that I cannot use the Emp table as a destination table, because if I do so it will insert new rows and Empid will change.
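If the new names arrive in a staging table, a hedged sketch of an update keyed on Empid (the staging table and column names are made up):

UPDATE e
SET    e.Name = s.Name
FROM   Emp AS e
JOIN   EmpStaging AS s ON s.Empid = e.Empid

Because this is an UPDATE joined on the existing key rather than an insert, no new rows are created and Empid never changes.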



Thanks and Regards

Rahul Kumar, Software Engineer

View 4 Replies View Related

Transact SQL :: Updating Table Set Sent To 1 For All Status IDs

Nov 30, 2015

SID  statusid  listindex  listsize  sent
1    12        25         25        1
2    12        25         50        0
3    12        75         150       1
4    14        25         25        1

I have a table like the one above, where SID is unique but statusid is not. For every statusid I need to set sent to 1, since the files were sent out. Until now I used the max listindex, because files used to be sent sequentially, but now listindex is random, so how do I update sent to 1 for all rows of each statusid?
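If the intent is that every row of a statusid should be flagged once any of its files has been sent, a hedged sketch (the table name is made up, and this reading of the requirement is a guess from the description):

UPDATE t
SET    t.sent = 1
FROM   dbo.StatusList AS t
WHERE  t.sent = 0
AND    EXISTS (SELECT 1 FROM dbo.StatusList AS s
               WHERE s.statusid = t.statusid AND s.sent = 1)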

View 5 Replies View Related

Transact SQL :: Grouping And Updating The Table

Dec 5, 2015

I have a table like this. 

AS-IS

Column1  Column2  count
a        b        20
a        b        25
c        d        12
c        d        22

And I need to update the same as below.

TO-BE

Column1  Column2  count
a        b        45
c        d        34

How to do it?
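Since the target has fewer rows than the source, this is really an aggregate-and-replace rather than an in-place update. A minimal sketch (the table is called Totals here as a placeholder, and [count] is bracketed because COUNT is a reserved word):

SELECT Column1, Column2, SUM([count]) AS [count]
INTO   #rollup
FROM   Totals
GROUP BY Column1, Column2;

TRUNCATE TABLE Totals;

INSERT INTO Totals (Column1, Column2, [count])
SELECT Column1, Column2, [count]
FROM   #rollup;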

View 4 Replies View Related

SQL INSERT 1.6 Million Records

Jan 27, 2006

I am currently working on a simple page to insert 1.6 million UK postcode records into a SQL Server table. The table has three columns for the postcode, longitude coordinate and latitude coordinate. The data is sourced from a pipe (|) delimited txt file and inserted into the database using a FOR loop. The problem I have is that the page will hang after inserting only 10,000 records; the page displays either an invalid View State error or a page-cannot-be-found error.
Now I assume the viewstate error stems from the fact that there is a form on the page which simply contains a button to execute the script and a few labels to show the progress. But without the form and associated viewstate the insert still fails to complete... any ideas? Would I be better running this on a thread, or should I just do it in stages and be patient? I have now modified the page to read the database on load and pick up from where it crashed.
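Rather than row-by-row inserts driven from a page, a hedged sketch using BULK INSERT for the pipe-delimited file (the file path and table name are placeholders):

BULK INSERT dbo.Postcodes
FROM 'C:\data\postcodes.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    TABLOCK                  -- enables minimal logging under bulk-logged/simple recovery
);

This runs entirely on the server, so the page (and its viewstate) never has to survive 1.6 million round trips.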

View 2 Replies View Related

56 Million Records Search

Jul 20, 2005

Hey folks... So I have a table that looks like this:

CREATE TABLE [tblStation] (
[CAMPAIGN] [varchar] (8),
[LISTNUM] [varchar] (10),
[PHONE] [varchar] (10),
[EVENTTIME] [datetime],
[STATION] [int],
[OPERATOR] [varchar] (16),
[EVENTCODE] [varchar],
[CALLSPAN] [decimal](18, 0),
[FDISP] [int],
[RECORDNUM] [varchar],
[STC] [varchar],
[PROMOC] [varchar],
[EXP_CAMP] [varchar],
[PROMO3] [varchar],
[MAXATT] [char],
[LISTNAME] [varchar],
[SITENAME] [char],
[Row_id] [int] IDENTITY
)

It's taking nine seconds to run the following command:

SELECT count([fdisp])
FROM [TrunkFiles_new].[dbo].[tblStation] WITH (NOLOCK)
WHERE fdisp IS NULL

Anyone familiar with a table of this size having performance like this? The [fdisp] column has a non-clustered index on it.

Thanks in advance...
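Two hedged observations. First, COUNT([fdisp]) counts non-NULL values of fdisp, so combined with WHERE fdisp IS NULL it can only ever return 0; COUNT(*) is almost certainly what's wanted. Second, if moving to SQL Server 2008 or later is an option, a filtered index containing only the NULL rows makes this count nearly free (the index name is made up):

CREATE NONCLUSTERED INDEX IX_tblStation_fdisp_null
ON dbo.tblStation (fdisp)
WHERE fdisp IS NULL;

SELECT COUNT(*)
FROM   dbo.tblStation
WHERE  fdisp IS NULL;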

View 1 Replies View Related

Updating Multiple Records In A Single Table?

Sep 3, 2014

I'm trying to update a checkbox from "False" to "True" within a single table for multiple records. I can update a single record using the script below. However, I'm having trouble applying additional IDs to the string.

(Works) - Update Name_Demo set KEY_CONTACT = 'true' where ID = 225249

(doesn't work) - Update Name_Demo set KEY_CONTACT = 'true' where ID = '225249, 210014, 216543'

It says the query executes successfully but returns no rows.
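The quoted version compares ID to the single string '225249, 210014, 216543', which matches no integer ID, hence zero rows. The fix is an IN list of numbers:

UPDATE Name_Demo
SET    KEY_CONTACT = 'true'
WHERE  ID IN (225249, 210014, 216543)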

View 3 Replies View Related

Updating A Specific Amount Of Records In A Table

Jun 20, 2007

I have a table with 35,000 records in it. I want to update a value in Column A for only the first 5,000 records, leaving the value in Column A for the remaining 30,000 records as it is now. What command would I use to update Column A for only the first 5,000 records?
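On SQL Server 2005 or later, UPDATE TOP (5000) would work but picks an arbitrary 5,000 rows; to make "first" deterministic you can order inside an updatable CTE. A hedged sketch (table, key, and column names are placeholders):

;WITH FirstRows AS (
    SELECT TOP (5000) ColumnA
    FROM   dbo.MyTable
    ORDER BY Id              -- defines what "first" means
)
UPDATE FirstRows
SET    ColumnA = 'new value';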

Thanks,

View 4 Replies View Related

DWH Problem: Updating A Table With Every 1000 Records A Checkpoint

Jul 20, 2005

Hi,

Currently we're building a metadata-driven data warehouse in SQL Server 2000. We're investigating the possibility of updating tables with an enormous number of updates and inserts, and the use of checkpoints (for simple recovery, and BACKUP LOG for full recovery). On several websites people say the transaction log fills up because its growth can't keep pace with the update. Therefore we want to create a script which flushes the dirty pages to disk. It's not quite clear to me how this works. Questions we have:

* How does the process of updating, inserting and deleting work in SQL Server 2000 with respect to the log cache, log file, buffer cache, commit, checkpoint, etc.? What happens when?
* As far as I can see now, I'm thinking of creating chunks of 1000 records with a CHECKPOINT after each query. SQL Server runs in autocommit mode by default, so it will not need an explicit COMMIT. Something like this?
* How do I create chunks of 1000 records automatically without creating an identity field or something? Is there something like SELECT NEXT 1000?

Greetz,
Hennie
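On the last question: there is no SELECT NEXT 1000, but a keyset-style loop gets the same effect without an identity column by walking the clustered key through a TOP-limited derived table. A hedged SQL Server 2000 sketch (the table, key column, and the update itself are placeholders):

WHILE 1 = 1
BEGIN
    UPDATE f
    SET    Flag = 1                           -- the real change goes here
    FROM   dbo.FactTable AS f
    JOIN   (SELECT TOP 1000 KeyCol
            FROM   dbo.FactTable
            WHERE  Flag = 0                   -- rows not yet processed (assumes Flag starts at 0)
            ORDER BY KeyCol) AS chunk
      ON   chunk.KeyCol = f.KeyCol

    IF @@ROWCOUNT = 0 BREAK

    CHECKPOINT   -- flush dirty pages; under simple recovery this also lets the log truncate
END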

View 6 Replies View Related

How Well SQL Server Can Support 300 Million Records...

Nov 16, 2001

How well can SQL Server support 300 million records?
Is anybody working on a big database like this? Can anyone give me some input on it? It's going to be 60GB in database size.

View 1 Replies View Related

Fastest Way To Update 20+ Million Records

Mar 19, 2008

Hello,
What is the fastest way to update 20 million records in our database?
I have tried a simple update statement like this:

update trail_log with (tablockx, holdlock)
set trail_log.entry_by = users.user_identity
from users
where trail_log.entry_by = users.user_id

but it takes 10-plus hours to run, since it cannot commit the transaction until the very end. So I was thinking I need to commit in batches, say every 50K rows, but that is slow as well.
Set rowcount 50000
Declare @rc int
Set @rc = 50000
While @rc = 50000
Begin
    Begin Transaction
    update trail_log with (tablockx, holdlock)
    set trail_log.entry_by = users.user_identity
    from users
    where trail_log.entry_by = users.user_id
    and trail_log.entry_by not like '%[0-9]%'
    Select @rc = @@rowcount
    -- Commit the transaction
    Commit
End
go

I have let the above run for 1.5 hours and it only updated 450,000 rows. Any ideas? Maybe I'm doing it wrong. Please help!!

View 1 Replies View Related

Efficiency: 40 Million Records Script.

Oct 12, 2007

Hi all,


I have a SQL script that updates records in a table with 40 million records.

There is some functionality in the script that could be factored out into functions for code reuse/elegance.

But functions would cause execution overhead.

What else could I use besides functions that would give me the code reuse without compromising the execution overhead? Is there anything like includes in T-SQL that would allow me to do so?
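One hedged answer: inline table-valued functions. Unlike scalar or multi-statement UDFs, an inline TVF is expanded into the calling statement's plan, so it gives reuse without per-row call overhead (all names below are illustrative):

CREATE FUNCTION dbo.fn_RowsToFix (@Cutoff datetime)
RETURNS TABLE
AS
RETURN (
    SELECT KeyCol
    FROM   dbo.BigTable
    WHERE  ModifiedDate < @Cutoff
)
GO
-- reused by any statement in the script that needs the same row set:
UPDATE b
SET    b.Processed = 1
FROM   dbo.BigTable AS b
JOIN   dbo.fn_RowsToFix('20071001') AS k ON k.KeyCol = b.KeyCol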

TIA..

View 4 Replies View Related

DB Engine :: Updating Table Fields With Multiple Target Records

Apr 30, 2015

I am trying to multiple update records in Table B from a single record in Table A.  To identify the records that need to be updated I used this:

Select Table1.cfirstname as 'Table 1 First',
       Table1.clastname as 'Table 1 Last',
       Table1.ifamilyid,
       CR.icontactid,
       CR.lLiveswithStudent,
       Table2.cfirstname as 'Table 2 First',
       Table2.cLastName as 'Table 2 Last',
       Table2.ilocationid as 'Table 2 Location',
       Table1.iLocationID as 'Table 1 Location'
from   Table1
join   Table3 CR on Table1.istudentid = CR.istudentid
join   Table2 on CR.icontactid = Table2.iContactID
where  CR.lLivesWithStudent = 1
and    Table1.ifamilyid > 0
and    Table1.ilocationid <> Table2.iLocationID
and    Table1.lCurrent = 1

I need to update the ilocationid from Table 1 on all Table 2 records related to Table 1, but there is no direct relation from Table 1 to Table 2. I needed Table 3 to make the connection from Table 1 to 2.
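A hedged sketch of the update, reusing the same join path so Table 3 bridges the two tables exactly as in the select above:

UPDATE t2
SET    t2.ilocationid = t1.iLocationID
FROM   Table2 AS t2
JOIN   Table3 AS cr ON cr.icontactid = t2.iContactID
JOIN   Table1 AS t1 ON t1.istudentid = cr.istudentid
WHERE  cr.lLivesWithStudent = 1
AND    t1.ifamilyid > 0
AND    t1.ilocationid <> t2.iLocationID
AND    t1.lCurrent = 1;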

View 2 Replies View Related

Free Text Search For 2 Million Records

Apr 23, 2007

Hi

I have a new client with an existing system that has just over 2 million business listings in one table. Each business listing is associated with one business category.

* Company Table (around 20 fields):

companyID
companyName
categoryID
state
postCode
etc.

* Category Table (5 fields)

categoryID
categoryName
etc.

We are using MSSQL 2005 Express Edition with Advanced Services

A free text search needs to be performed on companyName and categoryName, limited by region (state and/or postcode).

1) What kind of response times should I expect for the free text search? (I have not used free text search before.)

2) How should I index the companyName and categoryName so they are both used in a joined query? i.e. Do I just configure the free text search index on each field separately and it should work?
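On question 2: full-text indexes are defined per table, so companyName and categoryName each get their own, and the results can then be combined via CONTAINSTABLE joins. A hedged sketch; the catalog name, key-index names, and search term are placeholders, and each table needs a unique single-column index for the KEY INDEX clause:

CREATE FULLTEXT CATALOG ftCat AS DEFAULT;
CREATE FULLTEXT INDEX ON dbo.Company (companyName) KEY INDEX PK_Company;
CREATE FULLTEXT INDEX ON dbo.Category (categoryName) KEY INDEX PK_Category;
GO
SELECT c.companyID, c.companyName, cat.categoryName
FROM   dbo.Company AS c
JOIN   dbo.Category AS cat ON cat.categoryID = c.categoryID
LEFT JOIN CONTAINSTABLE(dbo.Company,  companyName,  'plumbing') AS fc
       ON fc.[KEY] = c.companyID
LEFT JOIN CONTAINSTABLE(dbo.Category, categoryName, 'plumbing') AS fcat
       ON fcat.[KEY] = cat.categoryID
WHERE (fc.[KEY] IS NOT NULL OR fcat.[KEY] IS NOT NULL)
AND   c.state = 'NSW';

On question 1: it depends heavily on term frequency and hardware, but simple single-term searches over 2 million rows are usually sub-second once the full-text index is fully populated.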

Any suggestions appreciated.

Best Regards

Kevan

View 2 Replies View Related

T-SQL (SS2K8) :: Compare Tables With More Than 4.9 Million Records?

Mar 18, 2014

I want to compare ONLY 1 Column values from 2 tables having more than 4.9 million records. There is a difference of 4000 rows between the 2 tables.

SELECT ID From TABLE1 where ID not in (SELECT DISTINCT ID From TABLE2)

My above query took nearly 4.5 hours to run and I had to cancel it. Is there a better way to write the query? I just want to find the ID column values which are missing in TABLE2.
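NOT IN over a DISTINCT subquery forces a full de-duplication of TABLE2, and if TABLE2.ID is nullable, a single NULL makes the whole NOT IN return no rows at all. A hedged rewrite that typically plans as an efficient anti-semi-join (assuming ID is indexed in both tables):

SELECT t1.ID
FROM   TABLE1 AS t1
WHERE  NOT EXISTS (SELECT 1 FROM TABLE2 AS t2 WHERE t2.ID = t1.ID);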

View 7 Replies View Related

Join 2 Tables With More Then Million Records With 2 Parameters

Apr 8, 2008

Hi
I have 2 tables with more than a million records in each, and I have to perform a full outer join.
The problem is that the join clause contains 2 parameters of different types (int and string), like this:

Select *
From a full outer join b
On a.cli = b.cli OR a.reference = b.reference

Because of the OR in the clause and the million records, the query runs practically forever. If I change it to one rule only, then it works fine.

How can I join these 2 big tables with 2 rules?
Thanks
Itay
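A common hedged rewrite is to decompose the OR so each rule can use its own index, then reassemble the full-outer semantics: matched pairs from two single-rule joins, plus the unmatched rows from each side. A sketch showing the key columns only, assuming indexes on cli and reference in both tables:

SELECT a.cli, a.reference, b.cli AS b_cli, b.reference AS b_reference
FROM   a JOIN b ON a.cli = b.cli
UNION
SELECT a.cli, a.reference, b.cli, b.reference
FROM   a JOIN b ON a.reference = b.reference
UNION ALL
SELECT a.cli, a.reference, NULL, NULL          -- rows of a unmatched under either rule
FROM   a
WHERE  NOT EXISTS (SELECT 1 FROM b WHERE b.cli = a.cli)
AND    NOT EXISTS (SELECT 1 FROM b WHERE b.reference = a.reference)
UNION ALL
SELECT NULL, NULL, b.cli, b.reference          -- rows of b unmatched under either rule
FROM   b
WHERE  NOT EXISTS (SELECT 1 FROM a WHERE a.cli = b.cli)
AND    NOT EXISTS (SELECT 1 FROM a WHERE a.reference = b.reference)

Each join and each NOT EXISTS can seek a single-column index, which the OR-based join cannot.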

View 2 Replies View Related

Fuzzy Grouping: Any Success With > 3 Million Records?

May 18, 2006

I have tried to process > 3 million Fuzzy grouping records on two different servers with no success. 3 mill works but anything above 4 mill doesn't. Some background:

We are trying to de-dup our customer table on: name (.5 min), address1 (.5 min), city (.5 min), state (exact). .8 overall record min score.
Output includes additional fields: customerid, sourceid, address2, country, phonenumber
Without SP1 installed I couldn't even get a few hundred thousand records to process
Two different servers - same problems. Note that SSIS and SQL Server are running locally on both
The higher end server has 4GB RAM, the other 2.5 GB RAM. Plenty of free disk space on both
SQL Server is configured to use 2 GB of RAM max
The page file is currently at 15GB

After running a number of tests on both servers, trying different batch sizes etc., the one thing I noticed is that it always seems to error out when SSIS takes over and starts chewing up all the available RAM. This happens after the index is created and SSIS starts "warming caches". On both servers SQL Server uses up about 1.6GB of RAM at this point, while SSIS keeps taking RAM until all physical RAM is used up.

Some questions:

Has anyone been able to process more than 3 million records, and if so, what is your hardware configuration?
Should we try running SSIS from a different server so it has access to the full amount of physical RAM? (so it doesn't have to fight for RAM with SQL Server)
Should we install Win 2003 Enterprise Server so we can add more RAM?
Any ideas why switching to the page file might be causing errors?

Thanks!!

Keith Doyle





View 17 Replies View Related






