Deleting Massive Data From A Table

Jan 20, 2014

I have to delete a ton of data from a SQL table. Each row has a unique identifier called the version, and I want to delete every row whose version is not in a given list. I tried using the statement below, but learned the hard way that it fails. This is the message I got:

Msg 9002, Level 17, State 4, Line 3...

The transaction log for database 'MonthEnds' is full due to 'ACTIVE_TRANSACTION'.

I was reading about TRUNCATE, but I am not sure whether it applies here or how I would set up the statement.

Delete Products
where version not in ('48459CED-871F-4971-B888-5083990332BC','D550C8D3-58C7-4C74-841D-1C1675F19AE3','C77C7817-3F04-4145-98D3-37BB1610DB35',
'21FE83FA-476D-4604-80EF-2ED57DEE2C16','F3B50B81-191A-4D71-A406-011127AEFBE1','EFBD48E7-E30F-4047-909E-F14DCAEA4181','BD9CCC41-D696-406B-
'C8BEBFBC-D362-4D0F-A555-B281FC2B3023','EFA64956-C2CF-41FC-8E21-F060597DAFCB','77A8DE56-6F7F-4490-8BED-AA6809B947EF','0F4C1E5F-B689-4DCB-

[code]....
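Error 9002 here means the single DELETE ran as one giant transaction and outgrew the log. Batching the delete keeps each transaction, and therefore the log, small. A minimal sketch (batch size assumed; the first two GUIDs are from the post, the rest of the list goes where the comment is):

WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM Products
    WHERE version NOT IN ('48459CED-871F-4971-B888-5083990332BC',
                          'D550C8D3-58C7-4C74-841D-1C1675F19AE3') -- ...plus the rest of the version list

    IF @@ROWCOUNT = 0 BREAK
END

In SIMPLE recovery the log space is reused between batches; in FULL recovery, run log backups while this executes. TRUNCATE TABLE only empties the entire table and takes no WHERE clause, so it does not apply when some versions must be kept.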

View 2 Replies



Delete Records From A Massive Table (heap)

May 21, 2013

So I've stumbled across an audit table on one of our systems that has reached a hearty 180M rows in size.

The table is a heap (no indexes whatsoever).

Each record has a datetime value indicating when it was created.

I need to delete everything that was created prior to the last 6 months; what is my best plan of attack?
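With no indexes, every delete batch would scan the whole heap, so a one-off index on the date column usually pays for itself before the first batch finishes. A hedged sketch (table and column names assumed):

CREATE INDEX IX_Audit_Created ON dbo.AuditLog (CreatedDate)

DECLARE @cutoff datetime
SET @cutoff = DATEADD(MONTH, -6, GETDATE())

WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.AuditLog
    WHERE CreatedDate < @cutoff

    IF @@ROWCOUNT = 0 BREAK
END

If most of the 180M rows are going away, copying the last 6 months into a new table and swapping names is usually faster than any delete.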

View 12 Replies View Related

Trouble Deleting Data In Table

Mar 9, 2007

I have a table that I want to delete some data from, but I get this error:

You might have a record that has a foreign key value related to it, or you might have violated a check constraint.

What to do????
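That message means another table holds a foreign key pointing at the rows being deleted; the children have to go first (or the constraint has to allow the delete). A hedged sketch with hypothetical table names:

DELETE FROM dbo.Orders
WHERE CustomerID = 42   -- child rows first

DELETE FROM dbo.Customers
WHERE CustomerID = 42   -- then the parent rows

The constraint named in the full error message identifies the referencing table, and sp_help on the table lists every foreign key that points at it.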

View 1 Replies View Related

SQL 2012 :: SP For Deleting All Data From A Table

Mar 13, 2014

Is this close to the correct syntax for a stored procedure that deletes all the data from a particular table... or is there a better way?

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE TruncateTmpBank

[Code] ....
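For emptying a whole table, TRUNCATE TABLE is much cheaper than DELETE (minimally logged, resets any identity column), though it fails if foreign keys reference the table. A hedged completion of the procedure, assuming the table is dbo.TmpBank:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE TruncateTmpBank
AS
BEGIN
    SET NOCOUNT ON;
    TRUNCATE TABLE dbo.TmpBank;
END
GO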

View 3 Replies View Related

Data From The Table Deleting Automatically

Feb 13, 2008

Hi gurus,

Data has been getting deleted automatically from one particular table every night since last week. I created a delete trigger to find out what is doing it, but nothing was recorded. There are no jobs except maintenance plans, nothing in the event viewer either, and the database recovery model is simple. How can I solve this problem?
Please advise.

Thanks
Krishna.

View 11 Replies View Related

Deleting Data By Comparing To Another Table

Jul 20, 2005

I have an entry form allowing customers to enter up to 15 skus (product IDs) at a time, so they can place a multiple-item order instead of entering one sku, submitting it, returning to the form to submit the second one, and so forth.

From time to time a sku they enter will be wrong or discontinued, so it will not produce an order. Therefore, when they are done submitting their 15 skus through the order form, I want a list showing them all of the skus that came back blank or were not found in the database.

I'm doing this by creating two tables: a shopping cart, which holds all the skus that were returned, and a holding table, which holds all the skus that were submitted. I then want to delete all the skus in the holding table that match the skus in the cart (because they are good skus), which will leave the unmatched skus in the holding table. I'll then scroll out the contents of the holding table to show them the skus that were not found in the database. (Confused yet?)

So what I want is some SQL that will delete from the holding table where the sku equals the sku in the cart. I've tried writing this, but it doesn't work. I tried:

delete from holding_table where sku = cart.sku

I was hoping this would work, but it doesn't. Is there a way for me to do this?

Thanks!
Bill
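A correlated delete needs the second table brought in through a subquery or join; "cart.sku" on its own has nothing to bind to. A minimal sketch using the table names from the post:

DELETE FROM holding_table
WHERE sku IN (SELECT sku FROM cart)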

View 2 Replies View Related

Deleting Data From Primary Table

Mar 20, 2008

Hi Everybody,

Kindly let me know if there is a way of deleting data from a primary (parent) table without deleting the corresponding data in its foreign key (child) table.

Thanks & Regards
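One option is to stop enforcing the foreign key for the duration of the delete; the child rows survive as orphans and the constraint is left untrusted afterwards. A hedged sketch with hypothetical table and constraint names:

ALTER TABLE dbo.Child NOCHECK CONSTRAINT FK_Child_Parent   -- stop enforcing the FK

DELETE FROM dbo.Parent
WHERE ParentID = 1                                         -- parent rows go, child rows stay

ALTER TABLE dbo.Child CHECK CONSTRAINT FK_Child_Parent     -- re-enable; existing orphans are not validated

Alternatively, recreating the constraint with ON DELETE SET NULL keeps it enforced while still letting parent rows be deleted.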

View 2 Replies View Related

Deleting Data Using Table Prefix

May 30, 2006

I can run a SELECT to retrieve data using an alias 'a' for the table involved. However, when I try to run a DELETE using the same criteria, it fails, telling me:

Msg 102, Level 15, State 1,.......Line 1

Incorrect syntax near 'a'

The Select statement looks like:

select count(*) from schema.table a where a.customer_id=1234

The Delete looks like:

delete from schema.table a where a.customer_id=1234

What am I doing wrong here, and how can I alias the table in the DELETE? The command I actually want to run is much more complicated than the example above, and it needs the alias.
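In T-SQL, a DELETE with an alias repeats the alias between DELETE and FROM. Applied to the statement from the post:

DELETE a
FROM schema.table a
WHERE a.customer_id = 1234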

View 3 Replies View Related

Deleting The Master Table Without Deleting The Child Tables

Aug 9, 2007

Hi
I have to delete the master table data without deleting the child table records. Is there any solution for this? The parent table has a relationship with the child table.
regards
vinod.t.v

View 9 Replies View Related

Massive Data Import, How To Avoid Duplicates?

Dec 4, 2006

Hello,

I am currently working on a project where I have to import a huge amount of data from CSV files into a database.
I don't want duplicate keys in my table, but my CSV files contain them. When a key appears more than once, the line nearer the end of the file carries the more up-to-date information, and that is the version I have to store.

I have been trying to fix this problem for several weeks, but my algorithm is very slow and blocks all other processes on the server. At the moment I copy into a temp table all records that occur more than once in the CSV file. After that I run through this table line by line, check whether the key already exists in the target table, and then either insert or update.

Does somebody know a better process?

I hope somebody can help me... :(
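A set-based de-duplication is usually far faster than the line-by-line check. A minimal sketch, assuming the staging table is dbo.Staging with key column KeyCol and an identity column RowId that preserves file order (so the highest RowId per key is the freshest row):

DELETE s
FROM dbo.Staging s
JOIN dbo.Staging newer
    ON newer.KeyCol = s.KeyCol
   AND newer.RowId > s.RowId      -- a later row with the same key exists, so this one is stale

Once staging holds one row per key, a single set-based UPDATE of matching keys plus a single INSERT of missing keys replaces the row-by-row loop.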

View 5 Replies View Related

Deleting Large Amount Of Data From The Table......

May 18, 2001

I need to delete data from a particular table which has more than half a million records. The data needs to be deleted is more than 200,000 records from the table. What is the best way to delete the data from the table other than importing into a temporary table and performing the same operation?

Let me know if the strategy to be followed is okay.

1. Drop all the triggers
2. Drop all the indexes
3. Write a procedure with a loop that sets ROWCOUNT to 1000 and deletes the records (since deleting all the rows at once gives a timeout error).
The procedure will delete 1000 records per batch inside the loop until all the data for the specified condition is gone.
4. Recreate Indexes and Triggers.

Please let me know if there are any other optimal solutions.

Thanx,
Zombie
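The ROWCOUNT loop in step 3 is sound for this era of SQL Server; a minimal sketch (table name and WHERE condition are stand-ins for the real ones):

SET ROWCOUNT 1000

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable
    WHERE CreatedDate < '20010101'   -- hypothetical condition; substitute your own

    IF @@ROWCOUNT = 0 BREAK
END

SET ROWCOUNT 0   -- always reset, or later statements stay capped at 1000 rows

On steps 1 and 2, keep the index the WHERE clause uses; dropping that one turns every batch into a table scan.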

View 2 Replies View Related

Urgent! Please Help! Deleting Data From A Huge Table

Mar 16, 2004

I have a huge table with a four-column primary key. I need to delete approximately 5.6 million records from this table, and it takes a hell of a lot of time with a normal query.
Can someone please suggest a better way?
Any help will be appreciated.
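When the delete is this large relative to the table, copying the rows to keep and swapping tables can beat any DELETE. A hedged sketch with hypothetical names and condition:

SELECT *
INTO dbo.BigTable_keep
FROM dbo.BigTable
WHERE KeepFlag = 1   -- hypothetical condition selecting the rows to keep

-- recreate indexes, constraints, and triggers on dbo.BigTable_keep here

EXEC sp_rename 'dbo.BigTable', 'BigTable_old'
EXEC sp_rename 'dbo.BigTable_keep', 'BigTable'

SELECT INTO is minimally logged under the right recovery settings, and the old table can be dropped once the swap is verified.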

View 14 Replies View Related

Reduce Table Size Without Deleting Data

Jul 20, 2006

We are using SQL Server 2000 Standard Edition, SP4, on Windows 2000 Advanced Server. We have one table that is three times as large as the rest of the database. Most of the data is static after approximately 3-6 months, but we are required to keep it for 8 years. I would like to archive this table (A), but there are complications:

1. The only way to access the data is through the application (they are images produced by the application, which is built on PowerBuilder).
2. There are multiple tables referencing this table and vice versa.
3. We regularly restore the entire db to two other servers for testing and training.
4. There might be more complications that have not been thought of.

Currently, our only plan is to set up a separate server with a copy of this db and the application on it, leave only the tables necessary to access the data, and, if this 'archive' works, remove from production the data in table A and all references to table A from rows in the other tables.

I mentioned #3 because someone mentioned a third-party tool that may be able to pull the data from the table, archive it elsewhere, and at the same time place a 'pointer' in the table to the new storage location. The tool they mentioned only works on Oracle and we have not explored beyond that yet.

I am ready to explore ideas and suggestions; I am still new to the DBA world and I am out of ideas. Thank you!

View 1 Replies View Related

Massive Bulk Delete / Data Purge Problem

Jan 28, 2008

I've got a large MS SQL Server 2000 database that has 15 indexes, with roughly 180 million rows representing 240 GB worth of data. Due to the massive size of the database we are trying to purge it down to a smaller dataset, about 40 million rows, in order to speed up query performance and to be able to defrag the indexes (which are 30-50% fragmented). To complicate the matter, this table is also published in a transactional replication setup, with one subscriber. Also, the system needs to be up constantly, so I'm only allowed about a 3-5 hour outage window per week.

So far I've tested several delete methods following all the best practices (batch deletes, using indexes in the DELETE's WHERE clause), and have settled on deleting/committing 500 rows at a time. The problem is that it still takes 3-4 seconds to delete this many rows, on an 8 GB RAM, 4-processor machine that is not currently used or replicated.

I'm at a loss for a way to pare down the data with a delete, as the current purge script would take 7 hours a day for about 3 months. Another option I'm considering is to do a truncate and copy the data back over from the replicated database, but this has its own set of problems, i.e. network latency and slow insert times. Yet another option would be to create a replica of the table on the production db, copy the data to it, then rename the table.

Anyone have experience with purging such a massive amount of data? Any help would be greatly appreciated.

View 6 Replies View Related

Deleting Data From Sql 2005 Table But No Resource Released

Mar 7, 2008

I'm working on a simple picture gallery on my web site.
I add my pictures to the table using a binary writer, and I delete them with:

DELETE FROM [Photos]

WHERE [PhotoID] = @PhotoID

This removes the pictures from my table, but after some time I found that my database file size keeps increasing day by day. I'm very confused,
so please tell me: where is the problem?
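Deleting image (LOB) data frees pages inside the file but never shrinks the file itself; the space is kept for reuse, so the file only appears to grow. If the disk space really must be returned to the OS, a hedged sketch for SQL Server 2005 (table and logical file names assumed; find yours with sp_helpfile):

ALTER INDEX ALL ON dbo.Photos REORGANIZE WITH (LOB_COMPACTION = ON)   -- compact the LOB space left by deletes

DBCC SHRINKFILE (MyDatabase_Data, 500)                                -- then release space; target size in MB

Shrinking fragments indexes, so if the gallery will keep growing, leaving the free space in place for reuse is usually the better choice.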

View 4 Replies View Related

Deleting Data In A Linked Table Is Not Supported By This ISAM.

Oct 18, 2007

Hi,

I am trying to delete a row in Excel [Sheet1$]. The data in that row is used in a pivot table in the same workbook [Sheet2$], which should also be updated when the row goes. When I try to delete the row using "delete ... from [Sheet1$]", it throws the error "Deleting data in a linked table is not supported by this ISAM. (Microsoft Office Access Database Engine)".
Can you please guide me in overcoming this error?

Thanks in advance,

Warm Regards,
gchanduu

View 2 Replies View Related

SQL Server 2008 :: Deleting Data (rows) From Table To Reclaim Space?

Feb 11, 2015

I have a table of 300+GB holding 10 years of data. I need to delete 5 years of data and move it to another server so I can have more space.

If I delete 5 years of data, the transaction log gets huge, and the database takes even more disk because the .ldf file keeps growing. I think I can shrink the log file and the data file afterwards. Is this the best way to do it?

View 8 Replies View Related

SQL 2012 :: Data Size Of Table Is Not Reduced Even After Deleting Millions Of Rows

Sep 21, 2015

I have deleted nearly 30 million rows from a table, but when I use the sp_spaceused command to check the space occupied by the table, I don't see any difference in its data size. In fact the reported data size has increased by a few MB after the deletion.
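Two usual suspects: sp_spaceused is reporting stale allocation counts, or the table is a heap, where deletes often leave empty pages allocated. A hedged sketch (table name assumed):

EXEC sp_spaceused 'dbo.BigTable', @updateusage = N'TRUE'   -- refresh the counts before trusting them

ALTER TABLE dbo.BigTable REBUILD                           -- SQL 2008+; deallocates pages a heap hangs on to

On a heap, deleting without a TABLOCK hint frequently leaves empty pages behind; a rebuild (or adding a clustered index) reclaims them.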

View 8 Replies View Related

DB Engine :: Deleting 1 Million Records From Transaction Table Of 10 Million Data On 24/7 Environment

Jun 12, 2015

I have a requirement to delete 1 million records from a table holding 10 million, and the table is queried on a 24/7 basis (we don't have a downtime window). How can I achieve that?

View 13 Replies View Related

Data Mining :: Deleting Old Data From Adventure Works 2012 With Powershell

May 3, 2015

I am trying to delete rows whose ModifiedDate is older than 9 years from tables in the AdventureWorks2012 database. The console notifies me that the foreign keys are dropped, but the DELETE statement is still throwing errors. I am sure that somewhere the key constraints are not getting altered, but I'm not able to figure it out, as I'm a relative beginner to T-SQL. The error and code:

The DELETE statement conflicted with the REFERENCE constraint "FK_SalesOrderHeaderSalesReason_SalesReason_SalesReasonID". The conflict
occurred in database "AdventureWorks2012", table "Sales.SalesOrderHeader
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$option_drop = new-object Microsoft.SqlServer.Management.Smo.ScriptingOptions;
$option_drop.ScriptDrops = $true;

[Code] ....

View 3 Replies View Related

Power Pivot :: Deleting Data Content From Data Model

Sep 10, 2013

I don't know if this question has been nailed down. Aside from deleting tables, can we delete the *content* of the data within the tables? It doesn't seem crazy that, if you can pull in data from a feed, you should be able to remove that content again (without also destroying the user's metadata work). Reasons for this include:

- Security (a user may not have rights to see *my* data and should go refresh their own)
- Size (workbook doesn't need to have GB's of irrelevant data saved to disk in a workbook if it was just useful during development phase to a pre-production data feed)
- Bad data (pre-production data feed is not good data)
- User-friendliness (data feed was refreshed 2 years ago and workbook was saved to file server.  Users shouldn't be presented with irrelevant data, but should get empty pivot tables until they go do their refresh)

Obviously Excel internally knows how to clear out PowerPivot data, given the prompt shown here: [URL] ....

But how does a user initiate this on their own (corruption aside)?

Previous time this question was asked, without a real resolution: [URL] ....

View 8 Replies View Related

Archive Data Instead Of Deleting It To Prevent 4GB Data Limit

Mar 14, 2008

We are running SQL Server 2005 express on Windows 2003. The database server gets significant amounts of data.

Because of the 4GB data limit we have a daily cron task which goes through and deletes data older then 90 days.

We would like a way to archive this data instead of deleting it. Is there any way to take data and compress it and store it in a different way, so that if needed, customers can query directly out from the compressed data? Cleary querying from compressed would be slower but that is ok.

Any other solutions that would allow us to archive data instead of deleting it? Thanks.
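One low-tech option that keeps the archive queryable is a second database (another Express database with its own 4GB budget, or a full edition elsewhere) that the nightly task copies into before deleting. A hedged sketch with hypothetical names:

DECLARE @cutoff datetime
SET @cutoff = DATEADD(DAY, -90, GETDATE())

INSERT INTO ArchiveDb.dbo.Events (EventId, CreatedAt, Payload)
SELECT EventId, CreatedAt, Payload
FROM LiveDb.dbo.Events
WHERE CreatedAt < @cutoff

DELETE FROM LiveDb.dbo.Events
WHERE CreatedAt < @cutoff

Customers then query ArchiveDb directly; wrapping the two statements in one transaction guarantees nothing is deleted that was not first copied.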

View 10 Replies View Related

Massive Inserts

Oct 15, 2004

Currently I have huge amounts of data going into a table.
I'm sending an XML doc and using OPENXML with a cursor to seed it.

The question I have is whether to let duplicate-keyed rows bounce off the key,
check @@ERROR, and then do an UPDATE on the non-key fields,
or
to do a SELECT on the keyed field first and then do an INSERT or UPDATE based on
the SELECT's results.

Speed is my goal.
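A third option is usually fastest: skip the per-row decision entirely and run one set-based UPDATE followed by one set-based INSERT from a staging table. A hedged sketch with hypothetical names:

UPDATE t                                  -- refresh rows whose key already exists
SET t.DataCol = s.DataCol
FROM dbo.Target t
JOIN dbo.Staging s ON s.KeyCol = t.KeyCol

INSERT INTO dbo.Target (KeyCol, DataCol)  -- add rows whose key is new
SELECT s.KeyCol, s.DataCol
FROM dbo.Staging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.KeyCol = s.KeyCol)

OPENXML can populate the staging table with one INSERT ... SELECT, which removes the cursor entirely.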

View 3 Replies View Related

Massive .bak File

Feb 6, 2007

Rather than posting twice, I thought I would put both issues I'm having in one. Our server is Windows Server 2003 and we're running SQL Server 2005.

The first issue is this: We have several databases and I have scheduled their backups to run nightly, which works just fine. A couple of weeks ago, one database's .bak file grew from about 500MB to 2GB overnight. Then, just a few days ago, it went from 2GB to 3.5GB. There is nothing unusual going on in the live db that would warrant such an increase in the .bak file. All the dbs are in the same backup job schedule, but this is the only one affected. Additionally, I had autogrowth enabled on all the dbs but today disabled it for this particular db. Any ideas?

The second issue is my tempdb.mdf file on my C drive. It will go from just a few hundred KB to 4.5GB overnight, consuming most of what is left on my C drive. I'm afraid I'm in for a system crash if it continues. I have to stop SQL Server and restart it to clear the size. Is there a way to move the tempdb.mdf file to my F drive?

I don't know if these two issues are related or not but certainly would like to hear from someone.

Sorry, in advance, for the large post.

Dave
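On the tempdb question: moving it is an ALTER DATABASE plus a service restart. A sketch assuming the default logical file names (check yours with SELECT name, physical_name FROM sys.master_files WHERE database_id = 2):

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'F:\SQLData\tempdb.mdf')
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'F:\SQLData\templog.ldf')

-- restart the SQL Server service; tempdb is recreated at the new location

The target folder must exist and be writable by the service account; the old files can be deleted after the restart.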

View 20 Replies View Related

Massive Delete In DB

Sep 6, 2005

Hello, I have a huge database (2 GB / month), and after a while it becomes non-operational (time-outs, etc.). So I have written a DELETE statement that can reduce the db size by around 60% without compromising the application's data needs. The problem is that when I execute it, the db does reduce its size 60%, but the transaction log grows at the same rate. Can I execute the statement in some kind of "commit" or "transaction" mode that keeps SQL Server from writing to the log? Thanks for the help! Antonio

View 6 Replies View Related

Deleting A Table

Nov 9, 1998

I have a table that I think is bad -
how do I delete the table?
Also, I want to copy this same table from another database -
can I use Enterprise Manager to copy that table in place of the deleted table?
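Enterprise Manager's transfer tools can do it, but two plain statements also work. A hedged sketch with hypothetical names (note SELECT INTO copies structure and data, not indexes, constraints, or triggers):

DROP TABLE dbo.BadTable

SELECT *
INTO dbo.BadTable
FROM OtherDatabase.dbo.BadTable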

View 1 Replies View Related

Deleting Row From Table

Jun 17, 2008

Hi guys,

I have a table which consists of 6,185 rows, and from this table I want to delete 427 rows that I have placed in a separate table. So how do I delete these 427 rows from the original table (6,185 rows)?

I should also mention that the table does not have a primary key, so I was thinking about something like this, but it didn't work:


delete o.CardNumber
,o.ref_no
,o.tran_date
,o.tran_val

from test6185 o

where exists (select h.CardNumber
,h.ref_no
,h.tran_date
,h.tran_val
from test427 h
where( h.ref_no=o.ref_no and
h.CardNumber = o.CardNumber and
h.tran_date =o.tran_date and
h.tran_val =o.tran_val
))

order by cardnumber


Thanks much for any help
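DELETE takes no column list and no ORDER BY; the alias goes right after DELETE and the matching happens in the WHERE clause. A corrected version of the statement above:

DELETE o
FROM test6185 o
WHERE EXISTS (SELECT 1
              FROM test427 h
              WHERE h.ref_no = o.ref_no
                AND h.CardNumber = o.CardNumber
                AND h.tran_date = o.tran_date
                AND h.tran_val = o.tran_val)

With no primary key, the four-column match removes every copy of a matching row, which can delete more than 427 rows if exact duplicates exist in the big table.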

View 3 Replies View Related

Deleting Existing Data Before Loading New Data

Apr 10, 2007

I have a package which loads data from a flat file (csv) to 4 tables in a database.
The load is incremental.

I want to clear the data in all 4 tables (in the database) before loading the data from the flat file each time. How can I do this?
I am using 4 OLE DB destinations, 1 multicast, and 1 source component to do this.
Also, can it happen as a transaction? Because if it deletes the existing data and then cannot load the new data, there will be a problem. How do I avoid this?
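An Execute SQL Task ahead of the data flow can do the clearing; a hedged sketch with hypothetical table names:

BEGIN TRY
    BEGIN TRANSACTION
    DELETE FROM dbo.Dest1
    DELETE FROM dbo.Dest2
    DELETE FROM dbo.Dest3
    DELETE FROM dbo.Dest4
    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION
    RAISERROR('Clearing destination tables failed', 16, 1)
END CATCH

To make the delete and the load one atomic unit, set TransactionOption = Required on a Sequence Container holding both the Execute SQL Task and the data flow; a failed load then rolls the deletes back too.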

View 4 Replies View Related

SQL 7.0 Massive Row Locking Performance

Mar 3, 2000

When updating large sets a row at a time, performance is lacking in comparison to 6.5. PeopleSoft uses cursors with a BEGIN TRANSACTION, a loop inside, and a COMMIT after the loop completes. SQL 6.5 with page locking could handle a 300,000-row transaction in 3-4 hours; 7.0 took 17.5 hours. The difference is that 6.5 used 50,000 locks and 7.0 used 300,000 locks.

Does anybody have a solution short of rewriting PeopleSoft?

View 2 Replies View Related

Massive TRN File, But Small DB

Apr 5, 2006

Hi Everyone,

We have a large and active MSSQL 2000 database. Recently, after a rebuild of the server, we had a problem with the SQLSERVERAGENT service: it could not start because the service account had lost local permission to the registry. During this time, all of the data being sent to the database from our application accumulated in the database .ldf file. By the time we were able to get the service restarted, our .ldf file was approx. 28 gigs. When the service restarted, the .ldf file shrank back down to its regular size, about 40 megs, and the .trn log backups for that specific period grew to 28 gigs (new file every hour).

The problem is, the database file (database.mdf) stayed about the same size as before the service was restarted. When the .ldf contents were transferred to the .trn files, none of the 28 gigs of data got stored in the database. What does this mean? Perhaps with the service stopped, the application using the db saw problems and did not commit the data, making it all useless? Or is it possible that the data in the .trn logs just needs to be forced to commit to the .mdf?

Is there any way to verify the data in the 28-gig .trn files and figure out whether we should get it stored to the database? If yes, how would we go about verifying it, and after that how would we force it to commit to the .mdf file? Am I on the right track here, or is it not as I see it?

Thanks!
Mike

View 4 Replies View Related

General Question About Massive SP Use

Jul 2, 2006

Hi

Would you say that it's OK for a web site's code to make ALL of its db access through SPs and views? And I mean everything, including inserting new records and updating others, with no inline SQL in the code.

The advantage would be very strict control over access, but achieving this would take many, many SPs and views to cover all types of actions. Can you think of a disadvantage besides all the work of creating those SPs? What about server resources and performance? How demanding would it be?

Thanks,
Inon.

View 7 Replies View Related

Execute Massive SQL Statement

Jan 3, 2006

Hello,

I thought this was a neat solution I came up with, but I'm sure it's
been thought of before. Anyway, it's my first post here.

We have a process for importing data which generates a SELECT statement
based on user's stored configuration. Since the resulting SELECT statement
can be massive, it's created and stored in a text field in a temp table.

So how do I run this huge query after creating it? In my tests, I was
getting a datalength > 20000, requiring 3 varchar(8000) variables in
order to use the execute command. Thing is, I don't know how big it could
possibly get; I wanted to be able to execute it regardless.

Here's what I came up with, it's very simple:

Table is named #IMPORTQUERY, one field SQLTEXT of type TEXT.


>>
declare @x int, @s varchar(8000)

select @x = datalength(sqltext) / 8000 + 1, @s = 'execute('''')' from #importquery

while @x > 0
select @s = 'declare @s' + cast(@x as varchar) + ' varchar(8000) ' +
'select @s' + cast(@x as varchar) +
'=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery ' +
replace(@s,'execute(','execute(@s' + cast(@x as varchar) + '+')
, @x = @x - 1

set @s = 'declare @x int set @x=1 ' + @s

execute(@s)
<<

At the end, I execute the "@s" variable which is SQL that builds and
executes the massive query. Here's what @s looks like at the end:

>>
declare @x int set @x=1
declare @s1 varchar(8000)
select @s1=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s2 varchar(8000)
select @s2=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s3 varchar(8000)
select @s3=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
execute(@s1+@s2+@s3+'')
<<
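On SQL Server 2005 and later none of the chunking is needed, since varchar(max) holds the whole query; a minimal sketch against the same temp table:

declare @sql varchar(max)
select @sql = sqltext from #importquery
execute(@sql)

On 2000, the 8000-byte pieces (built dynamically as above) remain the standard workaround.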

View 4 Replies View Related

Massive Amounts Of Reading

Jul 23, 2005

Our database server has started acting weird, and at this point I'm either too sleep-deprived or too close to the problem to adequately diagnose the issue. To put it simply: when I look at the read disk queue length, the disk queues are astronomical. Normally we see a disk queue length of 0-1 on the disks that contain the DB data and indexes (the non-clustered indexes are on a disk of their own). Writes are just fine.

The problem is, all our databases are on the same drive, and I can't seem to nail down which DB, let alone which table, is the source of all our reads.

Now, to make things weirder: during the busier times of the day today (say 1:00 PM to 4:00 PM) things were fine. At 4:20 PM or so it was like someone hit a switch, and the read disk queue length jumped from 0-1 up to 100-200+, with spikes up to 1500 for a split second or so.

What's the best way to nail this down? Thanks.
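Per-file I/O counters can at least pin down which database the reads are hitting. A hedged sketch using fn_virtualfilestats (the -1 wildcards need SQL 2005; on SQL 2000, pass explicit dbid/fileid values from sysaltfiles):

SELECT DB_NAME(DbId) AS database_name,
       FileId,
       NumberReads,
       BytesRead,
       IoStallMS
FROM ::fn_virtualfilestats(-1, -1)
ORDER BY BytesRead DESC

Sample it twice a few minutes apart and diff the numbers; the database whose reads climb fastest during a spike is the one to trace for the offending table.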

View 9 Replies View Related
