Urgent! Please Help! Deleting Data From A Huge Table
Mar 16, 2004
I have a huge table with a primary key made up of 4 columns. I need to delete data from this table (approx. 5.6 million records). Deleting it with a normal query takes an extremely long time.
Can someone please suggest me a better way?
Any help will be appreciated.
declare @error int, @rowcount int
select @rowcount = COUNT(1) FROM STG_BCDR;
while @rowcount > 0
begin
    BEGIN TRAN Deletion
[code]....
In the code above I try to delete records batch by batch to avoid locking the BCDR table. The BCDR table holds 40,000 records in total. However, when I look at the execution plan, the BCDR table still shows a clustered index scan, which means the locking still happens.
If I change the DELETE TOP (5000)... to DELETE TOP (5)... then there is a clustered index seek, which is good. The problem is that each batch then deletes only the top 5 records, which means it will take a very long time to remove all the data.
How can I handle this situation so that I can delete this huge amount of data without table locking? If table locking happens, other users will not be able to access the table at the same time.
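One pattern that usually keeps the batches seekable and the locks short-lived is to walk the clustered key in small transactions. This is only a minimal sketch under assumptions: the clustered primary key is called BCDR_ID and the rows to remove are marked by a Status flag; neither name comes from the original post.

```sql
-- Minimal sketch: batched delete driven by the clustered key (assumed name BCDR_ID)
-- and an assumed Status flag identifying the rows to remove.
DECLARE @BatchSize int, @Rows int;
SET @BatchSize = 5000;
SET @Rows = 1;

WHILE @Rows > 0
BEGIN
    BEGIN TRAN;

    ;WITH victims AS
    (
        SELECT TOP (@BatchSize) BCDR_ID
        FROM dbo.BCDR
        WHERE Status = 'TO_DELETE'
        ORDER BY BCDR_ID              -- walking the clustered key lets the plan seek
    )
    DELETE FROM victims;              -- deleting through the CTE removes the underlying rows

    SET @Rows = @@ROWCOUNT;
    COMMIT TRAN;                      -- short transactions release locks quickly
END
```

Keeping each transaction small also makes lock escalation to a full table lock far less likely, which is what blocks the other users.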
I have to delete a ton of data from a SQL table. I have a unique identifier called the version. I would like to say "if not in these versions, then delete." I tried the statement below, but learned the hard way that it raised an error. This is the message I got:
Msg 9002, Level 17, State 4, Line 3...
The transaction log for database 'MonthEnds' is full due to 'ACTIVE_TRANSACTION'.
I was reading about TRUNCATE, but I am not sure how that would apply here or how I would set up the statement.
Delete Products where versions were not in (('48459CED-871F-4971-B888-5083990332BC','D550C8D3-58C7-4C74-841D-1C1675F19AE3','C77C7817-3F04-4145-98D3-37BB1610DB35', '21FE83FA-476D-4604-80EF-2ED57DEE2C16','F3B50B81-191A-4D71-A406-011127AEFBE1','EFBD48E7-E30F-4047-909E-F14DCAEA4181','BD9CCC41-D696-406B- 'C8BEBFBC-D362-4D0F-A555-B281FC2B3023','EFA64956-C2CF-41FC-8E21-F060597DAFCB','77A8DE56-6F7F-4490-8BED-AA6809B947EF','0F4C1E5F-B689-4DCB-
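As a hedged sketch (table and column names assumed from the post), a syntactically valid NOT IN delete done in small batches keeps each transaction, and therefore the log usage, small; the GUID list below is deliberately abbreviated.

```sql
-- Minimal sketch: delete in batches so the MonthEnds log space can be reused between batches.
DECLARE @Rows int;
SET @Rows = 1;

WHILE @Rows > 0
BEGIN
    DELETE TOP (10000) FROM dbo.Products
    WHERE [version] NOT IN
        ('48459CED-871F-4971-B888-5083990332BC',
         'D550C8D3-58C7-4C74-841D-1C1675F19AE3');  -- ...the rest of the GUID list goes here

    SET @Rows = @@ROWCOUNT;

    CHECKPOINT;   -- in SIMPLE recovery this lets log space be reused; in FULL recovery, back up the log instead
END
```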
Data is automatically being deleted from one particular table every night, starting last week. I created a delete trigger to find out what is doing it, but nothing was recorded. There are no jobs except maintenance plans, and there is nothing in the Event Viewer either. The database recovery model is simple. How can I solve this problem? Please advise.
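One thing worth noting: a DELETE trigger fires only for DELETE statements, so if the rows disappear via TRUNCATE TABLE (or the table being dropped and recreated) the trigger records nothing. A minimal sketch for the drop/recreate case is to read the default trace, which logs object-level events; availability of the default trace (SQL Server 2005+) and the placeholder table name are assumptions.

```sql
-- Minimal sketch: look for object events on the table around the time the data vanishes.
DECLARE @path nvarchar(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT te.name AS event_name,
       t.ObjectName, t.StartTime,
       t.LoginName, t.HostName, t.ApplicationName
FROM fn_trace_gettable(@path, DEFAULT) AS t
JOIN sys.trace_events AS te ON te.trace_event_id = t.EventClass
WHERE t.ObjectName = 'YourTableName'      -- placeholder for the table that is losing data
ORDER BY t.StartTime DESC;
```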
I have an entry form allowing customers to enter up to 15 skus (product IDs) at a time, so they can place a multiple order instead of entering one sku, submitting it, returning to the form to submit the second one, and so forth. From time to time a sku they enter will be wrong, or discontinued, so it will not submit an order. Therefore, when they are done submitting their 15 skus through the order form, I want a list showing them all of the skus that came back blank or were not found in the database.
I'm doing this by creating two tables: a shopping cart, which holds all the skus that were returned, and a holding table, which holds all the skus that were submitted. I then want to delete all the skus in the holding table that match the skus in the cart (because they are good skus), which will leave the unmatched skus in the holding table. I'll then scroll out the contents of the holding table to show them the skus that were not found in the database. (Confused yet?)
So what I want to do is have some SQL that will delete from the holding table where the sku equals the sku in the cart. I've tried writing this, but it doesn't work. I tried this: delete from holding_table where sku = cart.sku. I was hoping this would work, but it doesn't. Is there a way for me to do this?
Thanks!
Bill
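A minimal sketch of the usual way to phrase this in T-SQL, assuming the two tables are named holding_table and cart and both have a sku column (names taken from the post, but unverified):

```sql
-- Delete every holding row whose sku also appears in the cart.
DELETE h
FROM holding_table AS h
INNER JOIN cart AS c
        ON c.sku = h.sku;

-- Equivalent subquery form:
-- DELETE FROM holding_table WHERE sku IN (SELECT sku FROM cart);
```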
I can run a SELECT to retrieve data using the prefix 'a' for the table involved. However, when I try to run a DELETE using the same criteria, it fails, telling me:
Msg 102, Level 15, State 1,.......Line 1
Incorrect syntax near 'a'
The Select statement looks like:
select count(*) from schema.table a where a.customer_id=1234
The Delete looks like:
delete from schema.table a where a.customer_id=1234
What am I doing wrong here, and how can I prefix (alias) the table? The command I actually want to run is much more complicated than the example above and it needs the prefix.
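T-SQL does not accept an alias directly after the table name in DELETE FROM; the alias has to be introduced in a FROM clause and then named as the delete target. A minimal sketch using the names from the post:

```sql
DELETE a
FROM schema.[table] AS a     -- alias declared here
WHERE a.customer_id = 1234;  -- the alias can now prefix every column reference
```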
Hi, I have to delete the master (parent) table data without deleting the child table records. Is there any solution for this? The parent table has a relationship with the child table. Regards, vinod.t.v
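While the foreign key points at the parent, deleting referenced parent rows will simply fail. One hedged sketch (all names are illustrative; it requires SQL Server 2005+ and a nullable referencing column) is to redefine the constraint with ON DELETE SET NULL so the child rows survive with their reference cleared:

```sql
-- Illustrative names only; adjust to the real parent/child tables and constraint.
ALTER TABLE dbo.ChildTable DROP CONSTRAINT FK_Child_Parent;

ALTER TABLE dbo.ChildTable
    ADD CONSTRAINT FK_Child_Parent
    FOREIGN KEY (ParentId) REFERENCES dbo.ParentTable (ParentId)
    ON DELETE SET NULL;   -- child rows keep existing, their ParentId becomes NULL

DELETE FROM dbo.ParentTable
WHERE ParentId = 42;      -- example parent row to remove
```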
I need to delete data from a particular table which has more than half a million records. More than 200,000 records need to be deleted from it. What is the best way to delete the data, other than importing into a temporary table and performing the same operation?
Let me know if the strategy to be followed is okay.
1. Drop all the triggers.
2. Drop all the indexes.
3. Write a procedure with a loop, setting ROWCOUNT to 1000, and delete the records (if I try to delete all the rows at once it gives a timeout error). The procedure will delete 1000 records per batch inside the loop until it wipes out all the data for the specified condition (a sketch of this step follows below).
4. Recreate the indexes and triggers.
Please let me know if there are any other optimal solutions.
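A minimal sketch of step 3 under stated assumptions (illustrative table name and filter); SET ROWCOUNT caps each DELETE at 1000 rows, which keeps each transaction, and the log growth, small on SQL Server 2000:

```sql
SET ROWCOUNT 1000;          -- every DELETE below now affects at most 1000 rows

DECLARE @Deleted int;
SET @Deleted = 1;

WHILE @Deleted > 0
BEGIN
    DELETE FROM dbo.BigTable
    WHERE SomeDateColumn < '20030101';   -- the "specified condition" from the post (placeholder)

    SET @Deleted = @@ROWCOUNT;
END

SET ROWCOUNT 0;             -- always reset, or later statements stay capped at 1000 rows
```

One caution on step 2: an index that supports the WHERE clause makes each batch much cheaper, so dropping that particular index first may make the loop slower, not faster.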
We are using SQL Server 2000 Standard Edition, SP4, on Windows 2000 Advanced Server. We have one table that is three times as large as the rest of the database. Most of the data is static after approximately 3-6 months, but we are required to keep it for 8 years. I would like to archive this table (A), but there are complications:
1. The only way to access the data is through the application (they are images produced by the application, which is built on PowerBuilder).
2. There are multiple tables referencing this table and vice-versa.
3. We restore the entire db to two other servers for testing and training regularly.
4. There might be more complications that have not been thought of.
Currently, our only plan is to set up a separate server with a copy of this db and the application on it, leave only the tables necessary to access the data, and, if this 'archive' works, remove from production the data from table A and all references to table A from rows in the other tables. I mentioned #3 because someone mentioned a third-party tool that may be able to pull the data from the table, archive it elsewhere, and at the same time place a 'pointer' in the table to the new storage location. The tool they mentioned only works on Oracle and we have not explored beyond that yet.
I am ready to explore ideas and suggestions; I am still new to the DBA world and I am out of ideas. Thank you!
I'm working on a simple picture gallery on my web site. I add my pictures to the table using a BinaryWriter, and I delete them with:
DELETE FROM [Photos]
WHERE [PhotoID] = @PhotoID
This does delete the pictures from my table, but after working with it for a while I found that my database file size is increasing day by day. I'm very confused, so please tell me where the problem is.
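Deleting rows frees space inside the data file, but the file itself never shrinks on its own; if the disk space really needs to come back, an explicit shrink is required. A minimal sketch, assuming SQL Server 2005+ and placeholder database and logical file names:

```sql
USE MyPhotoDb;
GO
-- Find the logical file names and their current sizes (size is stored in 8 KB pages).
SELECT name, size / 128 AS size_mb
FROM sys.database_files;
GO
-- Shrink the data file to roughly 200 MB (target is in MB; the logical name is a placeholder).
DBCC SHRINKFILE (N'MyPhotoDb_Data', 200);
```

Shrinking fragments indexes, so it is best treated as a one-off cleanup rather than routine maintenance.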
I am trying to delete a row in Excel [Sheet1$], where the data in that row is used in a pivot table in the same workbook [Sheet2$], which should also get deleted. When I try to delete that row using "delete ... from [Sheet1$]" it throws the error "Deleting data in a linked table is not supported by this ISAM. (Microsoft Office Access Database Engine)". Can you please guide me in overcoming this error?
I have a table of 300+ GB. It holds 10 years of data. I need to delete 5 years of data and move it to another server so I can have more space.
If I delete 5 years of data, the transaction log gets huge and the database gets even bigger, because the .ldf file keeps growing. I think I can shrink the log file and the data file. Is this the best way to do it?
I have deleted nearly 30 million rows from a table. However, when I use the sp_spaceused command to look at the space occupied by the table, I don't see any difference in its data size. In fact, the reported data size has increased by a few MB after the deletion.
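Two things commonly explain this: the counters sp_spaceused reads can be stale after a very large delete, and the emptied pages often remain allocated to the table until its indexes are rebuilt. A minimal sketch with placeholder names:

```sql
-- Correct the space-usage counters, then re-check.
DBCC UPDATEUSAGE ('MyDb', 'dbo.MyTable') WITH COUNT_ROWS;
EXEC sp_spaceused 'dbo.MyTable';

-- Rebuild the indexes to release the now-empty pages back to the database (SQL Server 2005+ syntax).
ALTER INDEX ALL ON dbo.MyTable REBUILD;
```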
I need to delete the duplicate rows from a table. How do I do that in SQL Server 7.0? If possible, please include an example, as that would be very useful for me.
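A minimal sketch that works on SQL Server 7.0, under assumptions: the table is called dbo.MyTable, two rows count as duplicates when every column matches, there is no IDENTITY column, and no foreign keys reference the table (TRUNCATE would fail otherwise):

```sql
-- Keep one copy of each distinct row in a temp table.
SELECT DISTINCT * INTO #dedup FROM dbo.MyTable;

BEGIN TRAN;
    TRUNCATE TABLE dbo.MyTable;                    -- remove everything, duplicates included
    INSERT INTO dbo.MyTable SELECT * FROM #dedup;  -- put the distinct rows back
COMMIT TRAN;

DROP TABLE #dedup;
```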
Hi, I have a table into which I insert a lot of redundant data. Don't ask why; it's Integration Services, which only reads data and inserts it into a SQL table. So I end up with a SQL table containing many repeated rows. What I want to do is create a procedure that reads the distinct data and inserts it into another table. My problem is that I am not able to select data line by line from the original table, save it in local variables, and insert it into the other table; I can only select the last line. I've tried a WHILE loop with no success. Here is my code:

create proc insertLocalization
as
declare @idAp int, @macAp varchar(20), @floorAp varchar(2), @building varchar(30), @department varchar(30)
select @idAp = idAp from OLTPLocalization where idAp not in (select idAp from dimLocalization)
select @macAp = macAp, @floorAp = floorAp, @building = building, @department = department from OLTPLocalization
if (@idAp <> null)
begin
    insert into dimLocalization values (@idAp, @macAp, @floorAp, @building, @department)
end
GO

This only inserts the last line of the OLTPLocalization table. On the other hand, like this:

create proc aaaa
as
declare @idAp as int, @macAp as varchar(50), @floorAp as int, @building as varchar(50), @department as varchar(50)
while exists (select distinct(idAp) from OLTPLocalization)
begin
    select @idAp = idAp from OLTPLocalization where idAp not in (select idAp from dimLocalization)
    select @macAp = macAp from OLTPLocalization where idAp = @idAp
    select @building = building from OLTPLocalization where idAp = @idAp
    select @department = department from OLTPLocalization where idAP = @idAp
    if (@idAp <> null)
    begin
        insert into dimLocalization values (@idAp, @macAp, @floorAp, @building, @department)
    end
end
go

it retrieves every distinct idAp on each increment of the WHILE statement. The point of the WHILE is really to select each different line in the OLTPLocalization table. I did not find any FOREACH or FOR EACH statement; is there any way to select distinct rows line by line from a SQL table and save each column value in variables, in order to then insert them into another table? I've also thought about a web service that reads the distinct data from OLTPLocalization into a dataset and then inserts it into the dimLocalization table. Is there anything I can do? Any guess? Really needing a hand here! Thanks a lot!
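For what it's worth, this does not need a row-by-row loop at all; a single set-based statement can do it. A minimal sketch using the table and column names from the post (assuming idAp identifies a distinct row):

```sql
INSERT INTO dimLocalization (idAp, macAp, floorAp, building, department)
SELECT DISTINCT o.idAp, o.macAp, o.floorAp, o.building, o.department
FROM OLTPLocalization AS o
WHERE o.idAp NOT IN (SELECT idAp FROM dimLocalization);  -- skip rows already copied
```

Also note that a comparison like @idAp <> null is never true in T-SQL; when a NULL test is really needed, use IS NOT NULL instead.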
I need to periodically import a (HUGE) table of data from an external data source (not SQL Server) into SQL Server, with the following scenarios:
1. Some of the records in the external data source may not exist in SQL Server.
2. Some of the records in the external data source may have a different value at different imports, but these records are identified uniquely by the same primary key in the external data source and in SQL Server.
3. Some of the records in the external data source may be the same as in SQL Server.
Due to the massive volume of the import, I would like to import only the records which are different from what I have in SQL Server (cases 1 and 2 above). In fact case 2 is the most critical.
I thought of making a query with a left outer join between the data in the external data source table (SOURCE) and the data in the SQL Server table (DESTIN). The join is done on the respective primary keys (composite keys of up to 10 columns), and one of the WHERE conditions will be that the value in SOURCE is different from the value in DESTIN.
The result of this query would be exactly what I need to import. How can I do this in SSIS? I haven't figured out how to join tables that live in different data sources yet.
In fact, I cannot write a stored procedure to do it, since one of the sources is not a SQL Server data source. I have seen the Lookup transformation in this article http://www.sqlis.com/default.aspx?311 but that is not exactly what I want to do. Another possibility is the Merge Join, but because of the sorting it requires, I believe its performance would be terrible!
We have a daily process which copies millions of rows of data from one DB to another over a linked server. Just checking on best practice: are there more efficient ways than a linked server to copy millions of rows from one DB to another? I looked at BULK INSERT, but that transfers data from a file to the DB, not DB to DB.
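One common alternative is to go through a file after all: export with bcp in native format and bulk load it on the other side, which is usually much faster than row-by-row traffic over a linked server. A minimal sketch run from a command prompt; server, database, table, and path names are placeholders:

```
bcp SourceDb.dbo.BigTable out C:\transfer\BigTable.dat -n -S SourceServer -T
bcp TargetDb.dbo.BigTable in  C:\transfer\BigTable.dat -n -S TargetServer -T -b 50000
```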
I have a requirement to delete 1 million records from a table holding 10 million, and the table is queried on a 24/7 basis (we have no downtime window). How can I achieve that?
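With no downtime window, the usual approach is many small, short transactions with a pause between them, so readers are only ever blocked briefly. A minimal sketch with placeholder names and an assumed flag identifying the 1 million rows:

```sql
DECLARE @Rows int;
SET @Rows = 1;

WHILE @Rows > 0
BEGIN
    DELETE TOP (2000) FROM dbo.BusyTable
    WHERE ArchiveFlag = 1;        -- placeholder predicate for the rows to remove

    SET @Rows = @@ROWCOUNT;

    WAITFOR DELAY '00:00:01';     -- brief pause so concurrent queries get through
END
```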
Hello. I have a stored procedure. I am passing it two variables that are comma-separated lists (IDs) so that I can use an IN clause. Since I cannot use the CSV values in an IN clause directly, I first parse the list into a temp table with a single column of IDs. Then, in a second temp table, I select the data using enrollment_id IN (SELECT enrollment_id FROM #temptable), and the SP finally selects from that second temp table.
This works exactly the way it is supposed to in SQL. But when I call the SP from ASP.NET it doesn't return any data; I just can't seem to return data by selecting from a temp table. My ASP.NET code is correct, because when I commented everything out and just returned the parameters I was passing, that worked. So does anyone know how to select data from a temp table in an SP when it is executed from ASP.NET? I even tried table variables and the same thing happens: it works great in SQL, but called from ASP.NET it returns no data. I am guessing it's probably losing its scope. Any quick help would be appreciated. Thank you. Here is a snippet of the SP (even this fails):

CREATE PROCEDURE [dbo].[returnEnrollments]
    (@organization_id int, @test_item_list ntext, @enrollment_list ntext)
As
--select @test_item_list as tilist, @enrollment_list as elist
Declare @report_start_date datetime;
Declare @report_end_date datetime;
Declare @test_item_id_list varchar(8000);
Declare @enrollment_id_list varchar(8000);
DECLARE @TestItemID varchar(10), @Pos int
DECLARE @EnrollmentID varchar(10)
--Set @IDList = @test_item_id_list;
--Set @EnrollmentIDList = @enrollment_id_list;
Set @test_item_id_list = Cast(@test_item_list As varchar(8000))
Set @enrollment_id_list = Cast(@enrollment_list As varchar(8000))

Select @report_start_date = report_start_date, @report_end_date = report_end_date
From organization Where organization_id = @organization_id;

-- Create a temp table to store all the enrollment ids
IF OBJECT_ID('tempdb..#EnrollTemp') IS NOT NULL
    DROP TABLE #EnrollTemp
CREATE TABLE #EnrollTemp (enrollment_id int NOT NULL)

SET @enrollment_id_list = LTRIM(RTRIM(@enrollment_id_list)) + ','
SET @Pos = CHARINDEX(',', @enrollment_id_list, 1)
IF REPLACE(@enrollment_id_list, ',', '') <> ''
BEGIN
    WHILE @Pos > 0
    BEGIN
        SET @EnrollmentID = LTRIM(RTRIM(LEFT(@enrollment_id_list, @Pos - 1)))
        IF @EnrollmentID <> '' And @EnrollmentID > 0
        BEGIN
            INSERT INTO #EnrollTemp (enrollment_id) VALUES (CAST(@EnrollmentID AS int)) --Use Appropriate conversion
        END
        SET @enrollment_id_list = RIGHT(@enrollment_id_list, LEN(@enrollment_id_list) - @Pos)
        SET @Pos = CHARINDEX(',', @enrollment_id_list, 1)
    END
END

Select top 20 * From #EnrollTemp

Edit: Just used a DataReader instead of a DataTable and it works. Don't know why it would fail with a DataTable.
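A hedged suggestion, since the post does not show the ASP.NET side: the usual first thing to try is SET NOCOUNT ON at the top of the procedure. Every INSERT into the temp table emits a "rows affected" message, and some ADO.NET consumers (a DataAdapter filling a DataTable in particular) can trip over those extra messages, while a DataReader skips past them, which would match the behaviour described.

```sql
ALTER PROCEDURE [dbo].[returnEnrollments]
    (@organization_id int, @test_item_list ntext, @enrollment_list ntext)
AS
BEGIN
    SET NOCOUNT ON;   -- suppress the per-statement "rows affected" messages
    -- ... existing body of the procedure, unchanged ...
END
```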
I need to alter a table (expand a column from varchar(10) to varchar(255)), and the table has 200 million rows. Please suggest the best and fastest method to achieve this. The database is on SQL Server 7.0.
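For what it's worth, widening a varchar column is normally a metadata-only change, since the existing rows do not need to be rewritten, so a plain ALTER should be fast even on 200 million rows. A minimal sketch with placeholder names (keep NULL/NOT NULL the same as the current column definition):

```sql
ALTER TABLE dbo.BigTable
    ALTER COLUMN col_name varchar(255) NULL;   -- widening varchar(10) -> varchar(255)
```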
Hello, I want to ask about the backup time cost of a huge table (a table with many tera-records). Can anyone please help me estimate, even roughly, how long it would take?
Hi all, I'm trying to create a temporary table in which one of the columns has a datetime datatype. SQL Server 7.0 is giving me the error: Error 195: 'date' is not a recognized function name. My create table statement looks something like this: create table #temp_table(column_one int, column_two(15,2), column_three datetime)
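Two hedged observations about the statement as posted: column_two has a precision and scale but no data type, and error 195 typically appears when date() is called somewhere (for example as a column default), because SQL Server 7.0 has no date() function, only GETDATE(). A corrected sketch, assuming decimal was the intended type:

```sql
CREATE TABLE #temp_table
(
    column_one   int,
    column_two   decimal(15,2),              -- the posted column was missing its data type
    column_three datetime DEFAULT GETDATE()  -- use GETDATE(); there is no date() function in SQL Server 7.0
);
```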
I have 4 tables, each consisting of approx. 10,000,000 rows. They have the same columns (fTime [datetime] and bid [money]). What I want to do is collect all of the data into one of the tables, in ascending order by fTime.
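A minimal sketch under assumptions (the tables are named table1 through table4 and table1 is the one that should end up holding everything): a clustered index on fTime is what actually keeps the rows in fTime order; the INSERT just consolidates the data.

```sql
-- Cluster the target on fTime so the consolidated data is stored in time order.
CREATE CLUSTERED INDEX IX_table1_fTime ON dbo.table1 (fTime);

INSERT INTO dbo.table1 (fTime, bid)
SELECT fTime, bid FROM dbo.table2
UNION ALL
SELECT fTime, bid FROM dbo.table3
UNION ALL
SELECT fTime, bid FROM dbo.table4;
```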
I have a table with 52 million rows which resides on the PRIMARY filegroup in my database. Because of the huge number of rows, performance has degraded badly, and I would like to break the table into parts.
Can anyone suggest the steps for doing this, and how many parts should be created? The table is named Account_Transactions and contains policy information in an insurance database.
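One option, if the server is SQL Server 2005+ Enterprise, is table partitioning. A minimal sketch with assumed names and an assumed partition column (a transaction date); the real Account_Transactions schema, including its existing clustered index, will dictate the details:

```sql
-- Yearly ranges; add one boundary value per year that should become its own partition.
CREATE PARTITION FUNCTION pf_ByYear (datetime)
AS RANGE RIGHT FOR VALUES ('20050101', '20060101', '20070101');

-- Everything mapped to PRIMARY here; in practice each range often gets its own filegroup.
CREATE PARTITION SCHEME ps_ByYear
AS PARTITION pf_ByYear ALL TO ([PRIMARY]);

-- Creating the clustered index on the scheme moves the table onto the partitions.
CREATE CLUSTERED INDEX IX_AccountTransactions_TxnDate
ON dbo.Account_Transactions (TransactionDate)
ON ps_ByYear (TransactionDate);
```

If the table already has a clustered primary key, that index has to be rebuilt onto the scheme (or the key recreated) instead, so plan for a maintenance window.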
I have a table with about 80 columns and 400 million records. Each column has a set of responses that I need frequencies for; I need a count of each response across all the columns. I have a query that does it, but it would run forever. What is the best way to do this?
My starting query:
select res, sum(cnt)
from (
    select col1 res, count(*) as cnt from table1 with (nolock) group by col1
    union all
    select col2 res, count(*) as cnt from table1 with (nolock) group by col2
    ........................
    select col80 res, count(*) as cnt from table1 with (nolock) group by col80
) a
group by res
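One way to avoid 80 separate scans is UNPIVOT (SQL Server 2005+), which reads the table once and turns each column value into a row before counting. A minimal sketch; it assumes the 80 columns share a compatible data type, and like the original it keeps the NOLOCK hint from the post:

```sql
SELECT res, COUNT(*) AS cnt
FROM
(
    SELECT col1, col2, /* ...col3 through col79... */ col80
    FROM dbo.table1 WITH (NOLOCK)
) AS src
UNPIVOT
(
    res FOR colname IN (col1, col2, /* ...col3 through col79... */ col80)
) AS u
GROUP BY res;
```

Note that UNPIVOT silently drops NULL values, whereas the original query counts them as a group of their own.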
Hi, I've an application, let's call it simply "A", which creates two huge tables in a Microsoft SQL database. Let's call them "table1" and "table2". It saves a great deal of data into these tables. After application "A" has finished, another application is executed which deletes these two tables. Then application "A" is started again and creates the two tables again, but the amount of data becomes bigger. It can only proceed if the tables were deleted completely beforehand and the database is empty. This is the procedure I repeat very often, but every time the amount of data becomes bigger (table1 and table2 become bigger). A couple of times it works fine, but at some point the data seems to become too big and application "A" fails, most likely because the data wasn't removed correctly / completely. This is my code for deleting the two tables; maybe there is something I have to change:

try
{
    SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder(
        "Server=mycomputerdbname;Integrated Security=SSPI;" + "Initial Catalog=testing");
    builder["Server"] = "(local)dbname";
    builder["Connect Timeout"] = 10;
    builder["Trusted_Connection"] = true;
    builder["Initial Catalog"] = ((ComponentConfiguration)this.componentConfig).Persistency.DatabaseName;

    SqlConnection sqlconnection = new SqlConnection();
    sqlconnection.ConnectionString = builder.ConnectionString;
    sqlconnection.Open();

    SqlCommand cmd1 = new SqlCommand("DROP TABLE table1"); // TO DO: delete all tables
    SqlCommand cmd2 = new SqlCommand("DROP TABLE table2"); // TO DO: delete all tables
    cmd1.Connection = sqlconnection;
    cmd2.Connection = sqlconnection;

    cmd1.ExecuteNonQuery();
    Thread.Sleep(7000);
    cmd2.ExecuteNonQuery();
    Thread.Sleep(7000);

    sqlconnection.Close();
    Thread.Sleep(3000);
}
catch
{
    // NOTE: the empty catch swallows any exception, so a failed DROP goes completely unnoticed.
}

Thanks for help! mulata
My day started with loading a huge volume of data, and my data flow task failed to do it.
My data flow has a flat file connected to an OLE DB destination. It is a one-to-one mapping. My source file contains 5 million (50 lakh) records and is about 500 MB in size.
I'm processing the data with all the default buffer settings. I have 4 CPUs in my server.
The process DTSDebug.exe is using more than 2 GB of memory. My average CPU usage is around 70%, with one of the CPUs hitting 100% utilization.
I'm very new to SSIS, so please give me some information on how to set the buffers, and is there any documentation (PDF) on performance and tuning in SSIS?
Is there any bulk-load transformation in SSIS to load into DB2 UDB?
Actually, in my transformation I am transferring a huge amount of data.
I have been using an OLE DB Command at the end to write my incoming data to the respective tables.
For example, if you have two tables, table 1 and table 2:
In my incoming data I have a lookup that checks two unique columns against the unique columns in table 1. If the record does not exist, I insert a record into table 2, get the unique field of that new record, and store it in a particular column of table 1.
The data is very large; is this the better way, or do you have any suggestions? Please let me know.