Tracking Forums, Newsgroups, Mailing Lists













Fastest Mass Update


I am attempting to find the quickest way to perform an update of 4-6 million rows in a SQL table. Our customers do not have enough disk space to do this update in one batch. I have attempted the following while loop, committing every 100,000 updates, but the update still takes hours to perform. Does anyone have ideas on how to make this work faster? Any ideas would be greatly appreciated.

declare @oid varchar(13)
declare @count int
declare @optimizeSize int
set @count = 0
set @optimizeSize = 100000

begin transaction
select @oid = min(foo.object_id) from foo where x = 1
while @oid is not null
begin
    update foo
    set ... -- SET clause omitted in the original post
    where foo.object_id = @oid

    select @oid = min(foo.object_id) from foo where x = 1 and foo.object_id > @oid
    set @count = @count + 1

    if @count = @optimizeSize
    begin
        commit transaction
        begin transaction
        set @count = 0
    end
end
commit transaction
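A pattern that often runs much faster than touching one row per iteration is a set-based update done in batches. A minimal sketch, assuming the rows still to be changed can be identified by x = 1, and using a hypothetical SET clause (set x = 0) purely as a placeholder for the real column assignments:

set rowcount 100000

declare @rc int
set @rc = 100000

while @rc = 100000
begin
    begin transaction

    update foo
    set x = 0              -- placeholder for the real column assignments
    where x = 1

    select @rc = @@rowcount

    commit transaction
end

set rowcount 0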


View Complete Forum Thread with Replies

Related Messages:
Mass Update In SQL
I have a Hits table that tracks the hits on the id of a Link table:
Hits:
int linkId (foreign Key to Link)
datetime dateCreated
varchar(50) ip
We recently had to merge Links from different systems that are implemented similarly.  As a result, all the linkIds are now wrong in the Hits table because the ids all changed.  I managed to track down all the old ids and their associated new ids and have it in a table that I call joined_links
joined_links:
int oldId
int newId

So, how do I do a mass update of these linkIds in the Hits table in SQL?  I know I could do it in .NET, but I'd rather not write an app that runs thousands of update statements.  There's gotta be a way to do it, something like this:
 
UPDATE Hits h SET h.linkId = (SELECT newId FROM joined_links WHERE oldId=h.linkId)
but obviously I don't have visibility of that linkId in the subselect...  A Loop maybe?
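A correlated UPDATE like that is usually written in T-SQL as an UPDATE with a FROM/JOIN, which changes every hit in one set-based statement; rows whose linkId has no match are simply left alone. A sketch using the names from the post:

UPDATE h
SET h.linkId = j.newId
FROM Hits AS h
INNER JOIN joined_links AS j
    ON j.oldId = h.linkId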

View Replies !   View Related
Mass Update From ASP To SQL 7
I have an ASP form that users enter a store's product qtys into, then update. I have stored procedures in SQL 7 that handle append, insert, and update, but I need to give the users an option to mass update many stores at once when they all have the same product qtys. I've never done this. Can someone show me how, please?

TIA,
Bruce Wexler

View Replies !   View Related
Mass Update
This is a run of the mill application that moves orders from one table to another. There are 2 tables, Ordersummary & HstOrders.
Ordersummary has the following columns...
Identifier
FollowupId
OrderNumber
OrderReference
OrderReferenceOrigin
......
......
HstOrders has the following columns...
Identifier
OrderNumber
OrderReference
OrderType
......
......
The above two tables are bound by Identifier. After each month end, Orders are moved from Ordersummary to HstOrders.

Now my task is to update all rows in OrderSummary with the order details as seen in HSTOrders for ordertype = 'CREDIT'. OrderReferenceOrigin(in Ordersummary) should be updated with the value of Orderreference(from hstorders).

I have to update each row one at a time, and I need to write a cursor for mass updates where ordersummary.identifier = hstorders.identifier.

Can someone please help with writing this update statement, as I have never written a cursor.
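For what it's worth, a cursor isn't strictly required here; a joined UPDATE handles all matching rows in one statement. A sketch using the column names above (assuming OrderType lives in HstOrders):

UPDATE os
SET os.OrderReferenceOrigin = h.OrderReference
FROM Ordersummary AS os
INNER JOIN HstOrders AS h
    ON h.Identifier = os.Identifier
WHERE h.OrderType = 'CREDIT'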

View Replies !   View Related
Is It Possible To Mass Update Stats?
SQL Server 2000 on Win2k

I'm fairly new to SQL Server and I'm just wondering if it's possible to update statistics for all indexes somehow? I'm looking at the UPDATE STATISTICS command and it doesn't seem to be possible.

The situation we have is a reporting DB that basically has all its tables truncated and rebuilt every night by some DTS jobs that import from another data source, change the data, and build some denormalized tables, etc.
Some of the large insert operations go from taking 8 minutes to taking several hours sometimes, and updating the stats seems to fix the problem for a while. So I'd like to make sure the optimizer has all the latest stats for all tables.

Any other advice would be appreciated.

Cheers.
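Two built-in options exist on SQL Server 2000: sp_updatestats refreshes statistics for every table in the current database, and the undocumented but widely used sp_MSforeachtable can run UPDATE STATISTICS table by table. A minimal sketch:

-- Refresh statistics on every table in the current database
EXEC sp_updatestats

-- Or per table, with a full scan
EXEC sp_MSforeachtable 'UPDATE STATISTICS ? WITH FULLSCAN'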

View Replies !   View Related
Mass Update On Table With Trigger
Hi, I need to update a field in about 20 records on a table. The table has an update trigger (which updates the [lastedited] field whenever a record is updated). As a result I'm getting an error: "Subquery returned more than 1 value.", and the update fails. Is there a way in the stored procedure to handle this issue? Thanks for your help. Paul
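That error usually means the trigger was written for single-row updates (e.g. SET @id = (SELECT ... FROM inserted)), which breaks as soon as one statement updates several rows. A trigger that joins to the inserted pseudo-table handles any number of rows; a sketch with hypothetical table and key names:

CREATE TRIGGER trg_MyTable_LastEdited ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON

    -- Stamp every row touched by the triggering statement, however many there are
    UPDATE t
    SET t.lastedited = GETDATE()
    FROM dbo.MyTable AS t
    INNER JOIN inserted AS i
        ON i.ID = t.ID   -- hypothetical primary key column
END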

View Replies !   View Related
Why Triggers Don't Fire On Mass Update
If I have a trigger on a field in a table and I update one record, the trigger fires properly. If I do an update to that same field on all records in the table, the trigger does not fire. Is the error in the trigger, or do I need to change my update statement?

View Replies !   View Related
Fastest/best Way To Handle Update
I have a master table which has demographic data such as name, dob, and location, along with a primary key id. It will have about 10-12,000 records. We get a refresh file every hour which may or may not have corrections for these records, with about 3,000 records. I put this data into a table. This data should always be considered correct. To handle the update to the master table I need to create an update process. I can take one of two approaches: just update all the records in the master table regardless of whether they are correct or not, or do some type of left join on those that do not match (in other words, only update the ones where the names or dob don't match). There is an underlying update trigger on the patient master which will also fire if these values are changed. Any opinions on the best approach?
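One middle ground is a joined UPDATE that only touches rows whose values actually differ, which also keeps the audit trigger from firing on unchanged rows. A sketch with hypothetical table and column names:

UPDATE m
SET m.name = r.name,
    m.dob  = r.dob
FROM dbo.PatientMaster AS m
INNER JOIN dbo.HourlyRefresh AS r
    ON r.patient_id = m.patient_id
WHERE m.name <> r.name
   OR m.dob <> r.dob          -- wrap with ISNULL(...) if these columns allow NULLs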

View Replies !   View Related
Fastest Way To Update 20 + Million Records
Hello,
What is the fastest way to update 20 million records in our database?
I have tried to do a simple update statement like this:
update trail_log with (tablockx, holdlock)
set trail_log .entry_by = users.user_identity
from users
where trail_log.entry_by = users.user_id

but it takes 10 plus hours to run since it cannot commit the transactions until the very end. So I was thinking that I need to commit in batches, like after 50K rows, but that is slow as well.
Set rowcount 50000
Declare @rc int
Set @rc=50000
While @rc=50000
Begin
Begin Transaction
update trail_log With (tablockx, holdlock)
set trail_log.entry_by = users.user_identity
from users
where trail_log.entry_by = users.user_id
and trail_log.entry_by not like '%[0-9]%'
Select @rc=@@rowcount
--Commit the transaction
Commit
End
go
I have let the above statement run for 1.5 hours and it only updated 450,000 rows. Any ideas?
Maybe I'm doing it wrong. Please help!!
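One thing that makes later batches crawl is that each pass has to rescan the table for rows that still qualify. Walking the table by key range avoids that, since every pass touches a bounded slice; a sketch assuming a hypothetical numeric key column trail_id:

declare @min int, @max int, @batch int
select @min = min(trail_id), @max = max(trail_id) from trail_log
set @batch = 50000

while @min <= @max
begin
    begin transaction

    update t
    set t.entry_by = u.user_identity
    from trail_log as t
    inner join users as u
        on t.entry_by = u.user_id
    where t.trail_id between @min and @min + @batch - 1

    commit transaction

    set @min = @min + @batch
end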

View Replies !   View Related
Mass Insert / Update External Data Into Internal SQL Database
Hola! I'm currently building a site that uses an external database to store all the product details, and an internal database that will act as a cache so that we don't have to keep hitting the external database to retrieve the products every time a customer requests a list.

What I need to do is retrieve all these products from External and insert them into Internal if they don't exist - if they do already exist then I have to update Internal with new prices, number in stock, etc.

I was wondering if there was a way to insert / update these products en masse without looping through and building a new insert / update query for every product - there could be thousands at a time! Does anyone have any ideas or could you point me in the right direction? I'm thinking that because I need to check if the products exist in a different data store than the original source, I don't have a choice but to loop through them all.

Cheers,
G.
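If the external rows can be pulled into a staging table on the internal server first (DTS or BCP can do that part), the whole batch can then be applied with two set-based statements instead of a per-product loop; a sketch with hypothetical table and column names:

-- Update products that already exist in the internal cache
UPDATE p
SET p.Price = s.Price,
    p.NumInStock = s.NumInStock
FROM dbo.Products AS p
INNER JOIN dbo.Staging_Products AS s
    ON s.ProductID = p.ProductID

-- Insert products that don't exist yet
INSERT INTO dbo.Products (ProductID, Price, NumInStock)
SELECT s.ProductID, s.Price, s.NumInStock
FROM dbo.Staging_Products AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Products AS p WHERE p.ProductID = s.ProductID)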

View Replies !   View Related
The Fastest Way To Perform An Update ... Advice Needed :)
Hi all,

I have a situation where my Visual C# application presents a number of fields. In order to update a student object, I wish to call a stored proc. One or more fields can be updated, and if one is left null then I don't want to update it; instead I want to keep the old value.

I am really wondering if I am approaching this the right way. The following stored proc does what I want it to do, however I'm thinking there may be a faster way...

Here it is:

-- Update a student, by ID.

DROP PROCEDURE p_UpdateStudent
GO

CREATE PROCEDURE p_UpdateStudent
    @ID INT,
    @NewFName VARCHAR(25),
    @NewOName VARCHAR(25),
    @NewLName VARCHAR(25),
    @NewDOB DATETIME,
    @NewENumber VARCHAR(10),
    @NewContactAID INT,
    @NewContactBID INT
AS
BEGIN
    SET NOCOUNT ON;

    -- Declare the old values
    DECLARE @FName AS VARCHAR(25)
    DECLARE @OName AS VARCHAR(25)
    DECLARE @LName AS VARCHAR(25)
    DECLARE @DOB AS DATETIME
    DECLARE @ENumber AS VARCHAR(10)
    DECLARE @ContactAID AS INT
    DECLARE @ContactBID AS INT

    -- Get all of the old values for this student
    SELECT @FName      = FName,
           @OName      = OName,
           @LName      = LName,
           @DOB        = DOB,
           @ENumber    = ENumber,
           @ContactAID = ContactAID,
           @ContactBID = ContactBID
    FROM TBL_Student
    WHERE ID = @ID

    -- Use ISNULL to set each new parameter to the provided value only if it is not null;
    -- keep the old one otherwise.
    SET @NewFName      = ISNULL(@NewFName, @FName)
    SET @NewOName      = ISNULL(@NewOName, @OName)
    SET @NewLName      = ISNULL(@NewLName, @LName)
    SET @NewDOB        = ISNULL(@NewDOB, @DOB)
    SET @NewENumber    = ISNULL(@NewENumber, @ENumber)
    SET @NewContactAID = ISNULL(@NewContactAID, @ContactAID)
    SET @NewContactBID = ISNULL(@NewContactBID, @ContactBID)

    -- Do the update
    UPDATE TBL_Student
    SET FName      = @NewFName,
        OName      = @NewOName,
        LName      = @NewLName,
        DOB        = @NewDOB,
        ENumber    = @NewENumber,
        ContactAID = @NewContactAID,
        ContactBID = @NewContactBID
    WHERE ID = @ID
END

GO

So yeah it works. But As you can see I wish to keep an old copy of the values to perform checks pre update....

Is there any faster way, or am I on the right track? I need a pro's advice :) (before i write all of my procs!!)

Thanks all.



Chris
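The separate lookups and ISNULL juggling can be folded into the UPDATE itself: ISNULL (or COALESCE) in the SET clause keeps the existing column value whenever the parameter comes in NULL, so the proc body shrinks to a single statement. A sketch of the same logic:

UPDATE TBL_Student
SET FName      = ISNULL(@NewFName, FName),
    OName      = ISNULL(@NewOName, OName),
    LName      = ISNULL(@NewLName, LName),
    DOB        = ISNULL(@NewDOB, DOB),
    ENumber    = ISNULL(@NewENumber, ENumber),
    ContactAID = ISNULL(@NewContactAID, ContactAID),
    ContactBID = ISNULL(@NewContactBID, ContactBID)
WHERE ID = @ID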







View Replies !   View Related
Mass DTS Modification
I have a series of DTS packages.
Each package has 20 queries.
Each query has a server name.
Is there a way to change the servername without editing each query in each DTS package.
I'd like to copy the template DTS package, then perform the modification.

Thanks

View Replies !   View Related
Mass SQL Inserts - Performance :-(
Hi,

I have a user table with a single integer column. No indexes, no identities, nothing. I have to insert 600,000 rows via a client app. In tests using BCP/Bulk Insert/DTS, it all runs OK (sub 3 seconds). However, the app manages 5,000 rows a second [considerably slower]. I can mimic this slow performance within DTS by removing the 'Fast Load' & 'Batch' options.

Question: why would the SQL insert run slower on one server and as fast as BCP/Bulk Insert/DTS on another? What can I check on the 'slow' running server? Could there be a file version anomaly?

Version = SQL-2000 SP4

Any help much appreciated !!!

View Replies !   View Related
Mass Updates In SQL Server
Does anyone know what the best way to do mass updates in SQL server is? I am currently using the methodology suggested in this article

http://www.tek-tips.com/faqs.cfm?fid=3141

But the article assumes that once I update a field it is going to have a value that is NOT NULL, so I can loop through and update the rows that have a NOT NULL value. But my updated rows do contain NULL values; in this case, what is the best way to go about this?

***************************************
Here is my code. I want to avoid using Upd_flag because
after the following code runs I need to reset that flag
before I run my next query
***************************************

--Set rowcount to 50000 to limit number of inserts per batch
Set rowcount 50000

--Declare variable for row count
Declare @rc int
Set @rc=50000

While @rc=50000
Begin

Begin Transaction

--Use tablockx and holdlock to obtain and hold
--an immediate exclusive table lock. This usually
--speeds the insert because only one lock is needed.


update t_PGBA_DTL With (tablockx, holdlock)
SET t_PGBA_DTL.procedur = A.[Proc code],
t_PGBA_DTL.Upd_flag = 1
FROM t_PGBA_DTL
INNER JOIN CPT_HCPCS_I9_PROC_CODES A
ON t_PGBA_DTL.PROC_CD
= A.[Proc code]
WHERE t_PGBA_DTL.Upd_flag = 0


--Get number of rows updated
--Process will continue until less than 50000
Select @rc=@@rowcount

--Commit the transaction
Commit
End
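If the batch filter can be expressed against the target column itself (rows where procedur is still NULL or still differs from the source value), the Upd_flag column isn't needed at all; a sketch along the lines of the code above:

SET ROWCOUNT 50000

DECLARE @rc int
SET @rc = 50000

WHILE @rc = 50000
BEGIN
    BEGIN TRANSACTION

    UPDATE t
    SET t.procedur = A.[Proc code]
    FROM t_PGBA_DTL AS t WITH (TABLOCKX, HOLDLOCK)
    INNER JOIN CPT_HCPCS_I9_PROC_CODES AS A
        ON t.PROC_CD = A.[Proc code]
    WHERE t.procedur IS NULL
       OR t.procedur <> A.[Proc code]

    SELECT @rc = @@ROWCOUNT
    COMMIT
END

SET ROWCOUNT 0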

View Replies !   View Related
Mass Mailing Through SQL Server
Hi friends,
Any idea about mass mailing system using SQL Server .Pls get back to me.
thanx and regards
Chinmay

View Replies !   View Related
MASS INSERT FROM VB.NET PROGRAM TO DB
I have the following problem. I need to insert 100,000 records (50 KB each) in one operation from a VB program into a SQL Server 2005 database. All of these records will be ready for inserting at the same time.
How do I make this insert one big transaction instead of 100,000 small ones?

View Replies !   View Related
How To Best Deploy Mass Packages ???
Hi There

Most of the time my solutions consist of 1 or 2 packages and config files work well.

Now i have a solution with about 50 packages i have to move to QA.

However, a config file has the package ID, so even though they use the same data source connection, I would have to create and change 50 config files.

Data sources are kept in the package XML, so if I copy all the packages to QA and then change the Global Data Source connection, I still have to open each package manually and save it again.

Surely there must be an easy way to move all 50 packages and have the connection now point to QA. But config files and global data sources don't do the trick; what am I missing here?

Thanx

View Replies !   View Related
Mass Subscription Email Changes


The company I work for changed names and all email addresses within the company have changed. While it was OK for a while, they are no longer going to be forwarding email from the old addresses to the new ones. There are tons of subscriptions and tons of email addresses that need to be changed to the new names.

If I could find the table with the TO: part of the subscription held in it, I could just run an update on that field and it would be solved... however, I cannot find that field...


So,
Without going into every subscription in report manager, how can i change the email addresses? Any Suggestions?

Thanks in advance
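If this is Reporting Services, the addresses for e-mail subscriptions live in the ExtensionSettings XML of the Subscriptions table in the ReportServer database. Editing that table directly is not a supported API, so treat the following as a rough sketch only (back up ReportServer first); the domain names are placeholders:

UPDATE dbo.Subscriptions
SET ExtensionSettings = REPLACE(CAST(ExtensionSettings AS nvarchar(max)),
                                '@oldcompany.com', '@newcompany.com')
WHERE CAST(ExtensionSettings AS nvarchar(max)) LIKE '%@oldcompany.com%'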

View Replies !   View Related
Connection Timing Out When Trying To Do A Mass DTS Process
Here is the error message that I'm receiving: "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." I was wondering if anyone knew how to increase the timeout period. The DTS package which I'm firing off is fairly large and is exceeding the limit of the timeout period; I just need to be able to increase this limit. Do I do this on my SQL Server or do I do this in my connection property? Just as an FYI, I have my connection string set up in my Web.config file. Thanks in advance. RB

View Replies !   View Related
Mass Import To Sql Server Through Asp.net Vb Style
I am making a program that needs to import many records from a spreadsheet on a local computer, through ASP.NET, into SQL Server.
Is there a simple command to do this, or is there information on how to do this?
Please give all the information that you can. Thank you.
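One server-side route is to read the spreadsheet straight into a staging table with OPENROWSET and the Jet provider (SQL Server must be able to reach the file, and ad hoc distributed queries must be enabled); the path and sheet name below are placeholders:

SELECT *
INTO dbo.ImportStaging
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\import\data.xls',
                'SELECT * FROM [Sheet1$]')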

View Replies !   View Related
Software For Performing Mass Design Changes
Hi,

I was wondering if anyone knows of any way, including third party tools, to replicate a design change on a table across many different databases. I have written an ASP script that allows me to copy multiple tables to multiple databases in one go but I need something that will allow me to replicate the design of a table by comparing source and destination tables. So far I have scripted most of it but I have no idea on which system table to get the identity information and it seems there must be an easier way!

Any help would be appreciated,

Seoras

View Replies !   View Related
Transaction Log Backups (Mass Storage)
I want to back up my hourly transaction log backups directly to a mass storage unit as opposed to the local server. However, when trying to set this up it only gives me the option of backing up to local drives, even though I have a drive mapping to the mass storage unit. I'm sure there is a simple way around this. Would appreciate any advice. Many thanks.
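The backup dialog only lists local drives because the SQL Server service doesn't see per-user drive mappings; pointing the backup at a UNC path usually works, as long as the service account has write permission on the share. A sketch with placeholder names:

BACKUP LOG MyDatabase
TO DISK = '\\massstorageunit\sqlbackups\MyDatabase_log.trn'
WITH INIT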

View Replies !   View Related
Tempdb Filling Up - Mass Deletes
Hi,

I've been having problems with my tempdb filling up and causing all databases on the server to stop functioning properly. I've been removing a lot of data lately (millions of rows), and I think this is the reason why my tempdb log is going through an unusual load.

What's the best way to make sure the tempdb doesn't fill up and cause me major problems? I had temporarily turned off backups while I was having a new HD put in. Am I right in thinking that when a DB is backed up, the tempdb log is reduced in size? Should maintaining a daily backup solution help keep things under control?

Thanks very much for any tips!

mike123
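For the deletes themselves, doing them in smaller batches keeps any single transaction (and the log and tempdb space it can demand) bounded; a common SQL 2000/2005-style pattern, with a hypothetical purge condition:

SET ROWCOUNT 10000

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable
    WHERE CreatedDate < '20060101'   -- hypothetical purge condition

    IF @@ROWCOUNT = 0 BREAK
END

SET ROWCOUNT 0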

View Replies !   View Related
What Is The Fastest Way To Do This
right now I have a stored procedure that goes through each of the Line and Body fields using a cursor. The problem is that this method is very slow. How would you experts solve this problem? any Hints or suggestions?


BEFORE
EXAMPLE  Part  Line  Body  Series  Engine  Year
11234A,BWETC1998
25678991,93,94,95WET01997
3345656S,R5,6,12WENC1995


AFTER
EXAMPLE  Part  Line  Body  Series  Engine  Year

11234AWETC1998
11234BWETC1998

25678991WET01997
25678993WET01997
25678994WET01997
25678995WET01997

3345656S5WENC1995
3345656S6WENC1995
3345656S12WENC1995
3345656R5WENC1995
3345656R6WENC1995
3345656R12WENC1995
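A set-based way to do that split is to join the table to a numbers (tally) table and carve each comma-separated value out with CHARINDEX/SUBSTRING; applying the same join once for the Line column and once for the Body column gives the cross product shown above. A sketch with hypothetical table/column names, assuming a Numbers table of integers 1..N already exists:

SELECT p.Part,
       SUBSTRING(p.Line, n.Number,
                 CHARINDEX(',', p.Line + ',', n.Number) - n.Number) AS LineValue,
       p.Series, p.Engine, p.[Year]
FROM dbo.Parts AS p
INNER JOIN dbo.Numbers AS n
    ON n.Number <= LEN(p.Line)
   AND SUBSTRING(',' + p.Line, n.Number, 1) = ','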

View Replies !   View Related
Mass Alter Table Fields - Script Help
Hello, I need to alter fields in all my tables of a given database, and I would like to do this via a T-SQL script.

For example, I want to change all fields called SESSION_ID to char(6). The field is usually varchar(10), but the data is always 6 characters in length. I have several fixed-length fields that I want to move from varchar to char.

I believe I can find all the tables where the field exists using

select * from INFORMATION_SCHEMA.columns where column_name = 'SESSION_ID'

but I don't know how to take this into an ALTER TABLE ... ALTER COLUMN that can be automated.

TIA
Rob
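One way to automate it is to let INFORMATION_SCHEMA.COLUMNS generate the ALTER statements, then review and execute the output; a sketch for the SESSION_ID case:

SELECT 'ALTER TABLE [' + TABLE_SCHEMA + '].[' + TABLE_NAME + '] ALTER COLUMN SESSION_ID char(6) '
       + CASE IS_NULLABLE WHEN 'NO' THEN 'NOT NULL' ELSE 'NULL' END
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'SESSION_ID'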

View Replies !   View Related
Sql Slowed To A Crawl After Deleting Mass Rows
Hi,

I've deleted about 3-4 million rows from one of my tables as the data was old and no longer needed. The problem is that now queries are running extra slow. I am in the process of running Tara's isp_ALTER_INDEX; however, it's taking quite a long time and, as expected, seems to be slowing things down even further while it's running. (It's been running 4 hours already; I have stopped it and will rerun it during a slower-traffic period for the db server.)

Just wondering if I have the right approach here or if anyone else has any suggestions.

Thanks for your help!

View Replies !   View Related
Mass Table Structure Change For Column Order


Say you have an existing populated SQL 2005 database, with 700+ tables, and you want to just change the order of the columns inside every table. Short of manually building conversion scripts, anyone know an automated way to do this? I was thinking thru ways to do them all in one shot, and have tools like Erwin and DbGhost that could be used also. Basically moving some standard audit columns from the end of the tables to just after the PK columns.

Thanks, Bruce

View Replies !   View Related
Fastest Way Of Updating A Row
In relation to my last post, I have a question for the SQL gurus. I need to update 70k records, and mark all those updated in a special column for further processing by another system.

So, if the record was
Key1, foo, foo, ""
it needs to become
Key1, fap, fap, "U"
if and only if the data values are actually different (as above, foo becomes fap); otherwise it must remain
Key1, foo, foo, ""

Is it quicker to:
1) get the row from the destination table, inspect all values programmatically, and determine IF an update query is needed, OR
2) just do an update on all rows, but adding "and (field1 <> value1 or field2 <> value2)" to the update query, that is

update myTable
set field1 = 'foo',
    markField = 'U'
where key = 'mykey' and (field1 <> 'foo')

The first one will not generate new update queries if the record has not changed, on account of doing a select, whereas the second version always runs an update, but some of them will not affect any rows. Will I need a full index on the second version?

Thanks in advance,
Asger Henriksen

View Replies !   View Related
Is There Fastest Way To Encryption All SPs In All DBs?
Hi,
In my SQL Server 7.0 installation, I have about 250 stored procedures in each database.
Before using them for my application, I want to encrypt them all.
I must add "WITH ENCRYPTION" to each SP in every database, and it will take me a long time. Is there a faster way to encrypt all SPs in all DBs? Has anyone got a utility SP (or any other way) to do this?
Thanks in advance.

View Replies !   View Related
Fastest INSERT
What is the fastest way for a stored procedure to copy a table from a linked server?

I would like to tune this statement, possibly with hints or other logging options. Assume that table_A and table_B have the exact same table structure and that I want to preserve table_A and all its indexes and constraints. The table will be truncated before this load, if that helps in any way.

insert into table_A select * from OpenQuery(Server,'select * from Table_B')

TIA, Mike

View Replies !   View Related
Fastest Way To Handle The Search
Hi! We have SQL Server 2000 on our server (NT 4). Our database now has about 350,000+ rows with information about images. The table has a lot of columns including information about image name, keywords, location, price, color mode, etc. So our database doesn't include the images themselves, just a path to the location of every image. The Keywords field has data like this: cat,animal,pet,home,child with pet,child.

Our search currently uses Full-Text Search, which sounded like a good idea in the beginning, but it has had problems that really reduce our search engine's performance. The search results are also not exact enough. Some of our images have the photographer's name in the keywords column, so if a photographer's name is, for example, Peter Moss, his pictures appear when a customer searches for "moss" (nature-like) pictures.

Another problem is that Full-Text Search started to be very slow when the query result contains thousands of rows. When a search term gives at most 3,000 rows the search is fast, but larger searches take from 6 to 20 seconds to finish, which is not good. I have also noticed that the first search is always very slow, but the next ones are faster. It seems that the engine is just "starting" when the first query is run.

Is there a better and faster way to handle the queries? Is it better to rebuild the database somehow and use another method to search than Full-Text Search? I don't know how else to handle the database when every image has about 10 to even 50 different keywords to search.

We have made the web interface and search code with ColdFusion. ColdFusion Server then takes care of sending all queries to SQL Server. I hope that somebody has some idea how to speed up our picture search.

--
Message posted via http://www.sqlmonster.com

View Replies !   View Related
Fastest Way To Deduplicate A List
I'm trying to dedupe a table with only one field in it. The table has 40 million records. What is the fastest way?
1) create a table with a unique constraint on it and insert into that table?
2) create a table without a unique constraint on it and use insert into table select distinct un from table2?
3) another way?
Michael
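A minimal sketch of option 2 using the names from the post; SELECT ... INTO a new table (then swapping names) is often the cheapest route because the copy can be minimally logged under the simple or bulk-logged recovery model:

SELECT DISTINCT un
INTO dbo.table2_dedup
FROM dbo.table2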

View Replies !   View Related
Best & Fastest SQL Server 2000 Hardware
Can you let me know the best & fastest SQL Server 2000 hardware for web applications?

I would like to know the hardware products from different companies.

Thanks in Advance.

View Replies !   View Related
Fastest Backup Config ...
Hi,

I have a production server that has an 8Gb db. It is dual Xeon with 5x HDD - 2 mirrored and 3 striped. db on stripe, log and OS on mirror. 2x Gb network cards.

The application goes slow (i.e., users notice) when a backup is running, so I have placed a crossover cable from one NIC to a test server so that it can back up to a HDD on that server, and then to tape. The test server has 2x Gb NICs and the link between the two servers is on a separate subnet. However, in the first trial of this, the backup and verify took 3 minutes longer.

Is this because the target server doesn't have a disk stripe?

What is the best config for the production server (i.e., will a slower backup to another server mean less load contending with the application)?

thanks
Fatherjack

View Replies !   View Related
Fastest Data Transfer
Hi,
1) I need to transfer 500 GB of data from one server to the other; which is faster: DTS, BCP, or restore?
2) What are the best methods for checking blocking, deadlocks & indexes?

Thank you all in advance

Richard..

View Replies !   View Related
Fastest Way To Copy Data?
I've got a view that is driven from an 80 million record table in a data warehouse. I am trying to populate an aggregate table in a datamart, but am running into performance problems. The datamart table needs to be updated daily. I understand there are many factors that affect performance, but in general would the fastest approach be:
1) Truncate the datamart table
2) Perform a bcp of the view to a text file
3) Bulk Insert to the datamart table

If you need more information to answer this please let me know.

Thanks,

Matt

View Replies !   View Related
Fastest Bulk Load
Hi All,

I'm bulk loading a ton of data into MS SQL Server 2005 Standard Edition. I used to do this process in version 2000. It seems there is some more overhead in 2005. Is there a way to drop logging to almost nothing to speed up the insert?

This is my current SQL statement to load the data:

EXEC sp_dboption 'my_stuff', 'select into/bulkcopy', 'true'

SET ANSI_WARNINGS OFF

BULK INSERT mystuff.dbo.[v1]
FROM 'c:\myfile.txt'
WITH
(
    FIRSTROW = 1,
    FORMATFILE = 'c:\scripts\v1.fmt',
    MAXERRORS = 2000,
    ROWS_PER_BATCH = 100000
)



Thanks,

Mike
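For BULK INSERT on 2005, minimal logging generally requires the database to be in the simple or bulk-logged recovery model and a TABLOCK hint on the load (ideally into an empty or unindexed table); a sketch adding it to the options above:

BULK INSERT mystuff.dbo.[v1]
FROM 'c:\myfile.txt'
WITH
(
    FIRSTROW = 1,
    FORMATFILE = 'c:\scripts\v1.fmt',
    MAXERRORS = 2000,
    ROWS_PER_BATCH = 100000,
    TABLOCK
)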

View Replies !   View Related
Fastest Connection Method?
I am trying to find some info on the fastest connection transport for an app that is running on the same box as a SQL 2005 instance. The app does a large number of updates (high volume of data), Win32 using native ODBC. I am trying to find info on which connection mechanism is best: sockets, pipes, etc. I read somewhere that there is a memory-mapped-file method available, but I cannot find info on that either. Also, is there any performance difference between the old SQL ODBC driver and the new SQL Native Client ODBC driver?

Thanks.

View Replies !   View Related
Fastest Way To Add New Instances To SQL 2005?
Hi,

I'd like to know the fastest way to add named instances to SQL server 2005 (I need to add 5 named instances)

Thank you!

John

View Replies !   View Related
Fastest Connection Protocol?


Can anyone point to a reference which documents the pros and cons of the various connection protocols, such as Shared Memory vs. TCP/IP? I thought I saw something indicating that shared memory is fastest, which would explain why this protocol is tried first, but now I can't find it. This resource has information on creating connection strings, but not the advantages and disadvantages.

http://msdn2.microsoft.com/en-us/library/ms187662.aspx



View Replies !   View Related
Fastest Way To Extract Values From An XML Column
Hi, I have a table with one XML type column. This column holds custom field information. It's used as a way of storing ad hoc fields and data that don't fit the DB design.

<?xml version="1.0"?><contact><Reference>A39390TFH</Reference><Misc>all kinds of stuff go in here</Misc></contact>

I want to provide a way of displaying the data stored in this column in the same DataTable as normal relational data from the same table. I have been able to achieve this goal BUT I want to know if the community has any ideas on how I could speed up the process. I am using the XML value() function. It allows me to extract the data I need.

SELECT Name, Number, Reference FROM
(
    SELECT Name, Number,
           xmlvalues.value('(contact/Reference)[1]', 'varchar(40)') as Reference
    FROM MyTable
    WHERE Name = 'Some Dude'
      AND xmlvalues is not null
) T
GROUP BY Name, Number, Reference

Does anyone know if there are better XML functions to get this data out of the XML column? There is no schema, because each xml fragment has different tags and different values.

Regards
Niall
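If MyTable has a clustered primary key, a primary XML index on the column can make value() extraction noticeably faster, since the XML no longer has to be shredded at query time; the index name below is made up:

CREATE PRIMARY XML INDEX PXML_MyTable_xmlvalues
ON MyTable (xmlvalues)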

View Replies !   View Related
Fastest Way To Insert 4000+ Records
I'm writing a program that allows users to upload a CSV file.  This file is then separated into 4 datatables based on certain criteria, and each datatable is uploaded into my database.  I'm essentially adding new rows to the datatables and then running an update command on each using a TableAdapter.  The problem is that these CSV files can be large and can end up adding 4,000+ new records to the database, and the update commands take a while to do it.  I've sat for about five minutes on one run while it updated.  I put in some timing variables to see where all the time is spent; it takes only seconds to parse the data and separate it into the datatables, but minutes on the update commands.  Is there a more efficient way to insert this much data?

View Replies !   View Related
Which Is The Fastest Way To JOIN Having Millions Of Records?
If there is 13 million records in one table and 40 thousand records in another table then what is the fastest way of joining these two tables????

This was a question somebody asked me to which I couldn't give a proper answer. Could anybody give the answer, with proper reasons behind it?

Thanx.

View Replies !   View Related
Fastest Way To Build A Database Out Of SQL Scripts
What is the fastest way to generate a SQL 2000 database out of SQL scripts?
The SQL scripts contain the create statements for tables, views, stored procedures, triggers, and constraints, plus the tables' DATA records.
What are my options? isql? osql? Are there other ways?
Thank you
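Both isql and osql can run script files from the command line; a sketch with osql using integrated security (server, database, and file names are placeholders):

osql -S MYSERVER -d MyNewDatabase -E -i create_tables.sql
osql -S MYSERVER -d MyNewDatabase -E -i create_procs.sql
osql -S MYSERVER -d MyNewDatabase -E -i load_data.sql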

View Replies !   View Related
Fastest Way To Insert Range/sequence
I'd like to use a stored procedure to insert large amounts of records into a table. My field A should be filled with a given range of numbers. I do the following ... but I'm sure there is a better (faster) way:

select @start = max(A) from tbl where B = 'test1' and C = 'test2'
while @start <= 500000
begin
    insert into tbl (A, B, C)
    values (@start, 'test1', 'test2')
    set @start = @start + 1
end

Another question is: how do I prevent another user from inserting the same numbers into field A?

Thanks a lot for any help!
ratu
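A set-based insert driven from a numbers (tally) table usually beats the row-at-a-time loop; a sketch assuming a Numbers table of integers already exists. As for two users inserting the same numbers: a UNIQUE constraint (or the primary key) on A makes the second insert fail instead of silently colliding.

DECLARE @start int
SELECT @start = MAX(A) FROM tbl WHERE B = 'test1' AND C = 'test2'

INSERT INTO tbl (A, B, C)
SELECT n.Number, 'test1', 'test2'
FROM dbo.Numbers AS n
WHERE n.Number > @start
  AND n.Number <= 500000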

View Replies !   View Related
Fastest Way To Remove Large Table?
We have a table that we BCP into; the data is then processed and inserted into its appropriate table. Then the table or its data needs to be removed. This seems to be a very slow operation. I have tried DROP TABLE and TRUNCATE TABLE, and either takes nearly as long as the BCP operation. The table has 12 million rows. I didn't think either operation wrote to the transaction log except for page/extent management. Why are the drop and truncate so slow? Suggestions?

View Replies !   View Related
