Dirty Reads

Aug 1, 2001

If I'm doing a dirty read and someone updates a record while I'm reading it, is it possible to read both the old and the new version of the record, thereby retrieving two records?

View 2 Replies



Checking For Dirty Reads/writes

Apr 17, 2008



Problem Statement........


Let's say user A accesses a record and is making an update to a column... next user B accesses the same record, makes an update to the same column and saves the data... how can user A check whether an update has been made, to prevent overwriting user B's data?

Is there a query statement that user A can write to check for this?

I understand locking can be used to prevent this, but is there an alternative to locking?
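One common alternative to locking is optimistic concurrency: read a version marker (a rowversion column, or the original column values) together with the row, and make the UPDATE conditional on that marker being unchanged. A minimal sketch, assuming a hypothetical Accounts table to which a rowversion column named RowVer has been added:

DECLARE @OriginalRowVer binary(8);

-- User A reads the row and captures the current version marker
SELECT @OriginalRowVer = RowVer
FROM dbo.Accounts                  -- hypothetical table with a rowversion column RowVer
WHERE AccountID = 42;

-- Later, user A's update succeeds only if the row is still at that version
UPDATE dbo.Accounts
SET Balance = 100
WHERE AccountID = 42
  AND RowVer = @OriginalRowVer;

IF @@ROWCOUNT = 0
    PRINT 'Row was changed by another user since it was read; reload and retry.';

If no rowversion column can be added, the same check can be written against the original value of the column itself (WHERE TheColumn = @ValueReadEarlier).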

View 5 Replies View Related

Can REPLICATION On SQL Server 2000 Allow Dirty Reads

Dec 1, 2005

All my queries are being blocked while the tables are being replicated and it is causing some 2 minute blocking. Is there a way for the replication to allow dirty reads? I really don't care about that; I would rather have dirty reads than 2 minute waits. Thanks.
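If dirty reads really are acceptable, the usual workaround (not specific to replication; a hedged sketch with a hypothetical table name) is to run the blocked queries at READ UNCOMMITTED, either per session or per table with a hint:

-- Per session:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT OrderID, Status FROM dbo.Orders WHERE CustomerID = 7;

-- Or per table, with the NOLOCK hint:
SELECT OrderID, Status
FROM dbo.Orders WITH (NOLOCK)
WHERE CustomerID = 7;

The usual caveat applies: the queries can return uncommitted rows that are later rolled back.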

View 1 Replies View Related

Dirty Read

Jun 10, 1999

Can someone tell me what a dirty read is?
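In short, a dirty read is a read of data that another transaction has modified but not yet committed; if that transaction rolls back, the reader has seen values that never logically existed. A minimal two-session sketch (hypothetical table name):

-- Session 1: modify a row but do not commit yet
BEGIN TRANSACTION;
UPDATE dbo.Accounts SET Balance = 0 WHERE AccountID = 1;
-- ...transaction still open...

-- Session 2: reading at READ UNCOMMITTED sees the uncommitted Balance = 0
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT Balance FROM dbo.Accounts WHERE AccountID = 1;   -- dirty read

-- Session 1: roll back; the value session 2 just read never became permanent
ROLLBACK TRANSACTION;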

View 1 Replies View Related

Down And Dirty Database Infrastructure

Mar 29, 2006

Does anyone have a link, or know of an MS book(s), that details the underlying database structure, tables, and processes? Something that explains in detail how/why this stuff is configured and works, like DDL, TDS, varchar, int, index, tables, normalization, DML, Primary Key/Foreign Key.

View 4 Replies View Related

Keeping Dirty Data

Jan 24, 2008

This may be more of a data design question and not an SSIS question, but figured folks here could have a good idea... the organization I'm in has the business need of collecting data from outside organizations and tracking what data is bad and what data is good. When I say bad data I mean everything from things outside of range to absolute *** - characters in integer columns, integers in character columns, special characters, etc. The data comes in in the form of flat files, so it's a free-for-all until it hits SSIS and the DB engine.

Eventually, of course, they work to get the data corrected at the source and resubmitted, but in the meantime they have the legitimate need of not only pushing the data into the database (dirty or not), but keeping all the bad stuff. I can't in good conscience make everything a varchar to catch everything - that would go against the database gods. IMO I still must make an integer an integer, characters characters, etc. But what do I do with the junk? Any thoughts?
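One pattern that keeps the typed columns honest while still preserving the junk (a rough sketch under assumed, hypothetical table and column names) is to land the file into an all-varchar staging table, move the rows that convert cleanly into the typed table, and keep everything else, exactly as received, in a reject table with a reason:

-- Raw landing table: everything as text, exactly as received from the flat file
CREATE TABLE dbo.Stage_Measurements (
    RowID      int IDENTITY(1,1) PRIMARY KEY,
    SourceFile varchar(260),
    Quantity   varchar(50),      -- should be an int
    ReadingDt  varchar(50)       -- should be a date
);

-- Typed destination plus a reject table that keeps the original text
CREATE TABLE dbo.Measurements (RowID int PRIMARY KEY, Quantity int NOT NULL, ReadingDt datetime NOT NULL);
CREATE TABLE dbo.Measurements_Rejects (RowID int PRIMARY KEY, Quantity varchar(50), ReadingDt varchar(50), Reason varchar(200));

-- Rows that convert cleanly go to the typed table
-- (the checks are deliberately simplistic; later versions offer cleaner conversion functions)
INSERT dbo.Measurements (RowID, Quantity, ReadingDt)
SELECT RowID, CAST(Quantity AS int), CAST(ReadingDt AS datetime)
FROM dbo.Stage_Measurements
WHERE Quantity NOT LIKE '%[^0-9]%' AND Quantity <> '' AND ISDATE(ReadingDt) = 1;

-- Everything else is kept, dirty, with a reason
INSERT dbo.Measurements_Rejects (RowID, Quantity, ReadingDt, Reason)
SELECT RowID, Quantity, ReadingDt, 'failed type conversion'
FROM dbo.Stage_Measurements
WHERE Quantity LIKE '%[^0-9]%' OR Quantity = '' OR ISDATE(ReadingDt) <> 1;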

View 4 Replies View Related

Dirty Read With DataSource Control

Nov 19, 2007

Hi all,
Can anyone tell me how I can do a "Dirty Read" with a SqlDataSource control? I'm afraid that record locks are causing problems on the live system.
It is connected to an Informix db using ODBC. Thanks.
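The isolation level ultimately has to be requested in the command text or on the connection, rather than on the SqlDataSource control itself. A hedged sketch of what the SelectCommand (or a preceding statement on the connection) could contain; note the table hint form is SQL Server syntax, while Informix, the actual back end here, uses a session-level statement:

-- SQL Server flavour: table hint inside the SelectCommand (hypothetical table name)
SELECT CustomerID, Name FROM Customers WITH (NOLOCK);

-- Informix flavour: issue once on the ODBC connection before the SELECT
SET ISOLATION TO DIRTY READ;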

View 1 Replies View Related

BULK INSERT & DIRTY BUFFERS

Aug 26, 2005

Hi all

Using SQL 2000 MSDE

I'm bulk inserting about 3,200,000 records into a table.

Unfortunately, all memory disappears and never returns; the dirty buffers count goes up to approximately 48,000.

Any ideas on how to rectify this?
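With MSDE's capped buffer pool, one thing worth trying (a hedged suggestion, not a guaranteed fix; file and table names are hypothetical) is to break the load into batches, so each batch commits and its dirty pages can be flushed, instead of holding 3.2 million rows' worth of dirty buffers at once:

BULK INSERT dbo.TargetTable
FROM 'C:\loads\bigfile.dat'
WITH (
    BATCHSIZE = 50000,      -- commit every 50,000 rows instead of one giant batch
    TABLOCK                 -- reduce locking overhead during the load
);

-- Optionally force dirty pages to disk afterwards
CHECKPOINT;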

View 2 Replies View Related

SQL Server 2005 SP2 'dirty' Install Results In SSIS Failure

Feb 28, 2007

Hi all,

Hope this helps someone. My experience is that an interrupted install of SP2 screws things royally, despite signifying success.

My initial attempt at installing SQL Server 2005 SP2 (64-bit) on our dev server resulted in massive corruption of at least one SSIS solution. Without the option to uninstall the SP2, the prospect of a complete reinstall/recovery totally ruined my day.

When I loaded the SSIS solution, all packages returned the following connection-based error: "connection manager will not acquire a connection because the package OffLineMode property is TRUE". The OffLineMode property was in fact set to the default "False" for all packages. Individual tasks were broken also. I did not search all packages, but FTP and File System tasks were all corrupted. Nor could I successfully create these tasks in new packages, even within a new solution.

During the install I received the following error:
Error Log:
Product : Database Services (MSSQLSERVER)
Product Version (Previous): 2153
Product Version (Final) :
Status : Failure
Log File : C:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\LOG\Hotfix\SQL9_Hotfix_KB921896_sqlrun_sql.msp.log
Error Number : 29506
Error Description : MSP Error: 29506 SQL Server Setup failed to modify security permissions on file D:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data for user xxxxx. To proceed, verify that the account and domain running SQL Server Setup exist, that the account running SQL Server Setup has administrator privileges, and that exists on the destination drive.

The SP2 install continued and with the exception of database Services, all other components were installed successfully.

A very helpful person had posted this solution:

The problem is that there exists a file in this Data directory that the user running SP1 does not have permissions to modify. The workaround is to figure out which file this is (typically a user created DB or some backup program files) and to temporarily assign permission to that file(s). You can revert back to the intended permissions after running SP1.

Sure enough, there was a data and a log file with permission restricted to a single user. Granting permission to the account doing the install worked.

I ran the SP2 install again and the Database Services component "succeeded". In fact, as outlined above, my SSIS packages were unusable.

I re-ran the SP2 install. No components were selected for update, so I manually rechecked each component and executed. This time the install completed successfully for all components, and much to my relief, SSIS functionality was restored.

Pretty scrappy install package IMHO. There are obviously dependencies between the different components being updated, but if these dependencies, and therefore integrity of the update, are broken, the installer nevertheless reports success.

Cheers, Jeremy

View 8 Replies View Related

Many Reads

Mar 25, 2008

Hi!
I was assigned to solve performance problems for an application. I fired up SQL Server Profiler and started a trace. Downloaded SQL Server Trace Analyzer; it's a trial version so it's very limited. What I found is that one stored procedure generates almost 400,000 reads every time it's used, and it's used every time the user wants to see his orders. I've tried to translate the T-SQL to English from Swedish; it looks something like this:


select top 100
o.orderid,
o.name,
o.latestdeldate,
os.name as OrderStatus,
os.orderstatusID,
p.placeID,
p.name as place,
p.address,
p.city,
a.name as worktype,
noOfActions=(select count(*) from actions a where a.order_orderid=o.orderid),
noOfServiceObjects = (select count(*) from Serviceobject s, Actions a where s.Place_PlaceID = o.Place_PlaceID and a.order_orderid = o.orderid and a.Serviceobject_serviceobjectid = s.serviceobjectid),
...
...
...


It has 8 SELECT COUNT(*) subqueries in the select list, and then 2 more SELECT COUNT(*) in the WHERE clause.

I know it's very difficult for you to come up with a solution, but do you know a better way than using SELECT COUNT(*) everywhere? The counts are used to show different status flags on the website.
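One hedged approach (a sketch only, against guessed table and column names taken from the snippet above) is to compute each count once per order in a derived table and join to it, instead of running a correlated COUNT(*) subquery per column per output row:

SELECT TOP 100
       o.orderid,
       o.name,
       ISNULL(act.noOfActions, 0) AS noOfActions
FROM orders o
LEFT JOIN (SELECT a.order_orderid, COUNT(*) AS noOfActions
           FROM actions a
           GROUP BY a.order_orderid) AS act
       ON act.order_orderid = o.orderid
ORDER BY o.orderid;

Each of the other counts can become its own grouped derived table (or several can be combined where they share the same join key), so every base table is scanned once instead of once per row of the result.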

/Magnus

Jesus saves. But Gretzky slaps in the rebound.

View 19 Replies View Related

Reads / Writes Per Second.

Oct 30, 2006

How can you find the reads and writes per second of your hard drives in SQL Server? I am reading my SQL book and it says that your average disk should have 125 or fewer I/Os. It gave the formula, but as mentioned, I don't know how to find the reads and writes.
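Outside SQL Server, the Performance Monitor counters PhysicalDisk: Disk Reads/sec and Disk Writes/sec give this directly. From inside SQL Server (2000/2005), the cumulative reads and writes per database file are exposed by fn_virtualfilestats; sampling it twice and dividing the deltas by the elapsed seconds gives the per-second rates. A minimal sketch:

-- Cumulative I/O per file of the current database since SQL Server started
SELECT DbId, FileId, NumberReads, NumberWrites, IoStallMS
FROM ::fn_virtualfilestats(DB_ID(), -1);   -- -1 = all files of the database

-- Take a second sample N seconds later; (new - old) / N = reads and writes per second.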

View 4 Replies View Related

Reads, Clustering, Etc

May 1, 2008

server: QAT on clustering server ----> 23 seconds
----------------------------------------------------
SS 2000 developer edition SP4
win NT 5.2 (3790) SP4
MeM 7935 MB
processors 4
root directory C:program files...
use a fixed memory size 640 MB

reserve physical memory for sql server
minimum query memory 1024 kb

use all available processors
minimum query plan threshold for considering 5

PROFILER READS = 5234




server: MILLER ----> 3 seconds
----------------------------------------------------
SS 2000 developer edition no service pack
win NT 5.2 (3790) SP4
MeM 2047 MB
processors 4
root directory f:MSSQL$INAQAT

dynamically configure sql server memory

use all available processors
minimum query plan threshold for considering 5
PROFILER READS = 598





----------------------------------------------------
Making a long story short: I have an application that hits only one database, called RECORDS. I'm getting different durations when running the application: 23 seconds and 3 seconds.
Same database, same objects and same application.
SERVER QAT is our staging server, meaning lots of databases.
SERVER MILLER is just a server I just assembled, meaning just one database (RECORDS).

Not sure if it's the clustering server that is causing the issue, or the reads. If it's the reads, what is causing them? Do you think it is how the memory is configured? Will the experts please stand up?

View 20 Replies View Related

More Reads Than Expected

Jul 18, 2006

So I'm at a dead end looking for the reason behind the following behavior. Just to make sure no one misses it, the 'behavior' is the difference in the number of reads between using sp_executesql and not.

The following statements are executed against a SQL 2000 database that contains >1,000,000 records in the act_item table. They are run using Query Analyzer and the Duration and Reads come from SQL Profiler

SQL 1:
exec sp_executesql N'update act_item set Priority = @Priority where activity_code = @activity_code', N'@activity_code nvarchar(40),@Priority int', @activity_code = N'46DF335F-68F7-493F-B55E-5F9BC6CEBC69', @Priority = 0

Reads: ~22000
Duration: 250-350 ms

SQL 2:
DECLARE @Priority int
DECLARE @Activity_Code char(36)

SET @Priority = 0
SET @Activity_Code = '46DF335F-68F7-493F-B55E-5F9BC6CEBC69'
update act_item set Priority = @Priority where activity_code = @activity_code

Reads: ~160
Duration: 0 ms

Random information:

Activity_code is an indexed field on the table, although it is not the primary key. There are a total of four indexes on the table, none of which include the priority as one of the fields.
There are two triggers on the table, neither of which is executed for this SQL statement (there is an IF UPDATE(fieldname) surrounding the code in the trigger)
There are no foreign relationships
I checked (using perfmon) to see if a compilation/recompilation was happening. No it's not.
Any suggestions as to avenues that could be examined would be appreciated.
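One avenue worth checking (hedged; it is based only on the type difference visible between the two samples) is the parameter data type. The sp_executesql version declares @activity_code as nvarchar(40); if act_item.activity_code is a char/varchar column, the Unicode parameter can force an implicit conversion of the column that prevents an index seek, while the second batch, which declares char(36), seeks normally. Declaring the parameter to match the column type keeps the seek:

-- Assuming activity_code is char(36)/varchar in the table (not nvarchar)
EXEC sp_executesql
     N'update act_item set Priority = @Priority where activity_code = @activity_code',
     N'@activity_code char(36), @Priority int',
     @activity_code = '46DF335F-68F7-493F-B55E-5F9BC6CEBC69',
     @Priority = 0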

TIA

View 3 Replies View Related

SqlDataReader Reads From Second Row, Skipping The First Row

Jun 20, 2007

Hello,
I'm using SqlDataReader to read my data, and every time I loop through the reader it starts from the second row. Why is that?
Here is my code:
while (reader.Read())
{
    hinfo.Name = reader["_name"].ToString();
    hi.Add(hinfo);
}
I look at the database and I have two rows, but it's reading only the second row, skipping the first row.
 

View 2 Replies View Related

Log Reads In SQL Server 2005

May 31, 2006

I have a set of triggers that log the history of changes to a table - i.e. I record inserts, updates, deletes (pretty standard audit stuff I suppose). I want to also log reads on that data. If I were using sprocs for reading data, this would be relatively painless, but I am using an O/R mapper to handle my data access, which writes dynamic sql at runtime (and I don't want to use sprocs with it) and then sends it down to the DB. Is there a way I can intercept reads and log them to the same table I am logging other actions? I know very little about the new capabilities of SQL Server 2005, but I would think I could somehow, maybe via the new CLR capabilities or similar, get access to these types of events within the database? Anyone? I know I could always do this higher up in the application layers, but I would like to keep all of this at the database level if possible....Thanks,
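Triggers cannot fire on SELECT, so at the database level the closest 2005-era option is probably a server-side trace filtered on the table name (which works here because the O/R mapper's dynamic SQL contains the table name). A rough, hedged sketch; the event and column IDs are the standard Profiler ones, and the file path and table name are hypothetical:

DECLARE @traceid int, @maxsize bigint, @on bit
SET @maxsize = 50   -- MB per trace file
SET @on = 1

-- Create the trace (option 2 = file rollover)
EXEC sp_trace_create @traceid OUTPUT, 2, N'C:\Traces\read_audit', @maxsize, NULL

-- Capture SQL:BatchCompleted (12) and RPC:Completed (10): TextData (1), LoginName (11), StartTime (14)
EXEC sp_trace_setevent @traceid, 12, 1,  @on
EXEC sp_trace_setevent @traceid, 12, 11, @on
EXEC sp_trace_setevent @traceid, 12, 14, @on
EXEC sp_trace_setevent @traceid, 10, 1,  @on
EXEC sp_trace_setevent @traceid, 10, 11, @on
EXEC sp_trace_setevent @traceid, 10, 14, @on

-- Keep only statements that mention the audited table (column 1 = TextData, comparison 6 = LIKE)
EXEC sp_trace_setfilter @traceid, 1, 0, 6, N'%MyAuditedTable%'

-- Start it; the .trc file can later be read with fn_trace_gettable and inserted into the audit table
EXEC sp_trace_setstatus @traceid, 1

It is coarser than the insert/update/delete triggers (it logs statements, not rows), but it stays entirely at the database level.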

View 1 Replies View Related

High Page Reads

Jan 17, 2002

SQL 6.5 - 5.5 Gig
NT

Hello,

Throughout the day our document management application generates high bursts of physical page reads when users query the database.

What SQL configuration parameter(s) should I check/modify to ensure that the database is performing at its optimum during these bursts?

Thank You in advance.

View 1 Replies View Related

COUNT Of READS And WRITES On A 6.5 Db.

Jul 21, 2000

Is there a way to get a total count of all SELECT, UPDATE, DELETE and INSERT statements to a SQL Server 6.5 database during a 12 hour period? I'm thinking maybe someone knows of software that reads the log or monitors the server... I've been looking at the performance monitor and, although it has good information, it doesn't capture DML.

FYI - it's for capacity planning.

TIA,
Mike

View 1 Replies View Related

Reducing Reads Question

Aug 24, 2007

I'm trying to insert all the rows from a table to a new table.
(insert A select * from AA)
The Reads column in Profiler shows a really high value (10,253,548).

First I created a unique clustered index and the reads showed 3,258,445; then I created a nonclustered index, expecting to get lower reads. Instead the reads show 10,253,548.

I read that creating indexes helps reduce reads, but it's not happening.
Any ideas what is going on?
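For a straight table-to-table copy, each index on the target table has to be maintained row by row during the INSERT, which adds reads and writes rather than reducing them; indexes help the SELECT side, and SELECT * FROM AA is a full scan no matter what. A hedged sketch of the usual pattern, reusing the thread's table names (the index column is hypothetical):

-- Load first, with as few indexes on the target as possible
INSERT INTO A WITH (TABLOCK)
SELECT * FROM AA

-- Build the nonclustered index(es) afterwards, once, instead of maintaining them per inserted row
CREATE NONCLUSTERED INDEX IX_A_SomeColumn ON A (SomeColumn)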

=============================
http://www.sqlserverstudy.com

View 6 Replies View Related

Track Reads And Writes

Mar 5, 2008

Guys,

Is there any way to track which tables have the most reads and writes in a database of 400 tables?
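On SQL Server 2005 and later (an assumption; the post doesn't say which version), sys.dm_db_index_usage_stats can be aggregated per table: seeks, scans and lookups roughly correspond to reads, and user_updates to writes. A minimal sketch; note the counters reset when the service restarts:

SELECT OBJECT_NAME(s.object_id) AS table_name,
       SUM(s.user_seeks + s.user_scans + s.user_lookups) AS reads,
       SUM(s.user_updates) AS writes
FROM sys.dm_db_index_usage_stats AS s
WHERE s.database_id = DB_ID()
  AND OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
GROUP BY OBJECT_NAME(s.object_id)
ORDER BY reads DESC;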

Thanks

View 9 Replies View Related

Number Of Reads In Profiler

Jul 27, 2007

Hi,

Can any of you explain what the "Reads" column in Profiler exactly means? I'm not comfortable with the explanation given in BOL.


"The number of read operations on the logical disk that are performed by the server on behalf of the event. These read operations include all reads from tables and buffers during the statement's execution"

For the same procedure with the same parameters, if the server is not loaded much the Reads are in the few hundreds, but when there are more than 1000 concurrent users, why does it go into the millions? What other parameters affect these reads? And how can I reduce them?

Environment: SQL Server 2005 64-bit Enterprise Edition on Windows Server 2003 R2 Server x64 Enterprise Edition SP2


Thanks in Advance.

Regards

Babu

View 4 Replies View Related

Transaction Lockout Of Reads

Aug 28, 2006

Hi,

I have been seeing a basic scenario of a write transaction appearing to unexpectedly lock-out reading.

The database has isolation set to "READ COMMITTED".

The scenario is:

1.) Start a transaction (for doing a write)

2.) Do a read before the transaction (for doing the write) is committed (e.g. sqlCommand2.ExecuteReader()).

--> the code will appear to lock-up (then time out).

I see the same behavior if I step through the "write" code with the debugger (to a point after the transaction is started, but before it is committed), and run a "SELECT * FROM" type query from Microsoft SqlServer Management Studio.

Following is a code sample that demonstrates the issue.

Thoughts on how to resolve the issue (to let me do "read committed" reading of the database table)?

Thanks!

Andy







Module Transaction

Sub Main()

Dim exception1 As Exception

Try

' Create/Open Database Connection

Dim sqlConnection1 As New System.Data.SqlClient.SqlConnection("Server=GRB-AB;Database=Transaction;Trusted_Connection=True;")

sqlConnection1.Open()

' Start transaction

Dim sqlTransaction1 As System.Data.SqlClient.SqlTransaction = sqlConnection1.BeginTransaction()

' Set Parent record

Dim sqlCommand1 As New System.Data.SqlClient.SqlCommand("INSERT INTO Parent (Name) VALUES ('ParentValue');", sqlConnection1)

sqlCommand1.Transaction = sqlTransaction1

sqlCommand1.ExecuteNonQuery()

' Get Id from parent record (note: this code assumes the table was empty when this program starts)

sqlCommand1 = New System.Data.SqlClient.SqlCommand("SELECT Id FROM Parent;", sqlConnection1)

sqlCommand1.Transaction = sqlTransaction1

Dim parentId As Integer = CType(sqlCommand1.ExecuteScalar(), Integer)



'

' Do reading test to test concurrently reading table being written to

'

' Create/Open Database Connection for reading test

Dim sqlConnection2 As New System.Data.SqlClient.SqlConnection("Server=GRB-AB;Database=Transaction;Trusted_Connection=True;")

sqlConnection2.Open()

Dim sqlCommand2 As New System.Data.SqlClient.SqlCommand("SELECT Id FROM Parent;", sqlConnection2)

' Call ExecuteReader only once and reuse the returned reader
Dim reader2 As System.Data.SqlClient.SqlDataReader = sqlCommand2.ExecuteReader() ' <===== LOCKS UP HERE **************

Dim i As Integer

While reader2.Read()

i = i + 1

End While

reader2.Close()

'

' End reading test

'



' Set child record

sqlCommand1 = New System.Data.SqlClient.SqlCommand( _

"INSERT INTO Child (Name, ParentId) VALUES ('ChildValue', " & parentId.ToString & ");", sqlConnection1)

sqlCommand1.Transaction = sqlTransaction1

sqlCommand1.ExecuteScalar()

' Either 1.) commit transaction OR 2.) rollback transaction

Dim test As Boolean = False

If test = False Then

sqlTransaction1.Commit()

Else

sqlTransaction1.Rollback()

End If

sqlConnection1.Close()

sqlConnection2.Close()

Catch ex As Exception

exception1 = ex

End Try

End Sub

End Module
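Under the default (locking) implementation of READ COMMITTED, the SELECT on sqlConnection2 must wait for the exclusive lock held by the uncommitted INSERT on sqlConnection1, which is exactly the lock-up marked above. On SQL Server 2005 and later, one hedged option is to enable READ_COMMITTED_SNAPSHOT so that readers see the last committed version of the row instead of blocking (this changes behaviour database-wide and needs the database to be free of other connections while it is switched on):

-- Database name taken from the connection string in the sample
ALTER DATABASE [Transaction] SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE

The reading connection then stays at plain READ COMMITTED and simply no longer blocks behind in-flight writers.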

View 1 Replies View Related

SQL CLR Stored Proc Reads

Sep 19, 2006

I have written the same stored proc in T-SQL and SQL CLR; it basically takes an input XML and returns an XML document. In SQL Profiler, I am getting a Reads value about five times higher for the CLR version. Does anyone have any idea why the CLR is doing more reads than T-SQL? Thanks in advance.

View 5 Replies View Related

Set READ UNCOMMITTED (dirty Read) At Login.

Jul 23, 2005

Is it possible to set READ UNCOMMITTED for a user connecting to an SQL 2000 server instance? I understand this can be done via a front end application. But what I am looking to do is to assign this to a specific user when they log in to the server via any entry application. Can this be set with a trigger?

View 1 Replies View Related

Query Logical, Scan Reads?

Dec 22, 2000

Hi Everybody,

One of my friends asked me, "How do we reduce the query logical and scan reads in SQL Server?"

I really don't know how to answer him.

Can anybody explain this to me?

thanks,
Srini

View 2 Replies View Related

SQL 2012 :: Deadlocking Under Repeatable Reads

May 5, 2015

Just migrated an application from Oracle to SQL Server and we are seeing a lot of deadlocking and blocking. I did notice that the app seems to be passing an isolation level of repeatable read. Attached is a .doc of one of the deadlocks. Is there a way to avoid these in the repeatable read isolation level? This example is a select with two tables, using NCIs that cover the where clause, and an insert doing just a clustered index insert. Is the answer simply to get rid of repeatable read if it's not needed (I guess I have to check with the vendor on that), or is there a way to get this not to deadlock while using repeatable read?

View 2 Replies View Related

Audit Logout, High Reads

May 2, 2007

Hi,

I'm trying to figure out why my SQL Server is flatlined on the CPU. I'm doing a trace and can't help but notice this, with crazy high reads. I'm not sure what this is? It doesn't look good to me, although maybe it's nothing. Any info is much appreciated.

Thanks again!
mike123



Event Class/ TextData/ApplicationName/ LoginName/ CPU/ Reads/ Writes/ Duration

Audit Logout.Net SqlClient Data ProviderloginName3764129784 3146 156

View 3 Replies View Related

Profiler Not Reporting Reads Accurately

Jul 23, 2005

I am running a profiler trace against a database and noticed that the reads column always shows 0. When running the same trace against another machine I get back values in the reads column. I took a query that Profiler reported as having 0 reads and ran it in Query Analyzer with STATISTICS IO on and confirmed that there are in fact reads:

Table 'tt_cawardalloc'. Scan count 1, logical reads 8, physical reads 0, read-ahead reads 1.
Table 'tt_clineitem'. Scan count 10, logical reads 125208, physical reads 1540, read-ahead reads 2995.
Table 'tt_contractitem'. Scan count 32, logical reads 676, physical reads 0, read-ahead reads 0.
Table 'tt_contract2'. Scan count 3, logical reads 121, physical reads 4, read-ahead reads 0.

I am on SQL 2000 SP3a. Any help appreciated. Thanks!

View 8 Replies View Related

Number Of Reads In Profiler Is Not The Same When Running The Same SP On Different PCs

Apr 17, 2007

I'm running the same query on two different PCs and tracing the results in Profiler on my PC. When executing the query on PC1, the total number of reads is 200,000. When executing the same query on PC2, the total number of reads is 13,000. That is almost 15 times more reads on one PC than on the other. The executed query is the same on PC1 and PC2. Any reason for this?



I'm trying to analyse that query and reduce the number of logical reads, as it is too high, but then I get completely different results on different PCs.



Thanks.

View 5 Replies View Related

SQL Profiler: Interpret CPU, Reads, Duration

Jun 2, 2008

I ran SQL Profiler and got the following results for a stored procedure:

CPU of 1078;
Reads of 125464
writes of 0
Duration of 1882

How do I interpret the above results?
Also, what CPU and Duration values are considered high, indicating a poorly performing query?

I am using SQL Server 2005. Thanks

View 4 Replies View Related

Profile Logical Reads Versus STATISTICS IO

May 5, 2015

Why is there often such a dramatic discrepancy between the logical reads recorded in the trace file versus the output of STATISTICS IO?

In the server-side trace I have running I found a reporting procedure that shows having 136,949,501 reads (yes, in hundreds of millions), and it's taking 13,508 seconds to complete.

So I pull the code from the trace and execute it via SSMS - it runs < 1 second, and only generates about 4,000 reads (using various different parameters I get the same result)

The execution plan shows nothing unusual
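One possibility worth ruling out (an assumption, not a diagnosis) is that the application connection and the SSMS session run with different SET options (ARITHABORT is the usual suspect), so the same procedure gets a different cached plan for each, and the traced plan is the bad one. Comparing the session options, or the set_options attribute on the cached plans, is a quick check; the procedure name below is a placeholder:

-- Run from SSMS and from the application's connection, then compare the output
DBCC USEROPTIONS;

-- Or inspect the SET options attached to each cached plan of the procedure
SELECT cp.usecounts, cp.plan_handle, pa.attribute, pa.value
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_plan_attributes(cp.plan_handle) AS pa
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%YourReportingProcName%'   -- hypothetical name
  AND pa.attribute = 'set_options';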

View 5 Replies View Related

SQL Server 2008 :: Disk Reads And Writes

Nov 5, 2015

How can I measure the disk reads and writes to see if I need to add additional disks to the server?
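One way to get per-file numbers from inside SQL Server 2008 (a sketch; Performance Monitor's PhysicalDisk counters give the OS-level view and the latency half of the picture) is sys.dm_io_virtual_file_stats, which accumulates reads, writes and I/O stalls per database file since the service started:

SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.physical_name,
       vfs.num_of_reads,
       vfs.num_of_writes,
       vfs.io_stall_read_ms,
       vfs.io_stall_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
ORDER BY vfs.num_of_reads + vfs.num_of_writes DESC;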

View 2 Replies View Related

Query Optimization: CPU Speed Or Logical Reads Better?

Dec 12, 2005

How do I determine which method I should use if I want to optimize the performance of a database? I took the Northwind database to run my example. My query is: I want to retrieve the Employees' First and Last Names that sold between $100,000 and $200,000.

First let me create a function that takes the EmployeeID as the input parameter and returns the Employee's First and Last name:

CREATE FUNCTION dbo.GetEmployeeName(@EmployeeID INT)
RETURNS VARCHAR(100)
AS
BEGIN
DECLARE @NAME VARCHAR(100)
SELECT @NAME = FirstName + ' ' + LastName
FROM Employees
WHERE EmployeeID = @EmployeeID
RETURN ISNULL(@NAME, '')
END

My first method to run this:

SELECT EmployeeID, dbo.GetEmployeeName(EmployeeID) AS Employee, SUM(UnitPrice * Quantity) AS Amount
FROM Orders
JOIN [Order Details] ON Orders.OrderID = [Order Details].OrderID
GROUP BY EmployeeID, dbo.GetEmployeeName(EmployeeID)
HAVING SUM(UnitPrice * Quantity) BETWEEN 100000 AND 200000

It runs in 4 seconds. And here are the Statistics IO and Time results:

SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server parse and compile time: CPU time = 17 ms, elapsed time = 17 ms.
(3 row(s) affected)
Table 'Order Details'. Scan count 1, logical reads 10, physical reads 0, read-ahead reads 0.
Table 'Orders'. Scan count 1, logical reads 21, physical reads 0, read-ahead reads 0.
SQL Server Execution Times: CPU time = 3844 ms, elapsed time = 3934 ms.
SQL Server Execution Times: CPU time = 3844 ms, elapsed time = 3935 ms.
SQL Server Execution Times: CPU time = 3844 ms, elapsed time = 3935 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 0 ms.

Now my 2nd method:

IF (SELECT OBJECT_ID('tempdb..#temp_Orders')) IS NOT NULL
DROP TABLE #temp_Orders
GO
SELECT EmployeeID, SUM(UnitPrice * Quantity) AS Amount
INTO #temp_Orders
FROM Orders
JOIN [Order Details] ON Orders.OrderID = [Order Details].OrderID
GROUP BY EmployeeID
HAVING SUM(UnitPrice * Quantity) BETWEEN 100000 AND 200000
GO
SELECT EmployeeID, dbo.GetEmployeeName(EmployeeID), Amount
FROM #temp_Orders
GO

It runs in 0 seconds. And here are the Statistics IO and Time results:

SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 0 ms.
Table '#temp_Orders0000000000F1'. Scan count 0, logical reads 1, physical reads 0, read-ahead reads 0.
Table 'Order Details'. Scan count 830, logical reads 1672, physical reads 0, read-ahead reads 0.
Table 'Orders'. Scan count 1, logical reads 3, physical reads 0, read-ahead reads 0.
SQL Server Execution Times: CPU time = 15 ms, elapsed time = 19 ms.
(3 row(s) affected)
SQL Server Execution Times: CPU time = 15 ms, elapsed time = 19 ms.
SQL Server Execution Times: CPU time = 15 ms, elapsed time = 20 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 1 ms.
(3 row(s) affected)
Table '#temp_Orders0000000000F1'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 3 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 3 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 3 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 0 ms.

By the way, why does "SQL Server Execution Times" appear 3 times and not just one time?

Summary:
The first code is clean, a single SELECT statement, but takes 4 long seconds to execute. The logical reads are very few compared to the second method.
The second code is less clean and uses a temp table, but takes 0 seconds to execute. The logical reads are way too high compared to the first method.

What am I supposed to conclude in this example? Which method should I use over the other, and why? Are both methods good depending on which I prefer? If I can wait four seconds, is it better to reduce the logical reads in order to cause less blocking on the live tables in a heavily accessed database? Which method should I choose on my own database?

Calling a function like dbo.GetEmployeeName gets processed per each returned row, correct? That means if I had a scenario where 1000 records were to be returned, would it be better to dump 1000 records to a temp table variable and then call a function to process each record one at a time? Or would the direct approach without using a temp table cause slower processing and more blocking/deadlocks, because I am calling the function per each row as I am accessing directly from the tables?

Thank you

View 1 Replies View Related

Reads And Writes To A Sql Server Database Per Table

Aug 1, 2006

Is it possible to find the reads/writes to a SQL Server table?

View 2 Replies View Related






