I am running an update query and it is taking a long time. To find the estimated completion time I checked sys.dm_exec_requests, sys.dm_exec_sessions, and sp_who2, but there is no clue; the value shows as zero.
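For reference, a minimal sketch of the check against sys.dm_exec_requests (the session_id below is hypothetical). Note that percent_complete and estimated_completion_time are only populated for a limited set of commands such as BACKUP, RESTORE, DBCC and rollbacks, which is why they stay at zero for an ordinary UPDATE.

Code:
SELECT session_id, command, percent_complete, estimated_completion_time
FROM sys.dm_exec_requests
WHERE session_id = 53;   -- hypothetical SPID of the running UPDATE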
As in the title: is there any tool for this? I'm asking because I have some big databases, and processing may take a lot of time (at least a few hours), and I'd be glad if it were possible to know the estimated time before the query runs. I'm using MS SQL 2005 Developer Edition.
How do I calculate the estimated completion time of a job, and the variance/difference from previous job history? I'm looking for a T-SQL query which can accomplish this. For example: a job normally takes 10 minutes to complete. Today, for some reason, the job has been running for over an hour and is still running. It could be a blocking issue or a performance issue on the server.
In such cases I want a T-SQL query or stored procedure which monitors these jobs every 3 minutes (a configurable value). Every 3 minutes the query has to check whether any jobs are taking more time than their usual/average completion time, and in that case send an email using the Database Mail functionality, i.e. sp_send_dbmail. From there, the DBA can dig further using waits, a SQL trace, etc.
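A minimal sketch of that check, assuming SQL Server 2005 or later, that the jobs run under SQL Server Agent, and that a Database Mail profile named 'DBA' exists (all assumptions); the 2x threshold and the schedule are placeholders to adjust.

Code:
-- Compare currently running jobs against their average historical duration.
;WITH avg_duration AS (
    SELECT  h.job_id,
            AVG(h.run_duration / 10000 * 3600          -- run_duration is HHMMSS
              + h.run_duration / 100 % 100 * 60
              + h.run_duration % 100) AS avg_seconds
    FROM msdb.dbo.sysjobhistory AS h
    WHERE h.step_id = 0 AND h.run_status = 1           -- successful job outcomes
    GROUP BY h.job_id
)
SELECT  j.name,
        DATEDIFF(SECOND, a.start_execution_date, GETDATE()) AS running_seconds,
        d.avg_seconds
FROM msdb.dbo.sysjobactivity AS a
JOIN msdb.dbo.sysjobs AS j ON j.job_id = a.job_id
JOIN avg_duration AS d ON d.job_id = a.job_id
WHERE a.start_execution_date IS NOT NULL
  AND a.stop_execution_date IS NULL                    -- still running
  AND DATEDIFF(SECOND, a.start_execution_date, GETDATE()) > 2 * d.avg_seconds;
-- If this returns rows, call msdb.dbo.sp_send_dbmail @profile_name = 'DBA', ...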
When viewing an estimated query plan for a stored procedure with multiple query statements, two things stand out to me and I wanted to get confirmation if I'm correct.
1. Under <ParameterList><ColumnReference... does the xml attribute "ParameterCompiledValue" represent the value used when the query plan was generated?
2. Does each query statement that makes up the stored procedure have its own execution plan? In other words, is the stored procedure made up of multiple query plans that could have been generated at different times from other parts of that stored procedure?
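A minimal sketch of one way to see this, assuming the procedure is named dbo.YourProcedure (a hypothetical name): sys.dm_exec_query_stats keeps one row per cached statement, with the statement offsets and the time each entry was created.

Code:
SELECT  qs.plan_handle,
        qs.statement_start_offset,
        qs.statement_end_offset,
        qs.creation_time,
        SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
            (CASE WHEN qs.statement_end_offset = -1
                  THEN DATALENGTH(st.text)
                  ELSE qs.statement_end_offset
             END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.objectid = OBJECT_ID('dbo.YourProcedure')     -- hypothetical name
ORDER BY qs.statement_start_offset;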
I am writing a client application that shows estimated query plans and statistics. I know how to obtain estimated plans by using SQL Server Management Studio, but is it possible to obtain them by using database functions?
I have found sys.dm_exec_query_plan, but it seems that this function can only be used for executed (or executing) queries...
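One way that should work from a client connection (a sketch, not tied to any particular driver): turn on SHOWPLAN_XML, after which each batch is compiled but not executed, and the estimated plan comes back as an XML result set. The SET statement must be the only statement in its batch.

Code:
SET SHOWPLAN_XML ON;
GO
SELECT name FROM sys.objects WHERE object_id = 1;   -- example query; not executed
GO
SET SHOWPLAN_XML OFF;
GO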
Hi, I have a question about estimated query execution plans that are generated in QA of MSSQL. If I point at an icon/physical operator in the estimated QEP, it shows me some statistics about the operator. Is there a way to retrieve these statistics through a query, i.e., can these statistics be made available to the user? Also, is there a way to generate these statistics on my own? Thanks in advance -TC.
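A minimal sketch of one way to get those operator-level figures as a rowset (the SELECT is only an example): SET SHOWPLAN_ALL returns the estimated plan as rows, with columns such as PhysicalOp, EstimateRows, EstimateIO, EstimateCPU and TotalSubtreeCost, which are the numbers shown in the tooltip.

Code:
SET SHOWPLAN_ALL ON;
GO
SELECT name FROM sysobjects WHERE type = 'U';   -- example query; compiled, not executed
GO
SET SHOWPLAN_ALL OFF;
GO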
Hi there, I have an UPDATE statement that updates a field of a table (~15,000,000 records). It took around 3 hours to finish 2 weeks ago. After that no one touched the server and no configuration changed. Yesterday I re-ran it and it took more than 18 hours and still had not finished! What's wrong with it? I ran it successfully before. I have tried two times but the result was still the same.

My SQL statement is:

update [all_sales] a
set a.accounting_month = b.accounting_month
from date_map b
where a.sales_date >= b.start_date and a.sales_date < b.end_date;

An index on [all_sales].sales_date was built successfully.
A composite index on ([date_map].start_date, [date_map].end_date) was built successfully.

My server config is:
SQL Server 2000 with Service Pack 3
Windows 2000 with Service Pack 4
DELL PowerEdge 6650 Server
DUAL XEON 1900MHz Processors
2G RAM
2G Page File on Drive C
2G Page File on Drive D
DELL Diagnostics on all SCSI hard disks all PASSED.

Could any experts simply give me some help? Thanks x 1,000,000,000
I have to create a task that checks the execution time of a package, and if it is more than a specified time, sends a mail with a custom message to some specific users.
or
Can I write an event which fires after a specific time and sends a mail?
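A minimal sketch, assuming the package runs as a SQL Server Agent job named 'LoadPackage' and that a Database Mail profile named 'DBA' exists (all hypothetical names); schedule this check to run periodically.

Code:
DECLARE @threshold_minutes int;
SET @threshold_minutes = 30;                            -- configurable limit

IF EXISTS (
    SELECT 1
    FROM msdb.dbo.sysjobactivity AS a
    JOIN msdb.dbo.sysjobs AS j ON j.job_id = a.job_id
    WHERE j.name = N'LoadPackage'
      AND a.start_execution_date IS NOT NULL
      AND a.stop_execution_date IS NULL                 -- still running
      AND DATEDIFF(MINUTE, a.start_execution_date, GETDATE()) > @threshold_minutes
)
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'DBA',
        @recipients   = N'team@example.com',
        @subject      = N'Package running longer than expected',
        @body         = N'LoadPackage has exceeded the configured execution time limit.';
END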
Hi

I have two linked SQL Servers and I am trying to get remote writes working correctly (fast). I have configured the DB link on both machines to point at each other's DB. I have security set up to map each other's server logins, and Server Options: Collation Compatible, Data Access, RPC, RPC Out, Use Remote Collation all checked.

My problem is that when an SP performs

Begin Transaction
Update Local Table
Update Remote Table
Commit Tran

it takes several seconds to complete (about 7 seconds, not acceptable to us). This is due to the remote update - how can I improve the response time?

Example of a stored procedure that takes time, where ACSMSM is a remote (linked) SQL Server:

procedure [psm].ams_Update_VFE
@strResult varchar(8) = 'Failure' output,
@strErrorDesc varchar(512) = 'SP Not Executed' output,
@strVFEID varchar(16),
@strDescription varchar(64),
@strVFEVirtualRoot varchar(255),
@strVFEPhysicalRoot varchar(255),
@strAuditPath varchar(255),
@strDefaultBranding varchar(16),
@strIPAddress varchar(23)
as
declare @strStep varchar(32)
declare @trancount int

Set XACT_ABORT ON
set @trancount = @@trancount
set @strStep = 'Start of Stored Proc'

if (@trancount = 0)
  BEGIN TRANSACTION mytran
else
  save tran mytran

/* start insert sp code here */

set @strStep = 'Write VFE to MSM'

update ACSMSM.msmprim.msm.VFECONFIG
set DESCRIPTION = @strDescription,
    VFEVIRTUALROOT = @strVFEVirtualRoot,
    VFEPHYSICALROOT = @strVFEPhysicalRoot,
    AUDITPATH = @strAuditPath,
    DEFAULTBRANDING = @strDefaultBranding,
    IPADDRESS = @strIPAddress
where VFEID = @strVFEID;

set @strStep = 'Write VFE to PSM'

update ACSPSM.psmprim.psm.VFECONFIG
set DESCRIPTION = @strDescription,
    VFEVIRTUALROOT = @strVFEVirtualRoot,
    VFEPHYSICALROOT = @strVFEPhysicalRoot,
    AUDITPATH = @strAuditPath,
    DEFAULTBRANDING = @strDefaultBranding,
    IPADDRESS = @strIPAddress
where VFEID = @strVFEID

/* end insert sp code here */

if (@@error <> 0)
begin
  rollback tran mytran
  set @strResult = 'Failure'
  set @strErrorDesc = 'Fail @ Step :' + @strStep + ' Error : ' + @@Error
  return -1969
end
else
begin
  set @strResult = 'Success'
  set @strErrorDesc = ''
end

-- commit tran if we started it
if (@trancount = 0)
  commit tran

return 0
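One commonly suggested change, shown as a sketch only (ams_Update_VFECONFIG is a hypothetical procedure that would have to be created on the linked server): wrap the remote UPDATE in a stored procedure on the remote server and call it by its four-part name, so the statement is compiled and executed entirely on the remote side rather than as a distributed UPDATE. Whether this helps depends on how much of the 7 seconds is the remote statement itself versus the distributed-transaction overhead.

Code:
-- Inside ams_Update_VFE, replacing the remote UPDATE against ACSMSM:
exec ACSMSM.msmprim.msm.ams_Update_VFECONFIG      -- hypothetical remote procedure
     @strVFEID, @strDescription, @strVFEVirtualRoot, @strVFEPhysicalRoot,
     @strAuditPath, @strDefaultBranding, @strIPAddress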
We are running a SQL Server 2008 R2 64-bit database system on a Windows Server 2012 R2 64-bit Standard system. I have noticed in recent weeks that our differential backups periodically take longer than expected to complete. The usual amount of time is about one hour, but on several occasions it has taken upwards of five hours. The nights when the job takes longer to complete are Fridays.
I did some checking online, and one possible reason for this issue is that I schedule the reindexing of the database on the morning of the differential backup. For example, this past Friday the reindexing ran at 1:00 AM with the differential running at 10:00 PM that night. The article I read suggested that the reindexing, which takes several minutes at most to complete, should be scheduled to run just before the full backup job.
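That explanation fits what differential backups do: an index rebuild rewrites a large share of the database's extents, so the next differential has far more changed extents to copy. A minimal sketch for confirming it from the backup history in msdb (the database name is a placeholder):

Code:
SELECT  bs.database_name,
        bs.backup_start_date,
        DATEDIFF(MINUTE, bs.backup_start_date, bs.backup_finish_date) AS duration_minutes,
        bs.backup_size / 1048576 AS backup_size_mb
FROM msdb.dbo.backupset AS bs
WHERE bs.type = 'I'                          -- differential backups
  AND bs.database_name = N'YourDatabase'     -- hypothetical name
ORDER BY bs.backup_start_date DESC;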
I have a big table and want to run a plausibility check on its data.
The problem is that my query stops if there is an unexpected datatype in one of the rows. But that is exactly what I want to filter out of my table with this query, saving the result as a new, correct table.
How can I write my query so that if an error occurs, the query resumes and the offending row is not included in my final result?
In my books and on the net, I haven't found anything on this topic ;-(.
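A minimal sketch of the filtering approach, assuming a staging table dbo.BigTable with an nvarchar column value_text that should contain dates (both names are hypothetical). ISDATE keeps only convertible rows; on SQL Server 2012 or later, TRY_CONVERT returns NULL instead of raising an error.

Code:
SELECT *
INTO dbo.BigTable_Clean                    -- the new, correct table
FROM dbo.BigTable
WHERE ISDATE(value_text) = 1;              -- drop rows that cannot be converted

-- SQL Server 2012+ alternative:
-- SELECT *, TRY_CONVERT(datetime, value_text) AS value_dt
-- FROM dbo.BigTable
-- WHERE TRY_CONVERT(datetime, value_text) IS NOT NULL;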
Hello -- I do not know if this group can help me with SQL Server Reporting Services, but here it goes. I have a report built on a query and I want to be able to send a parameter (@theWhere) that is a string containing a WHERE clause, e.g. "(MEGLOMART_TYP_CODE <> 'XYZ' AND MEGLOMART_STAFF_SHORTAGE > 25)". These strings can vary depending on what columns the user selects and what operators they want to use. Generation of proper SQL for the WHERE clause has been verified; I just need to be able to pass these strings in. Is there any way to do this? See the example query below and how I was planning on using the @theWhere variable.
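A minimal sketch of the usual dynamic-SQL approach for this (procedure, table and column names are hypothetical). Note that concatenating a caller-supplied WHERE clause is open to SQL injection, so the string must come only from trusted code.

Code:
CREATE PROCEDURE dbo.rpt_Meglomart
    @theWhere nvarchar(2000)
AS
BEGIN
    DECLARE @sql nvarchar(4000);
    SET @sql = N'SELECT MEGLOMART_TYP_CODE, MEGLOMART_STAFF_SHORTAGE
                 FROM dbo.MEGLOMART
                 WHERE ' + @theWhere;
    EXEC sys.sp_executesql @sql;           -- the report dataset calls this proc
END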
Try this script to see what queries are taking over a second. To get some real output, you need a long-running query. Here's one (estimated to take over an hour):

PRINT GETDATE()
select count_big(*)
from sys.objects s1, sys.objects s2, sys.objects s3,
sys.objects s4, sys.objects s5
PRINT GETDATE()

Output is:

session_id elapsed task_alloc task_dealloc runningSqlText FullSqlText query_plan
51 32847 0 0 select count_big(*) from sys.objects s1, sys.objects s2, sys.objects s3, sys.objects s4, sys.objects s5 SQL Plan

Clicking on SQL opens the full SQL batch as a .txt file, including the PRINT statements. Clicking on Plan allows you to see the .sqlplan file in MSSMS.

========
Title: Using a VB Script to show long-running queries, complete with query plans.

Today (July 14th), I found a query running for hours on a development box. Rather than kill it, I decided to use this opportunity to develop a script to show long-running queries, so I could see what was going on. (Reference Roy Carlson's article for the idea.)

This script generates a web page which shows long-running queries with the currently-executing SQL command, full SQL text, and .sqlplan files. The full SQL query text and the sqlplan file are output to files in your temp directory. If you have SQL Management Studio installed on the local computer, you should be able to open the .sqlplan to see the query plan of the whole batch for any statement.

'LongestRunningQueries.vbs
'By Aaron W. West, 7/14/2006
'Idea from:
'http://www.sqlservercentral.com/columnists/rcarlson/scriptedserversnapshot.asp
'Reference: Troubleshooting Performance Problems in SQL Server 2005
'http://www.microsoft.com/technet/prodtechnol/sql/2005/tsprfprb.mspx

Sub Main()
Const MinimumMilliseconds = 1000

Dim srvname
If WScript.Arguments.count > 0 Then
    srvname = WScript.Arguments(0)
Else
    srvname = InputBox("Enter the server Name", "Server", ".", VbOk)
    If srvname = "" Then
        MsgBox("Cancelled")
        Exit Sub
    End If
End If

Const adOpenStatic = 3
Const adLockOptimistic = 3
Dim i
' making the connection to your sql server
' change yourservername to match your server
Set conn = CreateObject("ADODB.Connection")
Set rs = CreateObject("ADODB.Recordset")
' this is using the trusted connection; if you use sql logins
' add username and password, but I would then encrypt this
' using Windows Script Encoder
conn.Open "Provider=SQLOLEDB;Data Source=" & _
    srvname & ";Trusted_Connection=Yes;Initial Catalog=Master;"
' The query goes here
sql = "select " & vbCrLf & _
    " t1.session_id, " & vbCrLf & _
    " t2.total_elapsed_time AS elapsed, " & vbCrLf & _
    " -- t1.request_id, " & vbCrLf & _
    " t1.task_alloc, " & vbCrLf & _
    " t1.task_dealloc, " & vbCrLf & _
    " -- t2.sql_handle, " & vbCrLf & _
    " -- t2.statement_start_offset, " & vbCrLf & _
    " -- t2.statement_end_offset, " & vbCrLf & _
    " -- t2.plan_handle," & vbCrLf & _
    " substring(sql.text, statement_start_offset/2, " & vbCrLf & _
    " CASE WHEN statement_end_offset<1 THEN 8000 " & vbCrLf & _
    " ELSE (statement_end_offset-statement_start_offset)/2 " & vbCrLf & _
    " END) AS runningSqlText," & vbCrLf & _
    " sql.text as FullSqlText," & vbCrLf & _
    " p.query_plan " & vbCrLf & _
    "from (Select session_id, " & vbCrLf & _
    " request_id, " & vbCrLf & _
    " sum(internal_objects_alloc_page_count) as task_alloc, " & vbCrLf & _
    " sum (internal_objects_dealloc_page_count) as task_dealloc " & vbCrLf & _
    " from sys.dm_db_task_space_usage " & vbCrLf & _
    " group by session_id, request_id) as t1, " & vbCrLf & _
    " sys.dm_exec_requests as t2 " & vbCrLf & _
    "cross apply sys.dm_exec_sql_text(t2.sql_handle) AS sql " & vbCrLf & _
    "cross apply sys.dm_exec_query_plan(t2.plan_handle) AS p " & vbCrLf & _
    "where t1.session_id = t2.session_id and " & vbCrLf & _
    " (t1.request_id = t2.request_id) " & vbCrLf & _
    " AND total_elapsed_time > " & MinimumMilliseconds & vbCrLf & _
    "order by t1.task_alloc DESC"

rs.Open sql, conn, adOpenStatic, adLockOptimistic
'rs.MoveFirst
pg = "<html><head><title>Top consuming queries</title></head>" & vbCrLf
pg = pg & "<table border=1>" & vbCrLf
If Not rs.EOF Then
    pg = pg & "<tr>"
    For Each col In rs.Fields
        pg = pg & "<th>" & col.Name & "</th>"
        c = c + 1
    Next
    pg = pg & "</tr>"
Else
    pg = pg & "Query returned no results"
End If
cols = c

dim filename
dim WshShell
set WshShell = WScript.CreateObject("WScript.Shell")
Set WshSysEnv = WshShell.Environment("PROCESS")
temp = WshShell.ExpandEnvironmentStrings(WshSysEnv("TEMP")) & "\"
filename = temp & filename

Dim fso, f
Set fso = CreateObject("Scripting.FileSystemObject")

i = 0
Dim c
Do Until rs.EOF
    i = i + 1
    pg = pg & "<tr>"
    For c = 0 to cols-3
        pg = pg & "<td>" & RTrim(rs(c)) & "</td>"
    Next
    'Output FullSQL and Plan Text to files, provide links to them
    filename = "topplan-sql" & i & ".txt"
    Set f = fso.CreateTextFile(temp & filename, True, True)
    f.Write rs(cols-2)
    f.Close
    pg = pg & "<td><a href=""" & filename & """>SQL</a>"
    filename = "topplan" & i & ".sqlplan"
    Set f = fso.CreateTextFile(temp & filename, True, True)
    f.Write rs(cols-1)
    f.Close
    pg = pg & "<td><a href=""" & filename & """>Plan</a>"
    'We could open them immediately, eg:
    'WshShell.run temp & filename
    rs.MoveNext
    pg = pg & "</tr>"
Loop

pg = pg & "</table>"
filename = temp & "topplans.htm"
Set f = fso.CreateTextFile(filename, True, True)
f.Write pg
f.Close

Dim oIE
SET oIE = CreateObject("InternetExplorer.Application")
oIE.Visible = True
oIE.Navigate(filename)
'Alternate method:
'WshShell.run filename

' cleaning up
rs.Close
conn.Close
Set WshShell = Nothing
Set oIE = Nothing
Set f = Nothing
End Sub

Main
I have a table which has a few fields, one being "datetime_traded". I need to write a query which returns the row with the closest time (down to the second) to a given date/time. I'm using MS SQL.
Here's what I have so far:
Code:
select * from TICK_D where datetime_traded = (select min( abs(datediff(second,datetime_traded , Convert(datetime,'2005-05-30:09:31:09')) ) ) from TICK_D)
But I get an error - "The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value.".
Does anyone know how I could do this? Thanks a lot for any help!
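A minimal sketch of one way to do it: order by the absolute difference in seconds and take the top row. The subquery in the original compares datetime_traded against a number of seconds, which can never match, and the ':' between date and time in '2005-05-30:09:31:09' is the likely cause of the conversion error.

Code:
select top 1 *
from TICK_D
order by abs(datediff(second, datetime_traded,
                      convert(datetime, '2005-05-30 09:31:09')));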
I have a stored procedure that will execute with less than 1,000 reads one time (with a specified set of parameters), then with a different set of parameters the procedure executes with close to 500,000 reads (according to Profiler). In comparing the execution plans, they are the same, except for the actual and estimated number of rows. When the proc runs with parameters that produce fewer than 1,000 reads, the actual and estimated number of rows equal 1. When the proc runs with parameters that produce reads near 500,000, the actual rows are approximately 85,000 and the estimated rows equal 1.

Then I run:

DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE

If I then reverse the order of execution by executing the procedure that initially executed with close to 500,000 reads first, the reads drop to less than 2,000. The execution plan shows the actual number of rows equal to 1, and the estimated rows equal to 2.27. Then when I run the procedure that initially executed with less than 1,000 reads, it continues to run at less than 1,000 reads, and the actual number of rows is equal to 1 and the estimated rows equal 2.27. When run in this order, there is consistency in the actual and estimated number of rows, and the reads for both executions with differing parameters are within reason.

Do I need to run DBCC DROPCLEANBUFFERS and DBCC FREEPROCCACHE on production and then ensure that the procedure that ran close to 500,000 reads is run first to ensure the proper plan, as well as using a KEEP PLAN option? Or what other options might you recommend? I am running SQL 2000 SP4.
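This reads like classic parameter sniffing: whichever parameter set compiles the plan first, that plan is reused for the other. Rather than flushing the entire cache in production, a minimal sketch of two gentler options on SQL Server 2000 (the procedure name is hypothetical):

Code:
-- Invalidate just this procedure's plan so the next call recompiles it:
EXEC sp_recompile 'dbo.YourProcedure';

-- Or recompile on every execution, trading CPU for plan stability:
-- CREATE PROCEDURE dbo.YourProcedure @p int
-- WITH RECOMPILE
-- AS ...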
Hello, I am doing a full-text search on MS SQL Server 2005 (indexed, with CONTAINS). For performance reasons I only show the first 200 rows that MSSQL finds ("select top 200 ..."). Is there any possibility of getting the estimated total number of all matching rows? I have heard this is possible in MSSQL Server: the server estimates how many rows containing that search word could be in the whole database. Google, for example, does the same thing. Is that true? What do I have to do to get this? Greetings and thanks, cpt.oneeye
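As far as I know, SQL Server 2005 does not expose a Google-style estimated match count through a documented function; the optimizer's row estimate is only visible in the query plan. A minimal sketch of the exact-count alternative, assuming a table dbo.Documents with a full-text indexed column body_text (hypothetical names):

Code:
DECLARE @searchword nvarchar(100);
SET @searchword = N'"widget*"';

SELECT COUNT_BIG(*) AS total_matches      -- exact count, runs as a second query
FROM dbo.Documents
WHERE CONTAINS(body_text, @searchword);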
When I generate an estimated execution plan from Management Studio, one of the things I often see in the execution plan generated is an 'Index Scan'. When I put my mouse over the 'Index Scan' graphic, I will see a window display with something called 'Output List' at the bottom of the window. Do I understand correctly that SQL Server will scan my index looking for values in each of the fields included in this output list?
Select 'PIT_ID' = CASE WHEN Best_BID_DATA.PIT_ID IS NOT NULL
                       THEN Best_BID_DATA.PIT_ID
                       ELSE Best_OFFER_DATA.PIT_ID END,
       Best_Bid_Data.Bid_Customer, Best_Bid_Data.Bid_Size, Best_Bid_Data.Bid_Price,
       Best_Bid_Data.Bid_Order_Id, Best_Bid_Data.Bid_Order_Version, Best_Bid_Data.Bid_ProductId,
       Best_Bid_Data.Bid_TraderId, Best_Bid_Data.Bid_BrokerId, Best_Bid_Data.Bid_Reference,
       Best_Bid_Data.Bid_Indicative, Best_Bid_Data.Bid_Park,
       Best_Offer_Data.Offer_Customer, Best_Offer_Data.Offer_Size, Best_Offer_Data.Offer_Price,
       Best_Offer_Data.Offer_Order_Id, Best_Offer_Data.Offer_Order_Version, Best_Offer_Data.Offer_ProductId,
       Best_Offer_Data.Offer_TraderId, Best_Offer_Data.Offer_BrokerId, Best_Offer_Data.Offer_Reference,
       Best_Offer_Data.Offer_Indicative, Best_Offer_Data.Offer_Park
from ( Select PITID PIT_ID, CustomerId Bid_Customer, Size Bid_Size, Price Bid_Price,
              orderid Bid_Order_Id, Version Bid_Order_Version, ProductId Bid_ProductId,
              TraderId Bid_TraderId, BrokerId Bid_BrokerId, Reference Bid_Reference,
              Indicative Bid_Indicative, Park Bid_Park
       From OrderTable C
       Where version = (select max(version) from OrderTable where orderid = c.orderid)
         and BuySell = 'B' and Status <> 'D' and Park <> 1
         and PitId in (select distinct pitid from MarketViewDef Where MktViewId = 4)
         and Price = ( Select max(Price) From OrderTable cc
                       where version = (select max(version) from OrderTable where orderid = cc.orderid)
                         and PitId = c.PitId and BuySell = 'B' and Status <> 'D' and Park <> 1 )
         and Orderdate = ( Select min(Orderdate) From OrderTable dd
                           where version = (select max(version) from OrderTable where orderid = dd.orderid)
                             and PitId = c.PitId and BuySell = 'B' and Status <> 'D'
                             and Price = c.Price and Park <> 1 )
         and OrderId = ( select top 1 OrderId from OrderTable ff
                         Where version = (select max(version) from OrderTable where orderid = ff.orderid)
                           and orderid = ff.orderid and PitId = c.PitId and BuySell = 'B' and Status <> 'D'
                           and Price = c.Price and Orderdate = c.Orderdate and Park <> 1 )
     ) Best_Bid_Data
full outer join
     ( Select PITID PIT_ID, CustomerId Offer_Customer, Size Offer_Size, Price Offer_Price,
              orderid Offer_Order_Id, Version Offer_Order_Version, ProductId Offer_ProductId,
              TraderId Offer_TraderId, BrokerId Offer_BrokerId, Reference Offer_Reference,
              Indicative Offer_Indicative, Park Offer_Park
       From OrderTable C
       Where version = (select max(version) from OrderTable where orderid = c.orderid)
         and BuySell = 'S' and Status <> 'D' and Park <> 1
         and PitId in (select distinct pitid from MarketViewDef Where MktViewId = 4)
         and Price = ( Select min(Price) From OrderTable cc
                       where version = (select max(version) from OrderTable where orderid = cc.orderid)
                         and PitId = c.PitId and BuySell = 'S' and Status <> 'D' and Park <> 1 )
         and Orderdate = ( Select min(Orderdate) From OrderTable dd
                           where version = (select max(version) from OrderTable where orderid = dd.orderid)
                             and PitId = c.PitId and BuySell = 'S' and Status <> 'D'
                             and Price = c.Price and Park <> 1 )
         and OrderId = ( select top 1 OrderId from OrderTable ff
                         Where version = (select max(version) from OrderTable where orderid = ff.orderid)
                           and orderid = ff.orderid and PitId = c.PitId and BuySell = 'S' and Status <> 'D'
                           and Price = c.Price and Orderdate = c.Orderdate and Park <> 1 )
     ) Best_Offer_Data ON Best_Bid_Data.Pit_Id = Best_Offer_Data.Pit_Id
I have found an execution plan with a significant difference between the actual and estimated number of rows (roughly actual/2 = estimated) in a non-clustered index seek. Statistics are updated.
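A minimal sketch of the usual next step when an up-to-date statistic still misestimates (object and index names are hypothetical): rebuild the statistic with a full scan rather than the default sample, then inspect the histogram for the skewed range.

Code:
UPDATE STATISTICS dbo.YourTable YourIndex WITH FULLSCAN;
DBCC SHOW_STATISTICS ('dbo.YourTable', 'YourIndex');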
There is a stored procedure that uses a linked server. As we will be migrating to the Amazon cloud, our architect has instructed us not to replace the linked server with OPENQUERY.
Sometimes when I do "alter database ABCD set partner failover" I get the following message: Nonqualified transactions are being rolled back. Estimated rollback completion: 100%.
In 99 percent of cases, after such a message the first attempt to use an open connection also raises an error such as "Exception: A transport-level error has occurred when sending the request to the server. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)"
After the first error all subsequent queries would run perfectly.
What can I do to make OnComplete and OnSuccess mean something in SQL Server 2000 DTS? I have a pretty simple package that imports data into a table from an XLS file, then runs an external console application which manipulates the data and drops it into a different table. That data is then used to create a CSV file. If I execute each step individually, the whole thing works. But what is happening is that the end file is being created before the console app is finished. I have a workflow line (On Complete) between the two processes, but that doesn't seem to mean anything. To run the external app I am just using an ActiveX Script task: CreateObject("WScript.Shell").Run "my file". Any advice?
I have a SQL Server 7 box that is shortly to be rebuilt completely (still on NT4, but with new RAID system), does anyone have any advice on how I can make the transition as painless as possible? Particularly, I want to maintain the backup, security and DTS structures as much as possible.
' The Partial modifier is only required on one class definition per project.
Partial Public Class StoredProcedures
    ''' <summary>
    ''' Create a result set on the fly and send it to the client.
    ''' </summary>
    <Microsoft.SqlServer.Server.SqlProcedure> _
    Public Shared Sub SendTransientResultSet()
        ' Create a record object that represents an individual row, including its metadata.
        Dim record As New SqlDataRecord(New SqlMetaData("stringcol", SqlDbType.NVarChar, 128))

        ' Populate the record.
        record.SetSqlString(0, "Hello World!")

        ' Send the record to the client.
        SqlContext.Pipe.Send(record)
    End Sub
End Class
Given this code, how do I add other SqlMetaData Columns to the statement:
Dim record As New SqlDataRecord(New SqlMetaData("stringcol", SqlDbType.NVarChar, 128) )
---
Like this???
Dim record As New SqlDataRecord(New SqlMetaData("stringcol", SqlDbType.NVarChar, 128), New SqlMetaData("otherstringcol", SqlDbType.NVarChar, 128))
My ASP.NET application is attached to a SQL database. I record only the time in my SQL database; the field type is nvarchar. Now I want to query by time and pull the result, but because the datatype of the field is nvarchar, the query does not give me the right result. Can someone tell me how to write the query so I can get a proper result? Thanks, maxmax
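A minimal sketch, assuming a table dbo.Activity with an nvarchar column time_logged holding values such as '14:35:00' (both names are hypothetical). Comparing nvarchar values compares text, which only works if every value has exactly the same zero-padded format, so convert to datetime before comparing.

Code:
SELECT *
FROM dbo.Activity
WHERE CONVERT(datetime, time_logged) >= CONVERT(datetime, '09:00:00')
  AND CONVERT(datetime, time_logged) <  CONVERT(datetime, '17:00:00');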
I have an ASP.NET application communicating with a SQL database. In the database I have a date field and a time field. Now I want to write a SQL query which can pull information for a particular date between a given start time and a given end time. Can someone show me a sample SQL query so I can pull information for a particular day between two times?
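A minimal sketch along the same lines as the previous example, assuming the date column is datetime and the time column is stored as text (table, column names and types are assumptions):

Code:
DECLARE @day datetime, @start datetime, @end datetime;
SET @day   = '2006-03-15';
SET @start = '09:00:00';
SET @end   = '17:00:00';

SELECT *
FROM dbo.Activity
WHERE event_date = @day                           -- the particular day
  AND CONVERT(datetime, event_time) >= @start     -- within the time window
  AND CONVERT(datetime, event_time) <  @end;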