ASP.NET Worker Process Runs Out Of Memory When Using A Large Dataset

Feb 27, 2006

Hi,

I'm running an application on a server which grabs data from a database table on another server using SqlConnection, SqlDataAdapter and DataSet.

The application then updates every row in that DataSet's DataTable, and the updates are saved back through the DataAdapter. The code is pretty much the straightforward code you would find in the MSDN documentation for using DataSets. The table contains a little over a million rows.

When I run the application, I get an error saying the Server Application is not available. Upon looking into the application event log, I get this message.

aspnet_wp.exe was recycled because memory consumption exceeded the 306 MB (60 percent of available RAM)

How do I get around this? I thought DataSets were supposed to handle large DataTables comfortably without memory issues.

-Thanks
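
A common workaround, sketched below with hypothetical table and column names (dbo.Items, Id, Price): instead of filling a DataSet with the whole million-row table inside aspnet_wp.exe, stream the rows with a SqlDataReader and write each change back through a parameterized UPDATE, so only one row is in memory at a time.

using System.Data.SqlClient;

class StreamingUpdater
{
    static void Main()
    {
        // Two connections: one streams rows out, the other writes updates back.
        using (var read = new SqlConnection("Server=serverB;Database=AppDb;Integrated Security=SSPI"))
        using (var write = new SqlConnection("Server=serverB;Database=AppDb;Integrated Security=SSPI"))
        {
            read.Open();
            write.Open();

            var select = new SqlCommand("SELECT Id, Price FROM dbo.Items", read);
            var update = new SqlCommand("UPDATE dbo.Items SET Price = @p WHERE Id = @id", write);
            var price = update.Parameters.Add("@p", System.Data.SqlDbType.Money);
            var id = update.Parameters.Add("@id", System.Data.SqlDbType.Int);

            using (var rows = select.ExecuteReader())
            {
                while (rows.Read())                           // one row in memory at a time
                {
                    id.Value = rows.GetInt32(0);
                    price.Value = rows.GetDecimal(1) * 1.05m; // whatever the per-row change is
                    update.ExecuteNonQuery();
                }
            }
        }
    }
}

SqlDataAdapter.Fill is convenient, but it materializes every row before any work starts; the reader keeps the worker process's memory flat regardless of table size.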




Merge Replication IIS Worker Process Error

Feb 25, 2006

Every day between 18:00 and 20:00, nearly 1,000 PDA subscribers synchronise anonymously via merge replication, and at least twice we get this error:

IIS Worker Process
Faulting application w3wp.exe, version 6.0.3790.1830,
faulting module sscerp20.dll, version 2.0.7331.0,
fault address 0x000110f4.


And any subscriber that is synchronising at the time becomes suspect.


Can someone offer a suggestion as to the cause of, and correction for, this error?


Thanks,


Hakan G


Here are some details about our system:


Client Side
OS: Windows Mobile 2003 4.21.1088
DB: SQL CE 2.0
Microsoft SQL Server CE (ssce20.dll) 2.00.4415.0
Microsoft SQL Server CE Client Agent (ssceca20.dll) 2.00.4415.0
Development Tools: VB.NET 2003
Service Pack: .NET Compact Framework 1.0 SP3


Server Side
OS: Microsoft Windows Server 2003 SP1
Internet Information Services (INETINFO.EXE) 6.0.3790.1830
(srv03_sp1_rtm.050324-1447)
IIS Worker Process (w3wp.exe) 6.0.3790.1830 (srv03_sp1_rtm.050324-1447)
HW: IBM xSeries 346, Intel(R) Xeon(TM) CPU 3.60GHz (2 CPUs), 5.00 GB RAM
DB: SQL CE 2.0
DB: SQL Server Standard Edition 8.00.2039 (SP4)

SQL CE Server 2.0
Microsoft SQL Server CE Server Agent (sscesa20.dll) 2.00.7331.0
Microsoft SQL Server CE Replication Provider (sscerp20.dll) 2.00.7331.0


Merge Replication Properties
-----------------------------
status : 1
retention : 21
sync_mode : 1
allow_push : 1
allow_pull : 1
allow_anonymous : 1
centralized_conflicts : 1
priority : 100.0
snapshot_ready : 1
publication_type : 1
enabled_for_internet : 0
dynamic_filters : 1
has_subscription : 0
snapshot_in_defaultfolder : 1
alt_snapshot_folder : NULL


Merge Agent Profile:
parameter_name value
-------------------------------- ---------
-BcpBatchSize 100000
-ChangesPerHistory 100
-DestThreads 4
-DownloadGenerationsPerBatch 500
-DownloadReadChangesPerBatch 500
-DownloadWriteChangesPerBatch 500
-FastRowCount 1
-HistoryVerboseLevel 1
-KeepAliveMessageInterval 300
-LoginTimeout 15
-MaxDownloadChanges 0
-MaxUploadChanges 0
-MetadataRetentionCleanup 1
-NumDeadlockRetries 5
-PollingInterval 60
-QueryTimeout 300
-SrcThreads 3
-StartQueueTimeout 300
-UploadGenerationsPerBatch 100
-UploadReadChangesPerBatch 100
-UploadWriteChangesPerBatch 100
-Validate 0
-ValidateInterval 60


Integration Services :: Perform Lookup On Large Dataset Based On A Small Dataset

Oct 1, 2015

I have a small number of rows in a dataset, Table 1.  There is a CLOB on a large dataset, Table 2.  They join on a PK.  I would like to retrieve this CLOB and add it to the data flow for Table1.  In short I want to emulate the following:

Table 1:  Small table without CLOB, 10 rows. 
Table 2: Large table with CLOB, 10,000,000 rows

select CLOB
from table2
where pk = (select pk from table1)

I want this to return the CLOBs for the small number of rows in Table 1. The PK is indexed, obviously, so it should be a fast lookup.

Table 1 and Table 2 live on different Oracle databases. How do I perform this operation efficiently in SSIS? It seems the Lookup and Merge Join won't do this.
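
One way to emulate that query, sketched outside SSIS (the Oracle provider string is an assumption; table and column names follow the post): issue one parameterized point lookup per Table 1 key, the same pattern a script component or a Lookup in partial-cache mode follows, rather than caching the 10,000,000-row table.

using System.Collections.Generic;
using System.Data.OleDb;

class ClobLookup
{
    static void Main()
    {
        var keys = new List<int> { 101, 205, 307 }; // the handful of PKs from Table 1

        using (var conn = new OleDbConnection(
            "Provider=OraOLEDB.Oracle;Data Source=ORA2;User Id=user;Password=pass"))
        {
            conn.Open();
            var cmd = new OleDbCommand("SELECT clob_col FROM table2 WHERE pk = ?", conn);
            var pk = cmd.Parameters.Add("pk", OleDbType.Integer);

            foreach (var key in keys)
            {
                pk.Value = key;
                // One indexed point lookup per key; only ten CLOBs ever come back.
                var clob = (string)cmd.ExecuteScalar();
                // ... attach clob to the matching Table 1 row / data flow buffer here
            }
        }
    }
}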


SQL In-Memory :: How To Reduce Memory Usage Without Killing Any Process

Aug 28, 2015

I have a Windows Server 2012 machine with SQL Server 2012 Enterprise. RAM size is 22 GB. Sometimes SQL Server takes 95% of memory. My question: how can I reduce memory usage without killing any process, since it's a production server with many background processes running? And is there any guide to learn why memory rises so high and how to reduce it?
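
The usual answer, sketched below, is to cap 'max server memory' so SQL Server trims its buffer pool online and no process has to be killed. The 16384 MB cap is an assumed starting point for a 22 GB box, not a recommendation, and the same statements can simply be run from Management Studio.

using System.Data.SqlClient;

class CapSqlMemory
{
    static void Main()
    {
        // Leaves roughly 6 GB for the OS and the background processes.
        const string sql = @"
            EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
            EXEC sp_configure 'max server memory (MB)', 16384; RECONFIGURE;";

        using (var conn = new SqlConnection("Server=.;Database=master;Integrated Security=SSPI"))
        {
            conn.Open();
            new SqlCommand(sql, conn).ExecuteNonQuery();
        }
    }
}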


MDX Query Runs Out Of Memory

Jan 15, 2008

Hi,


I am running the following MDX query through a DataReader and an ADO.NET connection.

SELECT
{[Measures].[Deuda Total Nacional],
[Measures].[Deuda Total Nacional Maximo],
[Measures].[Cupo Nacional],
[Measures].[Porcentaje Utilizacion Maximo],
[Measures].[Pago Minimo Estado Cuenta],
[Measures].[Deuda Ultima Facturacion],
[Measures].[Dias Mora],
[Measures].[Dias Mora Maximo]
}
ON COLUMNS
,[Cuenta].[Cuenta].[Cuenta]*[Cuenta].[Rut].[Rut]*[Cuenta].[Dv].[Dv] ON ROWS

FROM [Bd Rtd]
WHERE [Tiempo].[Mes].&[2007-09-01T00:00:00]

The thing is, when I have about 10 thousand rows it runs in about 50 seconds, which is good, but when I run this query after processing the cube with 100 thousand rows, it runs out of memory and crashes.

I'm working on a shared development server with 1 GB of memory for my project.

Is there any way to make it run anyway, even if it has to swap?


thanks


By the way when this thing goes into production it will have 1.5 million rows


Clean Up Data - When Process Runs Next Time

Jul 25, 2006



In an integration project I am moving data from A to B.

First time is fine - since table B is empty.

However, the next time I run the process I would like to delete all records in B before running the project again.

What is the best way to delete / clean up data when you re-run the process?

Cheers, T
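
A common answer, as a minimal sketch (connection details assumed, table name B from the post): run a TRUNCATE, or a DELETE where foreign keys get in the way, as a step ahead of the load, for example an Execute SQL Task at the start of the package.

using System.Data.SqlClient;

class ResetTargetTable
{
    static void Main()
    {
        using (var conn = new SqlConnection("Server=.;Database=TargetDb;Integrated Security=SSPI"))
        {
            conn.Open();
            // TRUNCATE is minimally logged and resets identity values;
            // use "DELETE FROM dbo.B" instead if other tables reference B.
            new SqlCommand("TRUNCATE TABLE dbo.B", conn).ExecuteNonQuery();
        }
    }
}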

Foreach Loop Runs Out Of Memory

Apr 2, 2007

Hi



I have a Foreach Loop that steps through an ADO recordset (approx. 5,000 rows); this passes two variables to an SQL statement that populates a second recordset (normally 8 to 10 rows). I use the second recordset in a data flow task with a simple script that returns approximately 30 rows for inclusion in my destination table. The package runs OK for a while, although the loop appears to execute slowly, then the message below is constantly repeated in the debug window.


[DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 174 buffers were considered and 174 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.
I have 2 GB of virtual memory on my machine, and the recordsets are relatively small. Have I missed a setting somewhere?


Dataset Query Runs Fine In VS, But Not On The Reports Server

Mar 28, 2008

I have created a lot of reports using this technique, but this is the first one that doesn't work. As there is absolutely nothing special about it, I can't figure out what the issue is.

I have one dataset that uses parameters chosen from two other dataset results. One dataset result runs just fine and returns values in reporting services, but the other returns blank in reporting services. In Visual Studio, both datasets return data.

I cannot for the life of me figure out what I've done that gives this result, as I've never encountered it before and use this method of parameterization quite frequently.

The dataset in question doesn't even join any tables; it's a direct select distinct field1 from tableA.

I'm running SQL2005, SP2.

Thanks for any advice.

Margaret


Visual Studio 2005 Runs Out Of Memory When Trying To Use SSIS Package

Jul 12, 2006

Visual Studio runs out of memory when trying to use an SSIS package. I am trying to create and run an SSIS package that validates and imports some large XML files (>200 MB). Validation fails because Visual Studio cannot open large files without running out of memory.

The SSIS package throws this error when I run the package, at the validation task.

Error: 0xC002F304 at Validate bio_fixed, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".

How do I increase the amount of RAM that Visual Studio can use? I have plenty of RAM on my workstation (>3 GB), but VS chokes on files of maybe around 100 MB.

Thanks,

Forrest
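
A hedged alternative to validating inside the XML Task (file and schema paths below are made up): stream-validate the file with XmlReader, which pulls it through the validator node by node and never holds the whole document in memory, then hand it to the rest of the package only once it passes.

using System;
using System.Xml;
using System.Xml.Schema;

class StreamValidate
{
    static void Main()
    {
        var settings = new XmlReaderSettings();
        settings.ValidationType = ValidationType.Schema;
        settings.Schemas.Add(null, @"C:\schemas\bio_fixed.xsd");
        settings.ValidationEventHandler += (s, e) => Console.WriteLine(e.Message);

        using (var reader = XmlReader.Create(@"C:\data\bio_fixed.xml", settings))
        {
            while (reader.Read()) { } // stream the 200 MB file through the validator
        }
    }
}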

A Connection Was Successfully Established With The Server, But Then An Error Occurred During The Login Process. (provider: Shared Memory Provider, Error: 0 - No Process Is On The Other End Of The Pipe.)

Apr 7, 2008

I'm going nuts with the SQL Server notification thing. I have gone through this article, which tells how to set up the user: http://www.codeproject.com/KB/database/SqlDependencyPermissions.aspx. The article shows how to create a new user and set it up for SQL Server notifications. But in my case the user already existed in the database, which is a very common scenario. So I did the following (check the SQL script below), but then I get this error:
"A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)"
This is my SQL script:

use [master]
GO

-- Ensure that Service Broker is enabled
ALTER DATABASE [DatabaseName] SET ENABLE_BROKER
GO

-- Switch to our database
use [DatabaseName]
GO

CREATE SCHEMA schemaname AUTHORIZATION username
GO

ALTER USER username WITH DEFAULT_SCHEMA = schemaname
GO

/*
 * Creating two new roles. We're not going to set the necessary permissions
 * on the user accounts; we're going to set them on these two new roles.
 * At the end of this script, we simply make our two users members of these roles.
 */
EXEC sp_addrole 'sql_dependency_subscriber'
EXEC sp_addrole 'sql_dependency_starter'

-- Permissions needed for [sql_dependency_starter]
GRANT CREATE PROCEDURE to [sql_dependency_starter]
GRANT CREATE QUEUE to [sql_dependency_starter]
GRANT CREATE SERVICE to [sql_dependency_starter]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_starter]
GRANT VIEW DEFINITION TO [sql_dependency_starter]

-- Permissions needed for [sql_dependency_subscriber]
GRANT SELECT to [sql_dependency_subscriber]
GRANT SUBSCRIBE QUERY NOTIFICATIONS TO [sql_dependency_subscriber]
GRANT RECEIVE ON QueryNotificationErrorsQueue TO [sql_dependency_subscriber]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_subscriber]

-- Make sure my users are members of the correct roles.
EXEC sp_addrolemember 'sql_dependency_starter', 'username'
EXEC sp_addrolemember 'sql_dependency_subscriber', 'username'


Outlook Runs Very Slow Frequently Non-response, SQL Server Using Up To 1GB Memory - VISTA && Office 2007

May 14, 2008

When I launch Outlook, it takes forever for the program to finally open. With any inbound email, it stops processing whatever is underway at the time, and frequently there is a 2-3 second lag between keyboard input and what appears on the screen. SQL Server is usually consuming upwards of 1 GB of memory. Help! Mike


Question On Large Volume Of Training Dataset

May 10, 2007

Hi, all experts here,

Thanks a lot for your kind attention.

I have a question on training a large volume of data. In this case the training will take a long while to complete; is there anything we can do to improve that? I know we obviously can't split the training dataset into different smaller datasets. What can we do to improve this?

Hope my question is clear for your help.

Thank you very much in advance for your advices and help and I am looking forward to hearing from you shortly.

With best regards,

Yours sincerely,


Reporting Services :: Summarizing A Large Dataset

Apr 17, 2015

LOCALID - POSTCODE - GPCODE
PTO1395164 - DN34 1AB - G9999981
PTO1395164 - DN34 1AB - G9999981
PTO1395164 - DN34 1AB - G8909058
PTO1395164 - DN34 1AB - G8909058
PTO1395164 - DN34 1AB - G8909058
PTO1395164 - DN34 1AB - G8909058
PTO1395164 - DN34 1AB - G8909058
PTO1395164 - DN34 1AB - G8909058
PTO1395164 - DN34 1AB - G8909058
PTO1395164 - DN34 1AB - G8909058
PTO1395164 - TZ14 2AX - G8909058
PTO1395164 - TZ14 2AX - G8909058

The sample data above shows 1 customer with multiple episodes (different attend dates – not important here), during the course of these attendances they moved home and moved GP practice.

Is there a simple way in Access to show a summary of this, e.g. PTO1395164 = 2 postcodes, 2 GPs?

The ultimate aim would be to identify where a customer has changed postcode or GP within a selected timeframe and disregard the rest.


Process Memory Has Been Paged Out

Jun 9, 2008

Hi all

I frequently see the following message on SQL Server log

2008-06-09 07:46:18.17 spid3s A significant part of sql server process memory has been paged out. This may result in a performance degradation. Duration: 0 seconds. Working set (KB): 1079156, committed (KB): 17156388, memory utilization: 6%.

What does it indicate, and what appropriate action has to be taken to fix it?

The database runs on

SQL 2005 Dev 64-bit SP2 9.00.3042.00
Win 2003 standard x64 SP2 16GB RAM

Thanks.


Process Memory Usage

Feb 21, 2007

I am using a tool to monitor SQL Server and Windows. It is warning me that:

Process 1004:services has a virtual address space of 1,846.20 MB. This is close to the Windows two gigabyte address space limit.

When I locate process 1004, it shows 15 threads whose elapsed time is 1d 3hrs. The thread state is Waiting, and the thread wait reason is "Waiting for an Execution Delay to be resolved".

I think that 1d, 3hrs is from the time I rebooted my server.

Should I take any action? How?


Low Process Speed Large Scale Storage Question...

Nov 2, 2006

Hi,

I'm implementing procedures that process a mass of information stored in a SQL database. What methods and actions do I need to take for the process to be faster in a case like this (it's more than 1,000,000 rows)? I'm also trying to improve memory usage, since I use the DataTable in C#, and I'm looking for a better way to process the retrieved data. Is there a better class or method that improves the speed and prevents memory leaks? Please advise.

Thanks for any help,
Lior S ;)


SQL Server 2012 :: Purge Process On A Large Table

Jan 9, 2014

I am attempting to do a rather simple purge task on a very large table. This task will need to take place daily and delete records older than 6 months from the database. On the first pass this will delete well over 130 million rows. I thought the best way to handle this is to create a proc and call the proc from a SQL Agent job that runs nightly. Here is an example of the script:

CREATE PROCEDURE usp_Purge_WCFLogger
AS
SET NOCOUNT ON
EXEC sp_rename 'dbo.logs', 'logs_work'
GO
SELECT * INTO dbo.Logs_Backup FROM dbo.Logs_Work WHERE TIMESTAMP < DATEADD(month, -6, GETDATE())

[Code] .....
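
An alternative sketch to the rename-and-copy approach (connection details assumed; table and column names from the post): delete in bounded batches, so the first 130-million-row pass never runs as one giant transaction and the log and locks stay small. SQL Server 2012 supports DELETE TOP (n) directly.

using System.Data.SqlClient;

class PurgeOldLogs
{
    static void Main()
    {
        const string batch = @"
            DELETE TOP (50000) FROM dbo.Logs
            WHERE [TIMESTAMP] < DATEADD(month, -6, GETDATE());";

        using (var conn = new SqlConnection("Server=.;Database=AppDb;Integrated Security=SSPI"))
        {
            conn.Open();
            var cmd = new SqlCommand(batch, conn) { CommandTimeout = 0 };
            while (cmd.ExecuteNonQuery() > 0) { } // repeat until nothing old is left
        }
    }
}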


Memory Increasing For The SQLServer Process

Feb 3, 2006

Our techs informed me that they are getting reports of a system slowdown. When they look, they find sqlservr.exe has lots of memory allocated to it. They reboot the server, and then it runs okay for a few weeks. They tell me this just started happening recently.

SQLServer itself has not been touched in months. They are, however, starting to use one of the databases heavier.

I found a setting where you can set max_server_memory. Any problems if I set this to a value?


SQL Server Process And Memory Usage

Jul 23, 2005

Hi All

Some of my SQL Servers are experiencing high memory usage.

1. How can I detect which process causes the big memory usage and why it is not released?
2. Which SQL Server components are in this memory, and what is their usage distribution?

Any help will be appreciated.

Thanks
Willie


Sql Server Process Memory Question

Feb 15, 2007

I got a Small Business Server 2003 running. It has two SQL Server processes. One of them is growing by 200 MB every day. Does anyone have a clue about this? The machine is serving as a print server, file server and Exchange server. There is no specific use of SQL Server. The antivirus is McAfee.


Sql Server Process Memory Has Been Paged Out

May 20, 2008


Hi All

I see the following message in the SQL Server logs. What does this indicate? What should I do to avoid it?


2008-05-20 01:25:02.12 spid2s A significant part of sql server process memory has been paged out. This may result in a performance degradation. Duration: 0 seconds. Working set (KB): 33920, committed (KB): 15142988, memory utilization: 0%.

The server configuration is

SQL 2005 Dev edition SP2 64-bit
Win 2003 R2 SP2 Standard x64 edition
RAM size is 16 GB

Thanks.


SQL Server Is Occupying Large Memory Space

Oct 3, 2001

In an intranet application using Windows NT, Apache, Tomcat and SQL Server, the memory space used by SQL Server is drastically increasing, and finally the system crashes. Nearly 40 people are accessing the system. The hardware configuration is a P2 processor with 393 MB RAM and 2 GB virtual memory. SQL Server, the web server and the servlet engine are running on the same machine.
Within three hours, SQL Server occupies 200 MB of memory, the system performance comes down, and finally the system stops the Tomcat servlet engine.
Anybody have any idea on this? We have nearly 1500 JSP pages, 200 bean files and 300 tables in SQL Server.


TableDiff Out Of Memory Exception On Large Tables.

Sep 20, 2007

Hello,
I hope I am posting this in the right forum.

I am using tableDiff.exe to create a diff SQL script for a very large table (~4 million rows).


After a few minutes, I receive a "System.OutOfMemoryException".

I have 4GB of ram on the machine executing the table diff.
The server is 32-bit, so adding ram is not an option.

I am executing the following command line:


TableDiff.exe" -sourceserver "SERVER" -sourcedatabase "SourceDB" -sourcetable "Table1" -destinationserver "SERVER" -destinationdatabase "DestDB" -destinationtable "Table1" -f "C:TableDiffsTable1"

I have seen reports of other users executing tableDiff against 2-million-row tables.

Is there any way to buffer tableDiff so that I do not run out of memory on the server?

Could anything else be causing this error?

Thanks,
Dave


Interrupt Processing Of A Large Insert Process In SQL Server 2000.

Jul 20, 2005

I'm running a resource-intensive stored procedure which reads a file with about 50,000 lines with a BULK INSERT into a temp table, then goes through it and inserts a record for each line into another table. While this procedure is running, SQL Server stops accepting any other requests coming from the website.

Question: Is there a way to make SQL Server "listen", or emulate an "interrupt" to other requests, while in the middle of a long intensive process?

I really appreciate your replies. Thank you, Oleg.
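
SQL Server cannot be interrupted in the middle of a statement, but the load can be committed in smaller transactions so other requests interleave between batches. A minimal sketch, with an assumed table name and file path, using the BATCHSIZE option of BULK INSERT:

using System.Data.SqlClient;

class BatchedBulkLoad
{
    static void Main()
    {
        // BATCHSIZE commits every 5,000 rows as its own transaction, so locks
        // are released between batches and website requests can get through.
        const string sql = @"
            BULK INSERT dbo.StagingLines
            FROM 'C:\loads\lines.txt'
            WITH (BATCHSIZE = 5000);";

        using (var conn = new SqlConnection("Server=.;Database=AppDb;Integrated Security=SSPI"))
        {
            conn.Open();
            var cmd = new SqlCommand(sql, conn) { CommandTimeout = 0 };
            cmd.ExecuteNonQuery();
        }
    }
}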


SQL 2005 Process Growing Very Large Working With Visual Basic 6

Sep 25, 2006

We recently installed SQL server 2005 on a couple of our servers.  I use Visual Basic 6.0 at the moment and use ADO to connect to our various SQL servers.

I recently discovered on one of the new servers that every time my program runs (every 4 minutes, 12 hours a day), the SQL process shown in Task Manager grows by 1-10 MB.

The SQL process was at 776,912K when I rebooted this afternoon.  It started back up at 106,120K.

I am not doing anything differently than I did when my programs were talking to SQL 2000, and I have never seen this memory leak issue.  Is there something extra I need to do in SQL 2005 to finish/clear these SQL queries and not bog down SQL's memory?

An example of how I would connect and do a SQL transaction:

Dim cn as ADODB.Connection

Dim rs as ADODB.RecordSet


Set cn = New ADODB.Connection

Set rs = New ADODB.Recordset  ' a Recordset, not a second Connection

cn.Open strConnect

select1 = "select firstName, lastName from clients"
rs.Open select1, cn, adOpenKeyset, adLockOptimistic

If rs.EOF = False Then

    rs.AddNew

End If


rs!firstName = Trim(Text1(0))
rs!lastName = Trim(Text1(1))

rs.Update

rs.Close
cn.Close

At the end of the program's run I would:

Set cn = Nothing

Set rs = Nothing


Changing The DataType Of A Coluumn In An In-Memory Dataset

Apr 11, 2005

Here is the issue: I have read-only access to a database. All of the columns are set to NVARCHAR(1000) by default, and I cannot change them. I want to load the DataSet into memory and change the DataType of the columns from NVARCHAR(1000) to INT. The data is integer (i.e. 4, 5, 123) but is stored, and comes across, as strings. The charting software I am using won't implicitly convert these strings to Int or Double. How can I change an entire column to Int?
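
A standard workaround, sketched below: ADO.NET does not allow changing a DataColumn's DataType once the table has rows, so clone the schema, retype the clone's column, and import the rows; the numeric strings are converted as they are copied.

using System.Data;

class RetypeColumn
{
    // columnName is whichever NVARCHAR(1000) column holds the numbers.
    static DataTable RetypeToInt(DataTable source, string columnName)
    {
        DataTable clone = source.Clone();       // copies the schema, no rows
        clone.Columns[columnName].DataType = typeof(int);

        foreach (DataRow row in source.Rows)
            clone.ImportRow(row);               // "123" becomes 123 on import

        return clone;
    }
}

The charting software can then bind to the retyped clone instead of the original table.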


SQL 2012 :: Server Process Memory Has Been Paged Out

Apr 1, 2015

In my SQL Server error log, I see the error below. The system has 8 GB of RAM with enough free RAM; is there something I can do to prevent this alert? (Note: I have no min/max memory set on this instance.)

A significant part of sql server process memory has been paged out. This may result in a performance degradation. Duration: 328 seconds. Working set (KB): 76896, committed (KB): 167628, memory utilization: 45%.


SQL Server 2008 :: Large Binary Dataset - Database Or File System?

Jun 2, 2015

I have a well-structured but also very large binary data set that is generated by a C++ application every five minutes. The data needs to be accessed by SQL applications. Since data is generated every five minutes, performance is key, both for write and read. The data set is about 500 MB.

If data is written to the file system, the write performance doesn't involve SQL Server. For reading it, I have a CLR to read the portions of the data that I need based on offset and length. That works and is very fast. The problem is that the data is stored in the file system, so it is not self-contained within the database.

A second option that I haven't explored yet is to write the data into a table as VARBINARY(MAX). I would read the data using SUBSTRING with the appropriate offset and length. I'd like to know about the performance of SQL write/read of binary data of this size, and whether there is a third option I haven't thought of. I'm using SQL Server 2014.
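
A minimal sketch of that second option (table and column names are assumptions): one VARBINARY(MAX) row per five-minute snapshot, read back by offset and length with SUBSTRING so only the requested slice leaves the server.

using System.Data.SqlClient;

class BlobSlice
{
    static byte[] ReadSlice(SqlConnection conn, int id, long offset, int length)
    {
        var cmd = new SqlCommand(
            "SELECT SUBSTRING(Payload, @offset, @length) " +
            "FROM dbo.Snapshots WHERE Id = @id", conn);
        cmd.Parameters.AddWithValue("@offset", offset + 1); // SUBSTRING is 1-based
        cmd.Parameters.AddWithValue("@length", length);
        cmd.Parameters.AddWithValue("@id", id);
        return (byte[])cmd.ExecuteScalar(); // only the slice crosses the wire
    }
}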


DTSX Package Continues To Throw Errors When Working With Large Dataset.

Jun 7, 2006

I have a dataset of between 40-50K records that has to go through a pre-defined process. SSIS works just fine with smaller sets, even up to 20K, but this job keeps blowing up, saying something along the lines of "cannot write to recordset destination". Does this make sense to anyone? The server is a 2-processor machine with 2 GB of RAM. Physical memory usage spikes to about 1.6 GB during the run, but the processor never really gets above 30% usage. Does this product just not scale yet?


Sizing A Pagefile On Servers With Large Amounts Of Memory

Sep 19, 2007

I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory. However, if you're talking about a server with lots of memory, such as 16 GB or 32 GB, would following this rule be unnecessary? With SQL 2000 running on Windows 2000 Server or Windows Server 2003, I typically see pagefile usage of no more than 12% for a 2 GB pagefile. Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed. If I have 32 GB of physical memory and make the pagefile only 1.5 x 32 GB, I have a 48 GB pagefile. 10% of this is 4.8 GB, which I would hope I never see consumed.

Any thoughts?

Thanks, Dave


Casting An In-memory Resultset To A DataSet For Use In XML File Generation

Sep 13, 2006

I am trying to use the results of a query to build an XML file (this will eventually be data from a number of RDBMS's and will undergo transformations and unions prior to being saved as a package variable).

I have saved the results of my query to a result set variable (ADOResultSet) using the Recordset Destination.

I then try to use a script task to read the variable and create an XML file. I get a runtime error: Unable to cast COM object of type 'System.__ComObject' to class type 'System.Data.DataSet'.

Public Sub Main()

'MsgBox(Dts.Variables("ADORecordSet").Value.ToString)

Dim mySet As DataSet

mySet = (CType(Dts.Variables("ADORecordSet").Value, DataSet))

mySet.WriteXml("C: est.xml", XmlWriteMode.WriteSchema)

Dts.TaskResult = Dts.Results.Success

End Sub

I am very unfamiliar with Visual Basic and am unsure of how to cast the COM object to a DataSet. Do I need to marshal the COM object?
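
A hedged fix, shown in C# (the VB.NET equivalent is nearly identical): the Recordset Destination stores a COM ADODB.Recordset, not a DataSet, which is why the direct cast fails. OleDbDataAdapter.Fill has an overload that accepts the COM recordset and marshals its rows into a DataTable; the project needs a reference to the ADODB interop assembly.

using System.Data;
using System.Data.OleDb;

class RecordsetToDataSet
{
    static DataSet Convert(object adoRecordset) // the value of Dts.Variables("ADORecordSet")
    {
        var table = new DataTable("Results");
        new OleDbDataAdapter().Fill(table, adoRecordset); // marshals the COM rows

        var set = new DataSet();
        set.Tables.Add(table);
        return set;
    }
}

After converting, WriteXml works as in the post.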

Thanks


Process Or SQL Transaction Takign Memory And Processor Time

Mar 22, 2002

Hi,

While watching Performance Monitor, the processor time often goes high, well above the memory usage.

Could you please tell me how to find out which process is doing that?

Thanks
John Jayaseelan


A Significant Part Of Sql Server Process Memory Has Been Paged Out

Jul 26, 2007

On a SQL Server 2005 x64 Standard Edition cluster I get the error listed below and then the SQL server service restarts. The SQL server is unavailable for 5-10 minutes during that time. Any ideas?

Error:
A significant part of sql server process memory has been paged out. This may result in a performance degradation. Duration: 647 seconds. Working set (KB): 11907776, committed (KB): 28731732, memory utilization: 41%.







