SQL Server 2012 :: Processing Of XML With OPENROWSET Gives Out Of Memory Exception

Jun 20, 2014

I have an XML file of about 25 MB. When I read it with OPENROWSET it gives me 'System.OutOfMemoryException'. The machine I'm running on has 16 GB of RAM and memory is definitely not exhausted. When I run smaller XML files it works fine.

I've read that older versions of SQL Server had this problem because the parser had a limited amount of memory. Is this still the case? Is there a way to change this?
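
For reference, a minimal sketch of the kind of load being described (the file path and XML shape are hypothetical); in many cases the shredding step, rather than the raw file size, is what drives the memory use:

DECLARE @xml XML;

-- Read the whole file in one shot (the file path is a placeholder)
SELECT @xml = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK 'C:\Data\BigFile.xml', SINGLE_BLOB) AS x;

-- Shredding every node of a large document in one pass is typically where memory spikes
SELECT n.value('(Id)[1]', 'int') AS Id
FROM @xml.nodes('/Root/Item') AS t(n);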

View 7 Replies



SQL Server Admin 2014 :: In-Memory Processing - Existing Tables

Nov 11, 2014

Is there a method of forcing existing tables into the in-memory filegroup so the table data can benefit from in-memory processing?
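
In SQL Server 2014 an existing disk-based table cannot be converted in place, so the usual approach is to create a memory-optimized copy and move the data into it. A rough sketch, with hypothetical database, file and table names:

ALTER DATABASE MyDb ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;
ALTER DATABASE MyDb ADD FILE (NAME = 'imoltp_data', FILENAME = 'C:\Data\imoltp_data')
    TO FILEGROUP imoltp_fg;
GO

-- New memory-optimized copy of the table (schema shown is illustrative only)
CREATE TABLE dbo.MyTable_InMem
(
    Id   INT          NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    Col1 VARCHAR(100) NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO

-- Copy the existing data across, then swap the names
INSERT INTO dbo.MyTable_InMem (Id, Col1)
SELECT Id, Col1 FROM dbo.MyTable;

EXEC sp_rename 'dbo.MyTable', 'MyTable_old';
EXEC sp_rename 'dbo.MyTable_InMem', 'MyTable';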

View 7 Replies View Related

SQL Server Admin 2014 :: In-Memory Previous Transaction Aborted Exception

Sep 21, 2015

I'm working on a large scale project that is currently in production. We have a big process that recently changed to use In-Memory Tables with SQL 2014 for performance efficiency.

The Process uses:

51 In-Memory SQL Tables.
50 Stored Procedures (not natively compiled) that load data (INSERT) from about 150 regular tables and In-Memory tables.
300 Validations (short stored procedures, not natively compiled) selecting from those 50 In-Memory tables (and inserting into an In-Memory table that saves the validation errors, if any exist).

At the end of this process we clean the tables of the data relevant to each process (DELETE FROM ... WHERE).

By the way, no UPDATE STATISTICS is run on the In-Memory tables - when we tested it, the process slowed down and caused some locks.

We call this process from ADO.NET: the load stored procedures run first and then the validations, and each SP uses a different SQL connection. In normal use, everything works fine and takes about 1.5 seconds.

Under a stress test (6 clients x 100 tasks) running for 30 minutes, after several minutes we start to get this SQL exception (about 1 exception for every 20 tasks):

41301. A previous transaction that the current transaction took a dependency on has aborted, and the current transaction can no longer commit.

Transactions in Memory-Optimized Tables

The Exception is not clear. We are not using BEGIN TRANSACTION in the process. The SQL Exception occurs in different stored procedures each time.
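
Error 41301 is one of the retryable In-Memory OLTP conflict errors; even without an explicit BEGIN TRANSACTION, each statement runs as its own autocommit transaction that can take dependencies, and the usual guidance is to retry. A rough sketch (the procedure name is a placeholder; the same retry on SqlException number 41301 can equally be done in the ADO.NET caller):

DECLARE @retry INT = 3;

WHILE @retry > 0
BEGIN
    BEGIN TRY
        EXEC dbo.usp_RunValidation;        -- placeholder for the real procedure call
        SET @retry = 0;                    -- success, stop looping
    END TRY
    BEGIN CATCH
        IF ERROR_NUMBER() IN (41301, 41302, 41305, 41325) AND @retry > 1
            SET @retry -= 1;               -- transient in-memory conflict, try again
        ELSE
            THROW;                         -- anything else is a real failure
    END CATCH
END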

View 2 Replies View Related

SQL Server 2012 :: How To Do OPENROWSET Query Within Cursor Using Variables

Oct 22, 2015

I am writing a custom query to determine whether a legacy table exists or not. From my CMS server I already have all the instances I have to query, and I store the name of the instance in the @Instance variable. I cannot get those stubborn ticks to work right in my query. Below I am using an IF EXISTS statement to search the metadata for the legacy table.

DECLARE @Found tinyint
DECLARE @Instance varchar(100)
set @Instance = 'The Instance'
IF (EXISTS (SELECT a.*
FROM OPENROWSET('SQLNCLI', 'Server=' + @Instance + ';UID=DBAReader;PWD=DBAReader;','SELECT * FROM [DBA].INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''TheTable''') AS a))
SET @Found = 1
ELSE
SET @Found = 0

PRINT @Found
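
One possible workaround, sketched with the connection details from the post: OPENROWSET only accepts string literals, not variables or expressions, so the whole statement has to be built as dynamic SQL (and the doubled-up ticks handled there):

DECLARE @Found tinyint = 0,
        @Instance varchar(100) = 'The Instance',
        @sql nvarchar(max);

-- Build the statement from a template so the nested quotes stay manageable
SET @sql = N'
IF EXISTS (SELECT a.*
           FROM OPENROWSET(''SQLNCLI'',
                ''Server=<<INSTANCE>>;UID=DBAReader;PWD=DBAReader;'',
                ''SELECT * FROM [DBA].INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''''TheTable'''''') AS a)
    SET @Found = 1;';

SET @sql = REPLACE(@sql, '<<INSTANCE>>', @Instance);

EXEC sp_executesql @sql, N'@Found tinyint OUTPUT', @Found = @Found OUTPUT;
PRINT @Found;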

View 2 Replies View Related

SQL Server 2012 :: Loading Image From File System Into Query Using Openrowset

May 14, 2014

I have a directory with images and a table in my DB with the path of each file. The main application allows me to create reports where I can display an image, so I was thinking of using a query like:

SELECT [ID]
,[PT_CODE]
,[FILE_PATH],
CASE WHEN [FILE_PATH] IS NOT NULL THEN
(SELECT * FROM OPENROWSET(BULK [FILE_PATH], SINGLE_BLOB) TT)
END
AS IMAGE_LOADED
FROM [DB].[dbo].[TABLE_MR_FILES]

But I keep getting the error:

Incorrect syntax near 'FILE_PATH'.

I have tried multiple combinations without luck to make OPENROWSET read the path stored in the column [FILE_PATH]. What am I missing?

Note: I am using MSSQL 2012. I don't want to import the images into the DB, just load them on the fly as needed by the report run from the application. I have full access to the DB, so if a stored procedure is the solution I can go with it.
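
A hedged sketch of one workaround (the table and column names come from the post; the temp table is an assumption): OPENROWSET(BULK ...) only accepts a literal file path, so each row's file is read through dynamic SQL and the results staged for the report:

-- Stage the images for the report without storing them in the base table
CREATE TABLE #imgs (ID int PRIMARY KEY, IMAGE_LOADED varbinary(max));

DECLARE @id int, @path nvarchar(260), @sql nvarchar(max);

DECLARE file_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT [ID], [FILE_PATH]
    FROM [dbo].[TABLE_MR_FILES]
    WHERE [FILE_PATH] IS NOT NULL;

OPEN file_cur;
FETCH NEXT FROM file_cur INTO @id, @path;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- The path has to be spliced in as a literal for OPENROWSET(BULK ...)
    SET @sql = N'INSERT INTO #imgs (ID, IMAGE_LOADED)
                 SELECT @id, BulkColumn
                 FROM OPENROWSET(BULK ''' + REPLACE(@path, '''', '''''') + ''', SINGLE_BLOB) AS TT;';
    EXEC sp_executesql @sql, N'@id int', @id = @id;
    FETCH NEXT FROM file_cur INTO @id, @path;
END
CLOSE file_cur;
DEALLOCATE file_cur;

SELECT t.ID, t.PT_CODE, t.FILE_PATH, i.IMAGE_LOADED
FROM [dbo].[TABLE_MR_FILES] t
LEFT JOIN #imgs i ON i.ID = t.ID;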

View 4 Replies View Related

SQL Server 2012 :: Delete Last Job In Parallel Processing?

Mar 18, 2015

I created a main program which launches two jobs at a time; each job does some processing, and at the end I'm trying to delete those jobs after storing the job details in a custom table I created (the cleanup sub-program).

Out of the two jobs I am able to store one job's details (job_name, job_id, start_time and end_time) in the custom table and delete that job, but the job that completes last is neither getting captured nor deleted from the sysjobs and sysjobhistory tables.

I had included this step (which calls the cleanup sub-program to store the job details and delete the job) at the end. I can see from the debug message that the cleanup procedure is getting called, but it is neither storing the details nor deleting the job.

When I execute this cleanup program separately, it does store the job details and delete it.

View 0 Replies View Related

SSIS Package Out Of Memory Exception

Aug 23, 2007

I have an SSIS Package that loads data from a log file. Prior to loading the data I need to prepare the file. I run a script that cleans the file. Then I import the flat file into SQL Server.

Log File Management Task
1. Run Unix Log File Task
2. Import the new log file (flat file) into SQL Server

Error
i.Unix.dtsx
Message: The script threw an exception: Exception of type 'System.OutOfMemoryException' was thrown.

Is this because the system is running out of memory? The RAM on the server is 4 GB. Below is a sample of the script. The job doesn't always fail; there are times when it executes successfully and other times when it fails.

Script Source Code
-----------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.Collections.Generic
Imports System.IO
Imports System.Text
Imports System.Diagnostics
Imports System.Globalization
Imports Microsoft.VisualBasic
Imports System.Text.RegularExpressions
Public Class ScriptMain
'********** Begin Error Log Settings **********
'Dim sSource As String = "i.SSIS.Unix.FileManager"
'Dim sLog As String = "Application"
'Dim sMachine As String = "."
'Dim ELog As New EventLog(sLog, sMachine, sSource)
'********** End Error Log Settings **********

Public Sub Main()
'variables for the unix log file
Dim newFile As String = "D:\iLog\unixlog.txt"
Dim copyFile As String = "\\server16\iLog\unixlog.txt"
'variables for working log files
Dim oldFile As String = "D:\i\temp\unixlog.txt"
Dim difFile As String = "D:\i\temp\unixdiff.txt"
Dim trimdiff As String = "D:\i\temp\unixdifft.txt"
Dim formatTemp As String = "D:\i\temp\unixlog_formatted.txt"
Dim errorFile As String = "D:\i\temp\unixlog_bad.txt"

'delete unixlog.txt copy unixlog.txt
'if the file is on the local server delete it and copy the new file over
'if the file is not present copy the new file over
Try
If File.Exists(newFile) Then
File.Delete(newFile)
File.Copy(copyFile, newFile)
Else
File.Copy(copyFile, newFile)
End If
While Not File.Exists(newFile)
System.Threading.Thread.Sleep(1000)
End While
'ELog.WriteEntry("FileCopy.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("FileCopy.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try

'open the old file; read backwards until we reach the carriage
'return and store that "seek" position; now open the new file and
'seek to that stored position. finally, read the rest of the file
'and write that data to the difference file.
' determine position of last line in the old file
Dim lastLine As Long = GetLastLinePosition(oldFile)
' get all data in new file starting at position determined above
Dim fi As New FileInfo(newFile)
Dim buffer(fi.Length - lastLine) As Byte
Dim fs As New FileStream(newFile, FileMode.Open)
Try
fs.Seek(lastLine, SeekOrigin.Begin)
fs.Read(buffer, 0, buffer.Length)
fs.Close()
' write that new data to the difference file
fs = New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Write, FileShare.None)
fs.Write(buffer, 0, buffer.Length)
fs.Close()
'ELog.WriteEntry("FileCopy.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("FileCopy.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try

'remove the partial row from the difference file
Try
TrimFinal(difFile, trimdiff)
'ELog.WriteEntry("TrimFinal.Call.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("TrimFinal.Call.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
'perform the file formatting
sFormatFile(trimdiff, formatTemp, errorFile)
'
Dts.TaskResult = Dts.Results.Success
End Sub

Function GetLastLinePosition(ByVal fileName As String) As Long
Dim pos As Long = -1
Dim fs As New FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
Try
fs.Seek(-2, SeekOrigin.End) ' -2 to skip a potential vbcrlf at the end of file
While fs.Position > 0
fs.Seek(-1, SeekOrigin.Current)
If fs.ReadByte = 10 Then
pos = fs.Position
Exit While
Else
fs.Seek(-1, SeekOrigin.Current)
End If
End While
fs.Close()
'ELog.WriteEntry("GetLastLinePosition.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("GetLastLinePosition.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
Return pos
End Function

Sub TrimFinal(ByVal difFile As String, ByVal trimdiff As String)
Dim fi2 As New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Read)
Dim fo2 As New FileStream(trimdiff, FileMode.OpenOrCreate, FileAccess.Write)
Dim sr2 As New StreamReader(fi2)
Dim sw2 As New StreamWriter(fo2)
Dim line2 As String
Try
Do While sr2.Peek <> -1
line2 = sr2.ReadLine()
If (sr2.Peek <> -1) Then
sw2.WriteLine(line2)
End If
Loop
sw2.Flush() : sw2.Close()
sr2.Close()
fi2.Close() : fo2.Close()
'ELog.WriteEntry("TrimFinal.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("TrimFinal.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
End Sub

Sub sFormatFile(ByVal currentFile As String, ByVal tempFile As String, ByVal errorFile As String)
Dim tfp As New Microsoft.VisualBasic.FileIO.TextFieldParser(currentFile)
Dim sw As New System.IO.StreamWriter(tempFile)
Dim swErrorFile As New System.IO.StreamWriter(errorFile)
tfp.TextFieldType = FileIO.FieldType.Delimited
tfp.SetDelimiters(",")
tfp.HasFieldsEnclosedInQuotes = True
tfp.TrimWhiteSpace = True
Dim fields() As String
Try
While Not tfp.EndOfData
Try
fields = tfp.ReadFields()
If fields.Length <> 23 Then
'write bad rows to error-file
swErrorFile.WriteLine(String.Join(",", fields))
Else
If fields(3) = "" And fields(13) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(3)) = True OrElse fields(3) = Format(CDate(fields(3)), "yyyy-MM-dd HH:mms") And fields(13) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(13)) = True OrElse fields(13) = Format(CDate(fields(13)), "yyyy-MM-dd HH:mms") And fields(3) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(3)) = True OrElse fields(3) = Format(CDate(fields(3)), "yyyy-MM-dd HH:mms") _
And IsDate(fields(13)) = True OrElse fields(13) = Format(CDate(fields(13)), "yyyy-MM-dd HH:mms") Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
Else
swErrorFile.WriteLine(String.Join(",", fields))
End If
End If
Catch ex As Exception
'ELog.WriteEntry("sFormatFile.TFP.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 0, CType(4, Short))
End Try
End While
tfp.Close()
sw.Close()
swErrorFile.Close()
File.Delete(currentFile)
File.Move(tempFile, currentFile)
'ELog.WriteEntry("sFormatFile.Success".ToString(), EventLogEntryType.SuccessAudit, 0, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("sFormatFile.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 0, CType(4, Short))
Finally
GC.Collect()
End Try
End Sub
End Class
-------------------------

Does my script seem okay for releasing the server memory usage?

Thanks.

View 1 Replies View Related

In-Memory Processing Tables - Limitations?

Nov 11, 2014

Just been reading up on this and there are a lot of limitations which one needs to consider when designing an in-memory table. I'm going to post them below in the hope that others will add to them so I can get a definitive list; a short example illustrating a few of them follows the list.

No foreign key constraints

No clustered indexes

No schema changes once the table is set in memory

No index changes once the table is set in memory

Alter Table function is not supported

Additional filegroup needs to be created in order to process in-memory tables

Varchar MAX is not supported

XML/User Defined data types are not supported

Max page length is 8060 (page overflow not supported)

Can't create indexes on NULLABLE columns
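
For illustration only, a sketch (hypothetical names) of how several of these limits surface in the DDL on SQL Server 2014:

-- Requires a MEMORY_OPTIMIZED_DATA filegroup in the database first
CREATE TABLE dbo.Demo_InMem
(
    Id   INT         NOT NULL,        -- index key columns cannot be NULLable
    Code VARCHAR(10) NOT NULL,        -- VARCHAR(MAX) would be rejected here in 2014
    -- A clustered index is not allowed; the PK must be a nonclustered HASH or range index
    CONSTRAINT PK_Demo_InMem PRIMARY KEY NONCLUSTERED HASH (Id) WITH (BUCKET_COUNT = 131072)
    -- FOREIGN KEY and CHECK constraints would also be rejected in 2014
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
-- Once created, ALTER TABLE and index changes are not supported in 2014;
-- the table has to be dropped and recreated to change schema or indexes.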

View 2 Replies View Related

Scheduled Job Fails With 'EXCEPTION - Insufficient Memory'

May 31, 2001

A series of export/import jobs are scheduled on a dozen databases sitting on one of our servers, and are run at regular intervals through the day. Some of the jobs are failing with the following error recorded in the 'View Job History..':

EXCEPTION: Insufficient memory for this operation. Process Exit Code 2. The step failed.

Will this be cured by increasing the memory available to SQL Server (it has 512 MB already, half of the total physical RAM)? Also, why are only some jobs failing while others complete? Should I run Performance Monitor when the next schedule runs?

Thanks

Derek

View 1 Replies View Related

Out Of Memory Exception When Running A Package With XML Task

Jun 1, 2007

Hi,

I'm running a package that has an XML Task in the control flow. This task transforms an XML file with an XSLT stylesheet.

The file is about 2 MB on a daily basis, but at the end of the month there is a full dump of data that makes the file around 400 MB. That is where my problem is.

I run this on my 2 GB memory workstation, and when the memory gauge in Task Manager reaches about 1.5 GB the package fails with an "Out of memory exception".

I also ran this package on an 8 GB RAM server, and the same applies.

Is there any way of making this package utilize all the available memory? I even increased the virtual memory to see if that helped my issue, but nothing.

Thanks

View 10 Replies View Related

TableDiff Out Of Memory Exception On Large Tables.

Sep 20, 2007

Hello,
I hope I am posting this in the right forum.

I am using tableDiff.exe to create a diff SQL script for a very large table (~4 million rows).


After a few minutes, I receive a "System.OutOfMemoryException".

I have 4GB of ram on the machine executing the table diff.
The server is 32-bit, so adding ram is not an option.

I am executing the following command line:





Code Snippet

TableDiff.exe" -sourceserver "SERVER" -sourcedatabase "SourceDB" -sourcetable "Table1" -destinationserver "SERVER" -destinationdatabase "DestDB" -destinationtable "Table1" -f "C:TableDiffsTable1"

I have seen reports of other users executing tableDiff against 2 million row tables.

Is there any way to buffer tableDiff so that I do not run out of memory on the server?

Could anything else be causing this error?

Thanks,
Dave

View 3 Replies View Related

SQL 2012 :: Server Process Memory Has Been Paged Out

Apr 1, 2015

In my SQL Server error log, I see the below error. The system has 8 GB of RAM with plenty free; is there something I can do to prevent this alert? (Note: I have no MIN/MAX memory set on this instance.)

A significant part of sql server process memory has been paged out. This may result in a performance degradation. Duration: 328 seconds. Working set (KB): 76896, committed (KB): 167628, memory utilization: 45%.

View 5 Replies View Related

DB Engine :: MAX And MIN Server Memory Settings For 2012?

Sep 8, 2015

We have a new setup on a VM to run an application running 24x7 (migrated from SQL Server 2008 R2) with the configuration below:

1. OS- Windows server 2012 Standard 64 bit hosted on Virtual Machine

2. Memory 16 GB and 4 cores with a 2.4 GHz processor

3. SQL server 2012 SP2 , 64 bit Standard edition.

4. Total size of databases is currently 15 GB, with the biggest being 5 GB.

How should I go about setting the MAX and MIN server memory settings? I have set this up for many SQL 2005 and 2008 R2 servers, but for 2012 I heard that things have slightly changed.

How should I start analyzing and setting the right values for MAX and MIN?
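
As a starting point only (a sketch; exact values depend on what else runs on the VM), a common rule of thumb with 16 GB total is to leave roughly 3-4 GB for the OS:

EXEC sp_configure 'show advanced options', 1;  RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 2048;
EXEC sp_configure 'max server memory (MB)', 12288;   -- ~12 GB for SQL Server
RECONFIGURE;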

View 5 Replies View Related

SQL Server 2012 :: Report Never Stops Loading / Error - Out Of Memory

Oct 3, 2014

I have a report that uses three tables from a database. As long as I only use two of the tables, it runs fine. I need the data from the third table for a line chart. So of course there is a great deal of data in the third table. I have a where clause for start date and end date. Is there a way I could only search the third table after I know what I need from it? Or

View 6 Replies View Related

SQL Server 2012 :: How To Capture CPU And Memory Usage For A Stored Procedure

Jan 19, 2015

I have around 100 packages (all packages run at the same time). Each package calls a stored procedure; once the stored procedure execution is completed, the package writes the log information into a log table. How can I capture the CPU and memory usage for the execution of each stored procedure?
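
A hedged sketch of one option: cumulative CPU and I/O per stored procedure can be read from the plan cache at the end of each run; per-execution memory grants are only visible while a query is running (sys.dm_exec_query_memory_grants), so memory is harder to log after the fact.

SELECT
    DB_NAME(ps.database_id)                   AS database_name,
    OBJECT_NAME(ps.object_id, ps.database_id) AS proc_name,
    ps.execution_count,
    ps.total_worker_time / 1000               AS total_cpu_ms,
    ps.last_worker_time / 1000                AS last_cpu_ms,
    ps.total_logical_reads,
    ps.last_logical_reads
FROM sys.dm_exec_procedure_stats AS ps
WHERE ps.database_id = DB_ID()
ORDER BY ps.total_worker_time DESC;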

View 2 Replies View Related

SQL 2012 :: Memory Usage Extremely High And Causes Server To Crash

Jun 11, 2015

I have been having issues with our SQL server for a while now. It seems to run out of memory every few days, and when I look at the memory dump, the MEMORYCLERK_SQLOPTIMIZER clerk seems to take over memory and eventually causes the server to crash.

Here is the SQL version we are using: Microsoft SQL Server 2012 (SP1) - 11.0.3460.0 (X64) Jul 22 2014 15:22:00 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.2 (Build 9200: ) (Hypervisor). It is on a VM on a Windows 2012 server. It has 20 GB of RAM allocated to it and Max Server Memory is set to 16.5 GB.

I have seen MEMORYCLERK_SQLOPTIMIZER grow to about 11 GB at the time of the server crash. Why is that happening? What is causing memoryclerk_sqloptimizer to get so high? I have looked it up and it looks like it has to do with ad hoc requests, but is there something I can do to bring that memory down when it gets so high, so that I can prevent a server crash? Do we just need to add more memory, or is there a memory leak somewhere?
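
One way to keep an eye on it (a sketch against the 2012 DMV): track the largest memory clerks over time. MEMORYCLERK_SQLOPTIMIZER is memory used while compiling queries, so sustained growth generally points at large or numerous ad hoc compilations rather than the plan cache itself.

-- Snapshot the biggest memory clerks (SQL 2012 column names)
SELECT TOP (10)
    type,
    SUM(pages_kb) / 1024 AS size_mb
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY size_mb DESC;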

View 2 Replies View Related

SQL 2012 :: OpenRowSet Works Locally But Not Over Networked Drives?

Jul 10, 2013

We are trying to get SQL able to import files via OPENROWSET (ORS). We got it to work on local drives, but not over networked drives. This is the code and error:

select * into sample.dbo.[eriktest] from
OPENDATASOURCE('Microsoft.ACE.OLEDB.12.0',
'Data Source=\\server\Sample\test.xlsx;
Extended Properties="Excel 12.0 XML;HDR=YES;IMEX=1"')...[Sheet1$]
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".

But going from the local C: or D: drives is fine. I also tried using xp_cmdshell (OPC) to map a drive, and then tried using ORS with the drive letter, and got the same error.

View 9 Replies View Related

SQL 2012 :: Configuring Memory Per Query Option And Index Create Memory Option

Feb 10, 2015

So I started a new job recently and have noticed a few strange configurations. Typically I would never mess with the min memory per query option or the index create memory option because I just haven't seen any need to. My typical thought is that if it isn't broke... They have been modified on every single server in my environment.

From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
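
For what it's worth, a quick way to see how far these options have been moved from their defaults (a sketch; both are advanced options, so 'show advanced options' has to be on for sp_configure to list them):

EXEC sp_configure 'show advanced options', 1;  RECONFIGURE;
EXEC sp_configure 'min memory per query (KB)';   -- default is 1024
EXEC sp_configure 'index create memory (KB)';    -- default is 0 (self-configuring)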

View 3 Replies View Related

SQL 2012 :: OPENROWSET To Read In Data From Excel 2010 File (XLSX)

Jun 12, 2015

I have a test server (TEST1) running SQL 2012 and Windows 2012R2. One of the developers wants to use OPENROWSET to read in data from an Excel 2010 file (an xlsx file).

I have loaded the Microsoft Drivers in "AccessDatabaseEngine_64.exe" and enabled the Ad Hoc Distributed Queries option in SQL.

This is the sample code we are working with:

SELECT
X.MEMBID
FROM OPENROWSET(
'MSDASQL',
'Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};DBQ=\\networkserver\Folder1\Folder2\MEMBIDs.xlsx',
'SELECT * FROM [Sheet1$]'
) AS X

I can run the sample query from my laptop with SSMS (I have admin rights) and I can also run it as SA from my laptop. So all is good, right?

But if I RDC into TEST1, I cannot run the query. I get this error:

OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Excel Driver] Your network access was interrupted. To continue, close the database, and then open it again.".
OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Excel Driver]General error Unable to open registry key Temporary (volatile) Ace DSN for process 0x574 Thread 0xb74 DBC 0x1d07f08 Excel'.".
OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Excel Driver]General error Unable to open registry key Temporary (volatile) Ace DSN for process 0x574 Thread 0xb74 DBC 0x1d07f08 Excel'.".
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "MSDASQL" for linked server "(null)".

But wait! It gets better.

I can run the query as SA from TEST1.

And of course, the developer can't run it either.

And it works fine in the production server.

I'm thinking the basics are there but something isn't right in some permission somewhere.

View 9 Replies View Related

SQL 2012 :: Error Processing Cubes

Aug 19, 2014

I just finished migrating SQL 2005 to 2012 and I am having issues processing my cubes. I included the error message below with generic names, but I think it should be transparent. I seem to be having an issue with one of the logins, but which one? The error message points to "domain_name\server_name$" with a $ at the end. I am assuming this is a SQL login? I have included the error message below:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<Parallel>
<Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2"

[code]...

View 1 Replies View Related

'((System.Exception)($exception)).Message' Threw An Exception Of Type 'System.NotSupportedException'

Jan 16, 2008

Greetings everyone, I am attempting to build my first application using Microsoft's SQL databases. It is a Windows Mobile application, so I am using SQL Server Compact 3.5 with Visual Studio 2008 Beta 2. When I try to insert a new row into one of my tables, the app throws the error message shown in the title of this topic:
'((System.Exception)($exception)).Message' threw an exception of type 'System.NotSupportedException'



My table has 4 columns (I have since changed my FavoriteAccount datatype from bit to integer):
http://i85.photobucket.com/albums/k71/Scionwest/table.jpg

Account type will either be "Checking" or "Savings" when a new row is added, the user will select what they want from a combo box.

Next is a snap shot of my startup form.
http://i85.photobucket.com/albums/k71/Scionwest/form.jpg



Where it says "Favorite Account: None" in the top panel, I am using a link label. When a user clicks "None" it will go to an account creation wizard and set the first account as their primary/favorite. As more accounts are added, the user can select which will be their primary/favorite. For now I am just creating a sample account when the label is clicked, in an attempt to get something working. Below is the code used.


private void lnkFavoriteAccount_Click(object sender, EventArgs e)
{
    FinancesDataSet.BankAccountRow account = this.financesDataSet.BankAccount.NewBankAccountRow();
    account.Name = "MyBank Checking Account";
    account.AccountType = "Checking";
    account.Balance = Convert.ToDecimal("15.03");
    account.FavoriteAccount = 1; // datatype is an integer, I have changed it since I took the screenshot.
    financesDataSet.BankAccount.Rows.Add(account);
    // The next three lines were added while I was trying to get this to work.
    // I don't know if I really need them or not; I receive the error regardless of whether these are here or not.
    this.bankAccountTableAdapter1.Update(financesDataSet);
    this.financesDataSet.AcceptChanges();
    refreshDatabase();
}


the refreshDatabase() code is here:


private void refreshDatabase()
{
    this.bankAccountTableAdapter1.Fill(this.financesDataSet.BankAccount);
    // Acquire a count of accounts the user has
    int numAccounts = financesDataSet.BankAccount.Count;
    // Loop through each account and see which one is the primary.
    for (int num = 0; num != numAccounts; num++)
    {
        // Works ok in frmMain_Load, but when my lnkFavoriteAccount_Click calls this, it throws the error.
        if (this.financesDataSet.BankAccount[num].FavoriteAccount == 1)
        {
            // Display the primary account on our home page. User can click the link label & be taken to their account register.
            this.lnkFavoriteAccount.Text = this.financesDataSet.BankAccount[num].Name.ToString();
            this.lnkFavoriteFunds.Text = this.financesDataSet.BankAccount[num].Balance.ToString();
            break;
        }
    }
}


and my form_load code

private void frmMain_Load(object sender, EventArgs e)
{
    refreshDatabase();
}


So, when I click on the lnkFavoriteAccount label, and my new row gets added, the app stops at the following line in my DataSet.Designer

[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
public byte FavoriteAccount {
    get {
        try {
            return ((byte)(this[this.tableBankAccount.FavoriteAccountColumn]));
        }
        catch (global::System.InvalidCastException e) {
            // Stops at the following line; this error was caused by 'if (this.financesDataSet.BankAccount[num].FavoriteAccount == 1)'
            throw new global::System.Data.StrongTypingException("The value for column 'FavoriteAccount' in table 'BankAccount' is DBNull.", e);
        }
    }
    set {
        this[this.tableBankAccount.FavoriteAccountColumn] = value;
    }
}


I have no idea what I am doing wrong; all of the code I used I retrieved from Microsoft's help documentation included with VS2008. I have tried using my TableAdapter.Insert() method and it still failed when it got to

if (this.financesDataSet.BankAccount[num].FavoriteAccount == 1)

in my refreshDatabase() method.

When I look, the data has been added into the database; it's just when I try to retrieve it now, it bails on me. Am I retrieving the information wrong?

Thanks for any help you guys can offer.

Johnathon

View 1 Replies View Related

Analysis :: SSAS 2012 - Notification While Cube Is Processing

Apr 16, 2015

I am working on SQL 2012. We have an SSAS cube built; on top of it, clients use Excel to connect to the SSAS cube and see the reports. My cube processes every hour and takes almost 1-2 minutes to process. Whenever the end user/client views or refreshes the Excel report (which uses the cube as its source) WHILE THE CUBE IS PROCESSING, they get an error that the source is not available.

We have tried our best but cannot bring the processing time of the cube down to 3-5 seconds, so that end users don't face the report refresh issue at the moment of cube processing.

Requirement: in case the end user views the report while the cube is processing on the back end, instead of an error they should see some customized message which we can provide, something like "Data is refreshing, please wait".

View 5 Replies View Related

SQL 2012 :: SSIS Processing Of SSAS Multidimensional Cube Fails?

Oct 31, 2012

All I get back is an error message of "Analysis Services Processing Task Error: A connection cannot be made. Ensure the server is running." The server is running; I can process the cube by connecting to the AS instance and right-click processing it.

I can also process the cube by running the SSIS task inside of SSDT. Only when I deploy the SSIS package (in Project deployment mode) and then execute it do I get the error message.

SQL Server, SSAS, and SSIS processes are all running under the same account. SSAS is on a separate server from SSIS and SQL, if that matters.

View 2 Replies View Related

SQL Server 2012 :: Query To Get CPU Usage / Memory Usage Details Of Server?

Jan 30, 2014

I need a query for fetching CPU usage, memory usage, blocking and related details...

I want to create a job which will run on a node every 15 minutes and store the data in a table for each instance...

The DMVs are not giving me much more, and with Extended Events I am not sure if I can store that data into a table.
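
Not a full answer, but a hedged sketch of the kind of snapshot a 15-minute Agent job could insert into a logging table (the table name is made up; a CPU figure can be added from the RING_BUFFER_SCHEDULER_MONITOR ring buffer if needed):

IF OBJECT_ID('dbo.InstanceHealthLog') IS NULL
    CREATE TABLE dbo.InstanceHealthLog
    (
        capture_time        DATETIME2(0) NOT NULL DEFAULT SYSDATETIME(),
        sql_mem_used_mb     BIGINT,
        total_server_mem_mb BIGINT,
        blocked_requests    INT,
        runnable_tasks      INT
    );

INSERT INTO dbo.InstanceHealthLog (sql_mem_used_mb, total_server_mem_mb, blocked_requests, runnable_tasks)
SELECT
    (SELECT physical_memory_in_use_kb / 1024 FROM sys.dm_os_process_memory),
    (SELECT cntr_value / 1024 FROM sys.dm_os_performance_counters
      WHERE counter_name = 'Total Server Memory (KB)'),
    (SELECT COUNT(*) FROM sys.dm_exec_requests WHERE blocking_session_id <> 0),
    (SELECT SUM(runnable_tasks_count) FROM sys.dm_os_schedulers WHERE status = 'VISIBLE ONLINE');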

View 7 Replies View Related

Performance Issues Total Server Memory Vs Target Server Memory

Aug 2, 2006

Hi

I did a load testing and found the following observations:

1. The Memory: Pages/sec counter was crossing the limit, going beyond 20.

2. The Target Server Memory was always greater than Total Server Memory

Seeing the above data, it seems to be memory pressure. But I found that Available Memory was always above 200 MB. Also, the Buffer Cache Hit Ratio was close to 99.99. What could be the reason for the above behavior?
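
For SQL 2005 and later, both memory counters can also be read straight from a DMV, which makes it easy to log them alongside the load test (a sketch; the LIKE filter is loose to cover the slightly different counter names across versions):

SELECT [object_name], counter_name, cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Memory Manager%'
  AND counter_name LIKE '%Server Memory%';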

View 1 Replies View Related

The Script Threw An Exception: Exception Of Type 'System.OutOfMemoryException' Was Thrown.

Jan 31, 2007

Hi,

I got an strange problem with one of my packages.

When running the package in Visual Studio it runs properly, but if I let this package run as part of a SQL Server Agent job, I get the message "The script threw an exception: Exception of type 'System.OutOfMemoryException' was thrown." in my log and the package ends up with an error.

Both times it is exactly the same package on the same server, so I don't know how to debug it, or even whether there is anything I need to debug.

Regards,

Jan

View 2 Replies View Related

Sql Server 2000 Using Less Memory After More Memory Added

Aug 22, 2007

SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. According to the Idera monitoring software, 2003 was allocated 2.3 GB and SQL Server was allocated (and using all of) 1.6 GB, for a total of approximately 4 GB - all memory allocated between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. The 'System' info shows 8 GB of memory with PAE, so I assume the full 8 GB can now be addressed. Why are fewer resources being used now with more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since it seems this 1 GB limit is artificial. If it used 1.6 GB before, why would it not use at least that much now?

thank you

View 4 Replies View Related

SQL 2012 :: Configuring Alert For Low Memory?

Oct 1, 2015

I am configuring a SQL alert for low SQL Server memory.

I have configured the memory for SQL Server as below:

min : 512 MB
Max :13500 MB

Total RAM on the server is 16GB

I want to receive an email from SQL Server informing me that SQL Server is running low on memory (less than 200 MB free), so that I do not get out-of-memory issues and SQL Server continues to run without fail.
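
A sketch of one possible approach: an Agent job step scheduled every few minutes that emails when available OS memory drops below the threshold (assumes Database Mail is configured; the profile and recipient names are placeholders):

DECLARE @avail_mb BIGINT;

SELECT @avail_mb = available_physical_memory_kb / 1024
FROM sys.dm_os_sys_memory;

IF @avail_mb < 200
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'DBA Mail Profile',
        @recipients   = 'dba@yourcompany.com',
        @subject      = 'Low memory warning',
        @body         = 'Available physical memory has dropped below 200 MB.';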

View 9 Replies View Related

SQL 2012 :: Warning - Could Not Open Global Shared Memory

Jun 28, 2012

Apparently this error was fixed in CU12 for SQL 2008, but it seems to have reared its head again in SQL 2012:

[SSIS.Pipeline] Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console.

I've got a client who is seeing it, but I've not seen a fix in CU1 or CU2 for 2012.

View 4 Replies View Related

SQL 2012 :: Update Stats Full Scan And Memory

Apr 9, 2014

I noticed today a session that was executing a FULL SCAN update as follows:

UPDATE STATISTICS [XXXX].[XXXX].[XXXX] [_WA_Sys_00000009_318D45CA] WITH FULLSCAN

When I checked the sys.dm_exec_query_memory_grants DMV for the session I could see the following values:

requested_memory_kb granted_memory_kb used_memory_kb max_used_memory_kb
145,705,216 145,705,216 139,977,336 139,980,408

When I checked the properties of the statistic, I can see it is on a varchar(3) field, and there are only 3 distinct values in there - all char(1).

The total size of the data in the table according to the Disk Usage By Top Table Report is 199,680,712KB

So my question is this...

For the UPDATE STATISTICS on this one column with FULL SCAN, does SQL Server read the entire table into the buffer pool? If so, and the table has 199,680,712 KB of data, why did the session request only 145,705,216 KB?

Or does SQL Server just read the column and the clustered index key into the buffer pool?

View 1 Replies View Related

SQL 2012 :: Could Not Open Global Shared Memory To Communicate With Performance DLL

Apr 22, 2014

Getting the following warning in SSIS - SQL 2012:

[SSIS.Pipeline] Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console.

Microsoft had a fix for SQL Server 2008.

How to get around this in SQL 2012?

View 4 Replies View Related

SQL 2012 :: Buffer Cache Size Much Lower Than Max Memory Config And Low PLE?

May 22, 2014

I have a virtual server (VMware ESX) with 64GB RAM running a single instance of SQL 2012 SP1. The max memory config is set to 59392 (58GB).

The Page Life Expectancy for this server has been averaging well under 10 mins for the last few days, according to our monitoring.

I have been checking the amount of data in the buffer cache periodically during the day with the below query, which seems to show that there is never more than about 10GB of data at any one time, frequently dropping below 5GB:

SELECT COUNT(*) AS BufferPages,
CONVERT(decimal(10, 2), COUNT(*) / 128.0) AS BufferMB
FROM sys.dm_os_buffer_descriptors

Why would the amount of cached data be so low (and cause so much churn)?

I am aware that other things will require some of that memory (plan cache etc.) but with Max Mem of 58GB, I would expect there to be a much higher amount of actual cached data at any one time. I did the same checks on another VM with the same amount of RAM/Max Mem setting, and there was 50GB of data in the cache, with PLE measured in hours.

View 9 Replies View Related

SQL 2012 :: Determining MAXDOP And Memory Configuration For Multiple Instances?

Sep 18, 2014

We are just finishing our migration to SQL 2012. In our old environment, the instance which held our SharePoint databases also served other applications. We did not experience any performance related issues in the past due to this.

SharePoint basically requires MAXDOP to be 1, which is correctly set on the old server. Since this configuration may not be ideal for other applications that may be put within our environment, we are entertaining the idea of isolating SharePoint into its own instance, probably on the same box.

My manager wants me to come up with performance trace data to better prove that we need to go this route since we apparently have had issues in the past by blindly following Microsoft's best practices.

1.MAXDOP configuration - I understand this may be a 2 pronged approach that would require looking at various execution plans and CPU related counters in Perfmon. SharePoint likely requires a maxdop of 1 due to the nature of the application (lots of concurrent processes). What is the best way to show this need graphically?

2. Memory configuration for multiple instances - Does the Total Server Memory reveal all the memory that a given SQL instance is utilizing? Should I use this counter to identify appropriate min/max memory configurations for multiple instances on a single cluster?

The problem with the perfmon approach is that its scope is limited to just the server. Since our SharePoint environment is currently being shared with other applications, I understand that I may have to utilize DMV statistics to narrow down my analysis.

View 3 Replies View Related






