I have a person who just recommended to a customer to use Fixed instead of Dynamic for the SQL Server memory setting, telling me that I was wrong for setting it to Dynamic.
Well, I tried to explain: why would you want to give away memory if you do not need to? He just answered that I was wrong. Any help in proving me wrong or him wrong?
Which way is the correct way? I was told by a very bright person with a strong background in the development of SQL Server that when the dialog was written, the "fixed" memory size option was deliberately put in lower case as a reminder that you can, but should never, use "fixed", and that you should always use Dynamic (which, by the way, does use a capital "D"...).
Just wondering... Thanks for the help, guys.
This is the Personal Edition running on the XP Pro platform, for use in a point-of-sale environment with 6 other computers connecting to the SQL Server on the one back-of-house computer, which is running 4 MB of memory.
I told them that the minimum should be at zero and the max should be set to half....
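For reference, a minimal sketch of how that dynamic range is usually set with sp_configure (values in MB are purely illustrative and assume roughly a 4 GB box; a "fixed" size is simply min = max, while "dynamic" leaves a gap between them):

-- Illustrative values only; adjust to the RAM actually installed.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 0;      -- lets SQL Server release memory when it is not needed
EXEC sp_configure 'max server memory (MB)', 2048;   -- cap at roughly half the RAM, per the post (2048 assumes a 4 GB box)
RECONFIGURE;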
Does anyone mess with this feature? They say that you don't have to change this setting in SQL 2000. I have 3 GB of memory and still have it set to dynamically configure the memory. I don't know if I should change this to a fixed memory size, and if I did, how do you tell what value to set it at? What are the pros and cons if I set it to fixed? Thanks.
I have gotten mixed comments on this topic. I have a 64-bit machine running 64-bit Windows 2003 Standard and 64-bit SQL 2005 Standard with 8 GB of RAM. We want to upgrade it to 32 GB. What is the best approach: dynamic, or static by giving min and max server memory a value? And if static, what value should I use for 32 GB, knowing that this box is only being used for SQL?
I did a load test and found the following observations:
1. The Memory: Pages/sec counter was consistently going beyond the threshold of 20.
2. Target Server Memory was always greater than Total Server Memory.
Seeing the above data, it looks like memory pressure. But I found that Available Memory was always above 200 MB, and the Buffer Cache Hit Ratio was close to 99.99. What could be the reason for the above behavior?
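A quick way to watch those two server-memory counters from inside the engine (a sketch, assuming SQL Server 2005 or later where sys.dm_os_performance_counters is available):

-- Compare how much memory SQL Server wants (target) with what it currently has (total), in MB.
SELECT counter_name, cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE counter_name IN (N'Target Server Memory (KB)', N'Total Server Memory (KB)');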
SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. The OS was using about 2.3 GB and SQL Server was allocated (and using all of) 1.6 GB, for a total of approximately 4 GB according to the Idera monitoring software, so all memory was accounted for between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. The 'System' info shows 8 GB of memory with PAE, so I assume the full 8 GB can now be addressed. Why are fewer resources being used now with more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since this 1 GB limit seems artificial. If it used 1.6 GB before, why would it not use at least that much now?
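For what it's worth, a sketch of how 32-bit SQL Server 2000 is usually pointed at the memory above the normal address space once PAE is in place (illustrative values only; 'awe enabled' needs a service restart, the service account needs the 'Lock Pages in Memory' right, and with AWE the memory is managed statically rather than dynamically):

-- Illustrative: enable AWE and give the instance an explicit ceiling (values in MB).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;                 -- takes effect after a service restart
EXEC sp_configure 'max server memory (MB)', 6144;   -- example ceiling for an 8 GB box; leave room for the OS
RECONFIGURE;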
Hi, I'm still rather new to SQL Server and SQL Server Express. My problem is that I have 50 fixed-width files (actually the LIS file type, converted to txt using Notepad). Each text file has about 50 columns of data. What would be the best way to get this into a SQL Server database?
On Analysis Services 2005 the members of the local Administrators group are also members of the fixed server role, and therefore they have full control over Analysis Services databases.
I think this can be a problem because many system administrators don't need full control over AS. Does someone know how I can remove those high privileges from the local administrators?
Hi there. I work in a support department and on rare occasion (such as this morning) I am RAS'd in to a client and try running a SQL trace, only to receive an error when setting it up: 'In order to run a trace against SQL Server you have to be a member of sysadmin fixed server role.'
Today I even called their DBA and asked him if he could set our user ID up with the proper permissions to allow us to run traces (I'm debugging an RTE). He stated that he was unfamiliar with the error and didn't know where to assign us to resolve this problem.
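If the client is on SQL Server 2005 or later, there is a narrower permission than sysadmin that their DBA could grant (a sketch, using a hypothetical login name):

-- ALTER TRACE lets a login run Profiler/server-side traces
-- without membership in the sysadmin fixed server role (SQL Server 2005+).
GRANT ALTER TRACE TO [SupportLogin];   -- hypothetical login name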
I wasn't sure where to put this topic, so I put it here, since I figured it is a question that would apply to virtually any version, even though I am using SQL Server 2005.
We have a vendor that sends us a fixed-width text file every day that needs to be imported into our database in 3 different tables. I am trying to import all of the data into a staging table and then plan on merging/inserting select data from the staging table into the 3 tables. The file has 77 columns of data and 20,000+ records. I created an XML format file, which I sampled below:
The data file is a fixed-width file with no column delimiters or row delimiters, as far as I can tell. When I run the following insert statement I get the error below it.
BULK INSERT myStagingTable
FROM '.........myDataSource.txt'
WITH ( FORMATFILE = '.........myFormatFile.xml',
       ERRORFILE = '.........errorlog.log' );
Here is the error:
Msg 4832, Level 16, State 1, Line 1 Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1 The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1 Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
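One common cause of Msg 4832 with fixed-width data is the row terminator: if every field in the format file is CharFixed, the CR/LF at the end of each line is never consumed and the last record comes up short. A purely illustrative format file (hypothetical column names and lengths, not the poster's actual 77-column layout) that handles this by making the final field terminator-based:

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="10"/>                      <!-- fixed-width field -->
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="25"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="30"/> <!-- last field consumes the CR/LF -->
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="Col1" xsi:type="SQLCHAR" LENGTH="10"/>
    <COLUMN SOURCE="2" NAME="Col2" xsi:type="SQLCHAR" LENGTH="25"/>
    <COLUMN SOURCE="3" NAME="Col3" xsi:type="SQLCHAR" LENGTH="30"/>
  </ROW>
</BCPFORMAT>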
What is the best way to import a fixed-length text file to SQL Server using SSIS?
I was trying to use a Flat File source and an OLE DB destination, but since the text file has no column delimiters and a different length per column and per line (it shows only one column because it is all concatenated), I cannot map it to the destination columns.
How can I import it?
Here is an example of the text file (fixed width, with a row delimiter) that I need to import into different columns...
I have this doubt and want to be sure my thinking is correct.
Let's consider 2 tables, one with fixed-length columns (char) and the other with variable-length columns (varchar).
The table with fixed-length columns will always allocate the same size within a page; however, the table with variable-length columns will allocate only the actual length of the data within a page.
I think that updates on the table with fixed-length columns will have a greater chance of in-place updates, at least from a data-length perspective, whereas updates on the table with variable-length columns will see more split updates from a data-length perspective.
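A minimal sketch of the two layouts being compared (hypothetical tables): a char(200) column always reserves its full declared width in the row, while a varchar(200) column stores only the bytes actually present plus a small variable-length overhead, so an update to a longer value can grow the row.

-- Fixed-length: every row reserves the full declared width, so updating the value
-- never changes the row size.
CREATE TABLE dbo.FixedWidthDemo
(
    Id   int       NOT NULL,
    Name char(200) NOT NULL   -- always 200 bytes on the page
);

-- Variable-length: the row only holds what is stored, so an UPDATE to a longer
-- value can grow the row and may no longer fit in place on the page.
CREATE TABLE dbo.VariableWidthDemo
(
    Id   int          NOT NULL,
    Name varchar(200) NOT NULL -- actual data length plus a 2-byte offset entry
);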
Edition: SQL Server 2005 Standard. I am trying to take a snapshot of a database for use in a publication. The account under which the snapshot agent runs is set to have the db_owner role for the database and write access to the snapshot share.
I cannot get the snapshot to run unless the account under which the snapshot agent runs is granted the sysadmin fixed server role. Because of the security implications, I don't want to grant these permissions.
As far as I am concerned, the minimum requirements for the snapshot account have been met, and I have tried every other alternative I can think of. I've checked MSDN and the newsgroups but I still have not solved the problem.
The error that I get when I run the snapshot.exe from the command line is: The remote server "TURING" does not exist, or has not been designated as a valid Publisher, or you may not have permission to see available Publishers.
This error message has now inexplicably changed to: You do not have sufficient permissions to run the command...
I'm having an issue understanding the connection string. I want to connect to an instance of SQL Express on a remote server with a fixed IP address. The TCP port is open on 1433; I opened the port on the router and in the Windows 2003 firewall.
My code is as follows:
S = "Provider=SQLNCLI;"
S = S & "DATA SOURCE=44.66.777.888\SQLEXPRESS,1433;"   ' server\instance,port
S = S & "INITIAL CATALOG=TESTDB;"
S = S & "Persist Security Info=false;"
S = S & "UID=TEST999;"
S = S & "Pwd=TEST999888;"
CnMgt.ConnectionString = S
CnMgt.Open
I've followed the steps outlined in article 914277.
Can someone help? Thank you.
PS: If the client is local, I have no issue opening the database. I do need to open the DB from remote clients.
We would like to use the bulk insert function to import large CSV files into a SQL Server Express database; however, we have serious concerns about giving all our users such high privileges. Is there some way around this? Can we give them the privileges temporarily, do the insert, and take them away again, or is there some other solution?
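One sketch of the "grant it temporarily" idea, assuming SQL Server 2005+ Express and a hypothetical login (BULK INSERT also needs INSERT permission on the target table):

-- Server-level permission that BULK INSERT requires, granted only for the load window.
GRANT ADMINISTER BULK OPERATIONS TO [ImportUser];    -- hypothetical login

-- ... run the BULK INSERT here ...

REVOKE ADMINISTER BULK OPERATIONS FROM [ImportUser];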
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on the server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------ Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad-core box with 4 GB of RAM (VMware). OS: Windows 2003 Std, SQL Server 2005 Std. The /3GB switch is not set but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume that the grinding to a halt is due to not setting the min/max values correctly or not enabling the 'Lock Pages in Memory' setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage, or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 214748364; even if I don't enable AWE, should this default value be changed? (The current values can be checked with the query sketched after this list.)
3. It seems that even at idle the SQL Server process holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600 MB or so. By the end of the day, it's up around 2 GB. How can I track down what's causing this, and should this even concern me?
4. The 'Lock Pages in Memory' setting worries me. Everything I've read on this seems to give a warning about serious OS and other program support degradation, in some cases to the point where they have to restore the settings on the server before they can bring it back up. What are your thoughts on this?
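For question 2, a quick way to see what the memory options are currently set to before changing anything (a sketch, assuming SQL Server 2005 where sys.configurations is available; the memory values are in MB and 'awe enabled' is a 0/1 flag):

-- Configured vs. running values for the memory-related options.
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN (N'min server memory (MB)', N'max server memory (MB)', N'awe enabled');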
I have a Windows Server 2012 box with SQL Server 2012 Enterprise. RAM size is 22 GB. Sometimes SQL Server takes 95% of the memory. My question: how do I reduce the memory usage without killing any process, because it's a production server and there are many background processes running? And is there any guide to learn why the memory rises so high and how to reduce it?
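To see where the memory is actually going before capping anything, one option is the memory-clerk DMV (a sketch for SQL Server 2012, where the pages_kb column exists; the buffer pool clerk is usually the biggest consumer, which is normal caching behavior rather than a leak):

-- Top memory consumers inside the instance, in MB.
SELECT TOP (10)
       [type],
       SUM(pages_kb) / 1024 AS memory_mb
FROM sys.dm_os_memory_clerks
GROUP BY [type]
ORDER BY memory_mb DESC;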
Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server memory usage so that it leaves some memory for the OS; this is based on the fact that if the max memory is not specified, SQL will use whatever memory is available and eventually crash the system.
My question is: when a server has the SSIS and SSAS services installed along with the SQL service, does the max memory setting cover the SSIS and SSAS memory usage, or do SSIS and SSAS have to share the remaining memory with the OS?
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory. During the execution of the package the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS, with no difference. I do have some script components and have added some code to list the assemblies in the current AppDomain. I can see that one particular assembly is increasing on every loop: every time it hits the script component, the VBAssembly count increases by 6, and along with it the memory climbs. What is this VBAssembly being used for, and is there an update to SQL Server Integration Services that I need?
I have a database with a memory-optimized filegroup on it. How can I remove it? I have removed the memory-optimized table I had in it, but when I try to remove the filegroup I receive an error.
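A sketch of the sequence one would normally try, with hypothetical database, container, and filegroup names; the container file has to be removed before the filegroup, and on some versions SQL Server will still refuse to drop a memory-optimized filegroup once it has been created, in which case the only route is to recreate the database.

-- Remove the memory-optimized container first, then the filegroup itself.
ALTER DATABASE MyDb REMOVE FILE MyDb_InMem_Container;   -- hypothetical container (logical file) name
ALTER DATABASE MyDb REMOVE FILEGROUP MemFG;             -- hypothetical filegroup name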
My server is a dual AMD x64 2.19 GHz with 8 GB RAM running Windows Server 2003 Enterprise Edition with Service Pack 1 installed. We have SQL Server 2000 32-bit Enterprise installed in the default instance. AWE is enabled, using dynamically configured SQL Server memory with a 6215 MB minimum memory and 6656 MB maximum memory setting.
I have now installed, side by side, SQL Server 2005 Enterprise Edition in a separate named instance. Everything is running fine, but I believe SQL Server 2005 could run faster and I need to ensure I am giving it plenty of resources. I realize AWE is not needed with SQL Server 2005, and I have seen suggestions to grant the SQL Server account the 'Lock Pages in Memory' right. This box only runs the SQL 2000 and SQL 2005 server databases, and I would like to ensure, if possible, that each is splitting the available memory equally, at least until we can retire SQL Server 2000 next year. Any suggestions?
I am receiving the following error when starting a program called ShelbySystems that is supposed to connect to a local database. I don't think this is a security issue, but I don't know much about SQL Server either, so...
DIAG [08001] [Microsoft][ODBC SQL Server Driver][Shared Memory]SQL Server does not exist or access denied. (17) DIAG [01000] [Microsoft][ODBC SQL Server Driver][Shared Memory]ConnectionOpen (Connect()). (2)
System info: Windows 10 Home, 64-bit (upgraded from 8); SQL Server 2012 Express; SQL Server 2005 Backwards Compatibility, 64-bit; ShelbySystems software v5.4.
I am including the trace log in case it is useful.
So I started a new job recently and have noticed a few strange configurations. Typically I would never mess with the 'min memory per query' and 'index create memory' options, because I just haven't seen any need to; my typical thought is that if it isn't broke... Yet they have been modified on every single server in my environment (see the sketch after the Books Online excerpt below).
From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
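A sketch for putting those two options back to their documented defaults (assuming 1024 KB for 'min memory per query' and 0, i.e. self-configuring, for 'index create memory'; confirm the defaults against your version's Books Online before running):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min memory per query (KB)', 1024;   -- documented default
EXEC sp_configure 'index create memory (KB)', 0;       -- 0 = self-configuring default
RECONFIGURE;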
Dear All, I have hosted an ASP.NET site with SQL Server 2005. I have also configured a state server for the site and started the ASP.NET State Server. I have got one problem: the SQL Server memory varies from 800 MB to 1 GB. I have been closing all the DB objects in my code and have also installed SQL Server SP2, but the problem still exists. Can anyone help?
Hi all, I am using StringBuilder to build a huge insert string which clubs around 12,000 insert statements into a single string. But while executing this string, SQL Server does no inserts and the Profiler shows that SQL Server was out of memory. Any ideas? Is there a limit to the maximum number of inserts I can do?