I have installed SQL Server 2005 in a cluster. After all the processes have finished, the memory usage still shows 2 GB, whereas Activity Monitor shows no processes running. What should I do to have the memory cleared automatically?
If I stop and start the service the memory is cleared, but I don't want to restart the service, nor do I want to run any command to clear the memory.
OK, I am faced with working with XML on a regular basis, which is fine.
DECLARE @ViewSN INT

IF NOT EXISTS (select null from tblviews where viewcode = 'loadAtTerm')
    --<workflowEventType>loadAtTerminal</workflowEventType>
    insert into tblviews (ViewName, Description, OutBoundForm, StoredProcSN, TriggersReply, ViewCode, DispXactLayer, DispXactViewType, DispXfcTag, Comments)
    select 'QC:WF-LoadAtTerminal', 'This View Corresponds to the XML for loadAtTerminal in Omnitracs Workflow', '0', NULL, '0', 'loadAtTerm', 'MCOM', 'MCOM', NULL, NULL
[code]...
What would be really useful is to be able to take any XML file and automatically parse the node names into a memory (table) variable, and then the fields of each node into another.
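For what it's worth, the kind of thing I am picturing is a query against the XML nodes() method that pulls every element name into a table variable. A rough sketch, with a made-up sample document and variable names:

DECLARE @x XML = N'<shipment><workflowEventType>loadAtTerminal</workflowEventType><stop><city>Dallas</city></stop></shipment>';

DECLARE @NodeNames TABLE (NodeName NVARCHAR(128));

-- Pull every element name in the document into the table variable
INSERT INTO @NodeNames (NodeName)
SELECT n.value('local-name(.)', 'nvarchar(128)')
FROM @x.nodes('//*') AS t(n);

SELECT NodeName FROM @NodeNames;

The fields of each node could be pulled into a second table the same way with additional value() calls, but the above is the core of it.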
I did some load testing and observed the following:
1. The Memory: Pages/sec counter was consistently above the threshold of 20.
2. The Target Server Memory was always greater than Total Server Memory
From the above data it looks like memory pressure. But I found that Available Memory was always above 200 MB, and Buffer Cache Hit Ratio was close to 99.99. What could be the reason for this behavior?
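For reference, this is roughly how I have been reading those two counters from inside SQL Server rather than PerfMon (a sketch assuming SQL Server 2005 or later, where sys.dm_os_performance_counters exists):

SELECT counter_name,
       cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Memory Manager%'
  AND counter_name IN ('Target Server Memory (KB)', 'Total Server Memory (KB)');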
begin transaction
begin try
    insert into [EthnicRace]
    select * from [OEPRD].[CIS_Source_Data].[dbo].[EthnicRace];

    insert into [MilestoneOwner]
    select * from [OEPRD].[CIS_Source_Data].[dbo].[MilestoneOwner];

    insert into [PS_ACAD_CALTRM_TBL]
    select * from [OEPRD].[CIS_Source_Data].[dbo].[PS_ACAD_CALTRM_TBL];

    insert into [PS_ACAD_DEGR]
    select * from [OEPRD].[CIS_Source_Data].[dbo].[PS_ACAD_DEGR];
end try
A job on our production server goes out to Oracle and gets raw data from our student system once per night. A job on our development server then uses the above code to copy that raw data from production to the tables in our development system. My question is: will the log continue to grow until the "end try" is encountered, or is it cleared at the end of each INSERT statement?
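My understanding is that because everything sits inside the one explicit "begin transaction", the log cannot be truncated past that open transaction until it commits, no matter how many of the INSERTs have finished. If that turns out to be a problem, one option would be to commit each copy separately; a rough sketch (same table names as above, trimmed to two tables):

BEGIN TRY
    BEGIN TRANSACTION;
    INSERT INTO [EthnicRace]
    SELECT * FROM [OEPRD].[CIS_Source_Data].[dbo].[EthnicRace];
    COMMIT TRANSACTION;

    BEGIN TRANSACTION;
    INSERT INTO [MilestoneOwner]
    SELECT * FROM [OEPRD].[CIS_Source_Data].[dbo].[MilestoneOwner];
    COMMIT TRANSACTION;
    -- ...repeat for the remaining tables
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
END CATCH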
Using SQL Server 2000. When does SQL flush or clear the procedure cache? I am dynamically creating and dropping stored procedures (SPs). Does SQL clear the cache for an SP that has been dropped? If not, when the SP is recreated (with the same name), does SQL use the execution plan from cache? Thank you in advance. Jack
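For anyone curious, on SQL Server 2000 you can at least see what is cached for a given procedure; something like this (the name filter is just a placeholder):

SELECT cacheobjtype, objtype, usecounts, sql
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%MyDynamicProc%';

-- Clears the entire procedure cache (use with care on a busy server)
DBCC FREEPROCCACHE;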
I have a report created in Reporting Services that contains 5 parameters. Two are text [date] fields, and three are drop-down multi-value parameters. One is driven from the result of another, and both of those work fine. My issue is with the final multi-value parameter, which is simply a list of customers. It gets populated from the data set without a problem, and I can select from 1 to all of the options. However, on submitting the report, this one parameter gets cleared out; the report does not run, but instead waits for me to select the customers (again) and resubmit.
Occasionally, I can select a few customers and it will, in fact, run the report. However, 90% of the time it just clears the field and waits.
I can trace the database, and everything looks normal - on submit nothing is sent to the database, so it's not like there's a problem with the query. I've verified the customer table and all records look similar - I don't see any obvious data issues.
Maybe it's a silly question... I've created a new report using the wizard (including a data query). In the dataset frame my dataset contains the queried items. But when I change to the Data view and back to the Layout view in the designer, my dataset is cleared. Is this a bug or normal behaviour?
I have a very odd problem. I have a package which uses some custom tasks that were written in C#. When the package is deployed to our production server, *some* of the property values for *some* of the tasks are cleared. For example, I have these five tasks:
All of them inherit (of course) from Microsoft.SqlServer.Dts.Runtime.Task. All of them have custom members (some similar, some different), and of course, different implementation (though they are mostly the same). This test package has one instance of each of the different tasks.
As I said above, when we deploy to our production server, *some* of the property values for *some* of the tasks are cleared -- but when deployed to our dev server, everything remains intact.
Here is what is cleared:
- On 4 of the 5 tasks, the Description property (inherited from Task) is cleared, but the other one remained intact
- On 3 of the 5 tasks, the Connection property (a custom property in all tasks) is cleared, but the other two remained intact
- 3 of the tasks have other string properties that were set, and all of these were cleared
We can reproduce this on two different production servers, and these two servers have some different configurations, suggesting these would not be the culprit:
- They have different service packs (one is build 2047, the other build 3042)
- One has the custom SSIS components installed (in the GAC), the other one does not
Our development server, where the package is deployed as expected, has build 2047 w/ the components installed.
Here are the packages, where you can compare and see the differences (using a text comparison tool):
Dev-GOOD.xml Prod-BAD.dtsx
These were created by importing the deployed packages from the server into a Visual Studio SSIS project.
Any suggestions would be *greatly* appreciated, as we are totally stumped as to why this is happening.
EDIT: Additional clues, this package is deployed to the MSDB. If it's deployed to the File System, it remains unmodified.
SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. According to the Idera monitoring software, 2003 was allocated about 2.3 GB and SQL Server was allocated (and using all of) 1.6 GB, for a total of approximately 4 GB; all memory was accounted for between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. The System info shows 8 GB of memory with PAE, so I assume the full 8 GB can now be addressed. Why are fewer resources being used now with more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since this 1 GB limit seems artificial. If it used 1.6 GB before, why would it not use at least that much now?
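For reference, this is roughly how AWE gets turned on for SQL Server 2000 so it can address memory above 4 GB; a sketch only, since AWE needs Enterprise Edition and the Lock Pages in Memory right for the service account, and the 6144 figure is just an example ceiling for an 8 GB box:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;
EXEC sp_configure 'max server memory', 6144;   -- example: leave roughly 2 GB for the OS
RECONFIGURE;
-- SQL Server must be restarted before 'awe enabled' takes effect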
I have a report that includes two multi-valued parameters. In the Default Values section, I choose 'from query' and select the dataset and value field. In the Available Values section, I choose 'from query', select the same dataset and value field, and in the label field I select the relevant label field. When I run the report, my multi-valued parameters look like I selected the 'select all' option (all options are selected). How can I keep the multi-valued parameters cleared of selections until the user makes a choice? Thanks in advance.
@RemoteQuery consists of a SELECT with a four-table join; all the tables are on the same linked server.
The Linked server has been set up on MyLocalServer using the "Microsoft OLE DB for SQL Server" provider. In the "Provider Options" for the linked server properties I checked "Non transacted updates" and "dynamic parameters". In the "Server Options" tab I have checked "RPC", "RPC Out", "Data Access".
The EXECUTE part of the query runs great (and returns the data very fast) by itself. But with the INSERT part, the query fails and returns the error:
"Server: Msg 7391, Level 16, State 1, Line 17 The operation could not be performed because the OLE DB provider 'SQLOLEDB' was unable to begin a distributed transaction. [OLE/DB provider returned message: New transaction cannot enlist in the specified transaction coordinator. ] OLE DB error trace [OLE/DB Provider 'SQLOLEDB' ITransactionJoin::JoinTransaction returned 0x8004d00a]."
The two servers are separated by firewalls, so I believe the reason the query is failing is that I haven't followed the procedures for setting up the ports etc. described in one of the Microsoft support articles (e.g. KB 250367).
Configuring the ports involves too much company politics, and besides, for what this query does, it does not need the benefits of a distributed transaction.
How can I execute my query without SQL Server automatically trying to upgrade it to a distributed transaction?
More Info: I can execute the query as a straight INSERT/SELECT linked-server query and it does the INSERT on the local SQL Server just like I want it to, so I assume it is not trying to use distributed transactions; but it takes around 7 seconds to run even though the entire SELECT is executed on the linked server, whereas executing with sp_executesql takes only 1 second.
I thought selecting "non-transacted updates" in the provider would solve this problem, but it did not.
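One other workaround I am looking at is pushing the whole SELECT to the linked server with OPENQUERY, so that only the local INSERT participates in the local transaction; a rough sketch with made-up server, table, and column names:

INSERT INTO dbo.LocalResults (Col1, Col2)
SELECT Col1, Col2
FROM OPENQUERY(MyLinkedServer,
    'SELECT t1.Col1, t2.Col2
     FROM RemoteDb.dbo.Table1 AS t1
     JOIN RemoteDb.dbo.Table2 AS t2 ON t2.KeyCol = t1.KeyCol');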
I'm looking for a way to get the name of the server on which the DTS package lives.
I copy packages between servers. The problem is that every time a package is copied to a different server, I have to change the reference in the connection strings to point to the new server name. I'd like to find an automatic way to interrogate the server name where the package currently lives and dynamically change the connection strings from within an ActiveX task. That would cut maintenance way down.
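For what it's worth, the query side of this is simple; the ActiveX task could run something like the following against its current connection and use the result to rewrite the other connection strings:

SELECT @@SERVERNAME AS CurrentServer;
-- or, similarly, on SQL Server 2000 and later:
SELECT SERVERPROPERTY('MachineName') AS MachineName;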
I installed SQL SP2 and at the same time took Carbon Copy off. Now when the machine comes up, neither SQL Server nor the Agent will start. If I go to Service Manager I can start both just fine. Nothing in either log shows why. Any place to look, or any thoughts?
I have SQL Server running on Windows Advanced Server 2000. For the last couple of days the whole computer has been restarting every hour. The only thing I remember doing was shrinking the database; the db size is 200+ GB.
Hi all, I am wondering if there is a way to have SQL Server reset itself every few hours. I am using an application which heavily utilises SQL Server and the server crashes every now and then. I can reset it manually, but I was wondering if there is a way for it to detect the crash and reset itself automatically. Thanks a million in advance.
1. Prepare a SQL Task component and read a table with it, returning the result in an Object variable (RegionDim). 2. Pass the variable into a Script Task component. In the script, write this:
Public dtRegion As New DataTable
Public daRegion As New OleDb.OleDbDataAdapter
Problem: after the Script Task, RegionDim is in fact cleared, so I can't use it in other places and need to read the table again.
I think the daRegion.Fill(dtRegion, Dts.Variables("RegionDim").Value) call ties RegionDim to dtRegion, and dtRegion is a variable scoped inside the script, which is destroyed after the script finishes. So RegionDim is cleared.
I have a laptop with Vista and SQL 2005. When I start my computer, after it boots up and is ready to go, I look at the Processes tab in Task Manager. Some sort of SQL Server component is running and eating up about 50K's worth of memory. What is it, and how do I set either Vista or SQL Server not to start it unless I do so manually (oh yeah... how would I turn it on?), and what are the ramifications of not having it start up when I start my laptop?
I know it is a rather rambling question, but it's the best I can describe it. Thanks in advance for your help!
I was recently asked in an interview: how can I detect changes automatically in a SQL Server database when anything is updated, deleted, or inserted? If anyone can help me with this, that would be really great. I don't actually know whether I can ask this here, but I wanted to know the answer and thought this might be the right place to ask; sorry if I am wrong. Thanks in advance.
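From what I have read since, the usual answer is a trigger per table (newer versions add features such as Change Data Capture). A rough sketch with a made-up Orders table and audit table:

CREATE TABLE dbo.ChangeLog (
    LogID      INT IDENTITY(1,1) PRIMARY KEY,
    TableName  SYSNAME,
    ChangeType VARCHAR(10),
    ChangedAt  DATETIME DEFAULT GETDATE()
);
GO
CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- inserted/deleted tell us which kind of change fired the trigger
    INSERT INTO dbo.ChangeLog (TableName, ChangeType)
    SELECT 'Orders',
           CASE
               WHEN EXISTS (SELECT 1 FROM inserted) AND EXISTS (SELECT 1 FROM deleted) THEN 'UPDATE'
               WHEN EXISTS (SELECT 1 FROM inserted) THEN 'INSERT'
               ELSE 'DELETE'
           END;
END;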
A section of the intranet site at the company where I just started interning has little company anniversary and birthday sections that look like this (for the anniversary section; the birthday section looks the same, except it doesn't say how old the employee is):
- Steve Cunningham 6/1 - 6 yrs
- Andrew Brown 6/3 - 11 yrs
- Lisa Stone 6/4 - 3 yrs
How can I get it so that instead of manually changing that text every month, it looks at a SQL database and automatically changes the text every month? I'm guessing the pseudocode would be: if the b-day or anniversary month matches the current month, display the first and last name, the date, and the number of years (which would have to be calculated, maybe?). Any help would be GREAT! Thanks!!
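Something along these lines is what I am picturing on the database side (a rough sketch assuming a hypothetical Employees table with FirstName, LastName, and HireDate columns); the page would then just loop over the rows and format them as "Name M/D - N yrs":

SELECT FirstName,
       LastName,
       HireDate,
       DATEDIFF(YEAR, HireDate, GETDATE()) AS YearsOfService   -- rough year count
FROM dbo.Employees
WHERE MONTH(HireDate) = MONTH(GETDATE())
ORDER BY DAY(HireDate);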
I import an MS Excel 2003 spreadsheet into MS SQL Server 2000 through MS SQL Server 2000 Enterprise Manager (right-click on the table to be filled with data > All Tasks > Import Data). My Excel file has 2000 rows and 100 columns of data. All the data is imported into the relevant columns correctly, but the rows get reordered automatically. What I mean is that the first row's data matches my Excel file, but the second row's data has gone to the 7th row, the 7th row's data has gone to the 5th row, and so on. I need the data in the same sequence I have in my Excel file. What is the problem, and how can I solve it? Can I export my MS Excel 2003 file to an MS SQL Server database? Please help me; I don't have much knowledge of MS SQL Server 2000. If your answer involves a query to run, then please mention where I should run that query. Thanks,
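For what it's worth, tables in SQL Server have no built-in row order; rows only come back in a guaranteed sequence when you ask for one with ORDER BY. One common workaround is to give the destination table a sequence column before importing and sort on it afterwards; a sketch with made-up names:

CREATE TABLE dbo.ImportedSheet (
    RowSeq INT IDENTITY(1,1),   -- numbered in load order during the import
    Col1   VARCHAR(100),
    Col2   VARCHAR(100)
);

-- After the import, read the rows back in the original spreadsheet order
SELECT Col1, Col2
FROM dbo.ImportedSheet
ORDER BY RowSeq;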
How can I install SQL Server 2005 automatically with an sa password?
I want to know: I am creating a setup in Visual Studio 2005 and have set SQL Server up as a prerequisite. By default the SQL Server installation is silent; it does not ask anything. In MSDE we had the facility of a setup.ini file which automatically created the database with the settings provided in the ini file. Is there any option like this in SQL Server 2005 Express Edition?
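From the documentation I have found so far, the SQL Server 2005 Express setup takes command-line parameters (and, as I understand it, can also be pointed at a template.ini with /settings), which seems to be the equivalent of the old setup.ini approach. Something along these lines, untested on my side, with the password obviously a placeholder:

setup.exe /qb INSTANCENAME=SQLEXPRESS SECURITYMODE=SQL SAPWD=StrongPassword1 ADDLOCAL=ALL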
I am developing a sample application that automatically sends failure mail to the respective person. I am also storing all the failure mail details in the database in case the mail server is unavailable. Once the mail server becomes available, we send all the stored failure mail to the respective person.
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on the server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------ Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad-core box with 4 GB of RAM (VMware). OS: Windows 2003 Std, SQL Server 2005 Std. The /3GB switch is not set but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume that the grinding to a halt is due to not setting the min/max values correctly or not enabling the Lock Pages in Memory setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 2147483647; even if I don't enable AWE, should this default value be changed? (A rough sketch of how I would check and set it is after this list.)
3. It seems that even at idle the SQL server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in taskmon is at 600 MB or so. By the end of the day it's up around 2 GB. How can I track down what's causing this, and should this even concern me?
4. The Lock Pages in Memory setting worries me. Everything I've read on it seems to warn about serious OS and other program degradation, in some cases to the point where people had to restore the settings on the server before they could bring it back up. What are your thoughts on this?
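On question 2, this is roughly how I would check and change the value (a sketch only; the 3072 figure is just an example for a 4 GB box, leaving room for the OS and SSRS):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Show the current setting (2147483647 means "no cap")
EXEC sp_configure 'max server memory';
-- Example: cap the buffer pool at 3 GB
EXEC sp_configure 'max server memory', 3072;
RECONFIGURE;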
Can anyone direct me to code that would automatically apply transaction logs to a DB on the standby server? We have a process that dumps the transaction log backup from the primary server onto the backup server every hour on the hour, but I need to apply that transaction log as soon as it is on the standby server.
I am sure someone will ask, why not do transactional replication or log shipping? My answer to that is I have yet to learn how to set up replication between servers, and I need to get our backup server up and running in the next few days.
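In the meantime, this is the shape of the statement I expect the standby-side job to run for each new backup file (database name and paths are made up; WITH STANDBY keeps the database readable between restores, WITH NORECOVERY would be the alternative):

RESTORE LOG MyDatabase
FROM DISK = N'E:\LogShip\MyDatabase_tlog.trn'
WITH STANDBY = N'E:\LogShip\MyDatabase_undo.dat';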
This is the weirdest thing I have seen in a long time. I have MS SQL Server running on a server and use Enterprise Manager a lot. Well, the damnedest thing happens when I log onto the server from the console and run Enterprise Manager. If I go into Enterprise Manager, navigate to a database, select a table, right-click, and run the "Open Table" option, the entire Enterprise Manager application mysteriously closes.
This only happens from the server console and through Remotely Anywhere; it does not happen when I log onto the server from Remote Desktop.
Has anyone ever seen this before? Does anyone know a fix for this?