Anyone seen this SSIS error when importing data? I have a 64-bit quad-processor server with 8 GB of RAM and am importing from Oracle 9 using the 32-bit DTExec.exe from the command line.
OnInformation,Myserver,MyDomainSQLAdmin,J001OracleDimExtract,{CEB7F874-7488-4DB2-87B9-28FC26E1EF9F},{1221B6EB-D90A-466E-9444-BA05DBC6AFD8},6/29/2006 10:58:08 AM,6/29/2006 10:58:08 AM,1074036748,0x,The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 2 buffers were considered and 2 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.
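Not a fix, but a hedged knob worth trying: a 32-bit DTExec process can address far less memory than the 8 GB installed, so shrinking each pipeline buffer sometimes avoids the failed swap-out. DefaultBufferSize and DefaultBufferMaxRows are real Data Flow task properties, but the package file and task name below are hypothetical, and the exact /SET quoting may need adjusting for your version:

dtexec /F "J001OracleDimExtract.dtsx" ^
    /SET "\Package\Data Flow Task.Properties[DefaultBufferSize]";1048576 ^
    /SET "\Package\Data Flow Task.Properties[DefaultBufferMaxRows]";2000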
Details: MS SQL 2000, dual Intel 1.2 GHz processors, 1 GB RAM, 2.1 GB database, dynamic memory management, no other apps running on this server.
First question: Since I have dynamic memory management set up, is it usual that the sqlservr.exe process on the server steadily climbs and sits in the 800 to 900 MB range? There is only about 20 MB free. In theory this is how DMM can work, but do people really see it work this way?
Second question: I had users complaining about lockups in the app I have to support that connects to this database. At first I thought it was the heavy memory use, but then I was able to see in Enterprise Manager that there was a process blocking several other processes. EM then locked up and I couldn't get to the details of which process was doing the blocking. After restarting the SQL services things were fine. When I checked the logs there was nothing there about a hung process; the logs seemed very sparse. Why would there not be anything in the logs about it? The logs actually seem very thin on any information. Thanks, T.
I have had a full lock on my SQL Server, and I have a few logs to help find the origin of the lock.
I know the process at the head of the blocking chain is spid 55.
Here is the information I have on this process (two lock rows, reflowed from sp_lock-style output):

Spid  ecid  Ecid  ObjId       IndId  Type  Resource   Mode  Status  TransID  TransUOW
55    5     0     0           0      DB               S     GRANT   0        00000000-0000-0000-0000-000000000000
55    5     0     1784601646  0      PAG   1:1976242  IS    GRANT   16980    00000000-0000-0000-0000-000000000000

And the sp_who2-style details:

lastwaittype  PAGEIOLATCH_SH
CMD           AWAITING COMMAND
Physical io   1059
Login time    2007-07-05 04:29:53.873
net address   DFF06EBF974D
Wait type     0x0046
HostName      .
BlkBy         .
DBName        grpprddb
CPUTime       54331
DiskIO        1059
ProgramName   (blank)
Would someone know a way to identify the origin of process 55?
I have already tried to execute the following query: select * from SYSOBJECTS where id=1784601646
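A hedged follow-up for SQL 2000, in case it helps: DBCC INPUTBUFFER shows the last statement spid 55 submitted, sysprocesses adds the host, program, and login behind it, and the SYSOBJECTS lookup only resolves if it is run inside the database that actually owns object 1784601646:

-- Last statement submitted by the blocking spid.
DBCC INPUTBUFFER (55)

-- Host, program, and login behind the spid (SQL 2000).
SELECT spid, loginame, hostname, program_name, cmd, waitresource
FROM master.dbo.sysprocesses
WHERE spid = 55

-- Must be run in the database that owns the object, or it returns NULL.
SELECT OBJECT_NAME(1784601646)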
A long time ago I posted this: http://blogs.conchango.com/jamiethomson/archive/2005/06/09/1583.aspx explaining all the different buffer types in the pipeline.
I have to admit I'm still not clear on the difference between a private buffer and a flat buffer though.
Are flat buffers a subset of private buffers? If so, and a private buffer is not a flat buffer, then what is it? If not, can a buffer be both a private buffer AND a flat buffer?
Some descriptions from BOL are:
Private buffers: A private buffer is a buffer that a transformation uses for temporary work only.
Flat buffers: Flat buffers are blocks of memory that a component uses to store data.
That sounds like two ways of saying the same thing to me! It certainly doesn't distinguish them anyway!
Just seeking some clarification here. If you could whip up a demo package that explains the difference between the two (with reference to the Performance Counters), that would be great.
Hi, I started two DBCC processes overnight using SQL Scheduler. One is now flagged as ROLLBACK and the other is still running, and they are blocking each other. I've tried to kill them using KILL <spid> without success. How can I get rid of these processes? Should I shut down the server?
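For what it's worth (hedged, since I can't see the server): a spid already marked ROLLBACK cannot be killed a second time; the rollback simply has to finish, and shutting the server down only moves the same rollback into crash recovery at startup. KILL ... WITH STATUSONLY at least reports progress; the spid below is a placeholder:

-- Report how far along the rollback is (spid is hypothetical).
KILL 60 WITH STATUSONLY
-- Typical output: estimated rollback completion percentage and time remaining.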
I am writing a package to process perfmon logs. The issue I have come across is that the perfmon process holds onto the log file, and SSIS fails because it wants exclusive read access. Is there any way to stop SSIS from taking an exclusive read on the file?
I'm going nuts with the SQL Server notification thing. I have gone through this article, which tells how to set up the user: http://www.codeproject.com/KB/database/SqlDependencyPermissions.aspx. The article shows how to create a new user and set it up for SQL Server notifications. But in my case the user already existed in the database, which is a very common scenario. So I did the following (check the SQL script below), but then I get this error: "A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" This is my SQL script:

use [master]
GO

-- Ensuring that Service Broker is enabled
ALTER DATABASE [DatabaseName] SET ENABLE_BROKER
GO

-- Switching to our database
use [DatabaseName]
GO

CREATE SCHEMA schemaname AUTHORIZATION username
GO

ALTER USER username WITH DEFAULT_SCHEMA = schemaname
GO

/*
 * Creating two new roles. We're not going to set the necessary permissions
 * on the user-accounts, but we're going to set them on these two new roles.
 * At the end of this script, we're simply going to make our two users
 * members of these roles.
 */
EXEC sp_addrole 'sql_dependency_subscriber'
EXEC sp_addrole 'sql_dependency_starter'

-- Permissions needed for [sql_dependency_starter]
GRANT CREATE PROCEDURE to [sql_dependency_starter]
GRANT CREATE QUEUE to [sql_dependency_starter]
GRANT CREATE SERVICE to [sql_dependency_starter]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_starter]
GRANT VIEW DEFINITION TO [sql_dependency_starter]

-- Permissions needed for [sql_dependency_subscriber]
GRANT SELECT to [sql_dependency_subscriber]
GRANT SUBSCRIBE QUERY NOTIFICATIONS TO [sql_dependency_subscriber]
GRANT RECEIVE ON QueryNotificationErrorsQueue TO [sql_dependency_subscriber]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_subscriber]

-- Making sure that my users are member of the correct role.
EXEC sp_addrolemember 'sql_dependency_starter', 'username'
EXEC sp_addrolemember 'sql_dependency_subscriber', 'username'
Hello all, I am running into an interesting scenario on my desktop. I'm running developer edition on Windows XP Professional (9.00.3042.00 SP2 Developer Edition). OS is autopatched via corporate policy and I saw some patches go in last week. This machine is also a hand-me-down so I don't have a clean install of the databases on the machine but I am local admin.
So, starting last week after a forced remote reboot (also a policy) I noticed a few of the databases didn't start back up. I chalked it up to the hard shutdown and went along my merry way. Friday, however, I know I shut my machine down nicely, and this morning when I booted up I was in the same state I was in last Wednesday: 7 of the 18 databases on my machine came up with
FCB::Open: Operating system error 32(The process cannot access the file because it is being used by another process.) occurred while creating or opening file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf'. Diagnose and correct the operating system error, and retry the operation. It also logs: FCB::Open failed: Could not open file C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf for file number 1. OS error: 32(The process cannot access the file because it is being used by another process.).
I've caught references to the auto close feature being a possible culprit, no dice as the databases in question are set to False. Recovery mode varies on the databases from Simple to Full. If I cycle the SQL Server service, whatever transient issue it was having with those files is gone. As much as I'd love to disable the virus scanner, network security would not be amused. The data and log files appear to have the same permissions as unaffected database files. Nothing's set to read only or archive as I've caught on other forums as possible gremlins. I have sufficient disk space and the databases are set for unrestricted growth.
Any thoughts on what I could look at? If everything came up in RECOVERY_PENDING it would make more sense to me than the hit-or-miss behavior I'm experiencing now.
Dear list, I'm designing a package that uses Microsoft's preplog.exe to prepare web log files to be imported into SQL Server.
What I'm trying to do is convert this cmd, which works, into an Execute Process Task: D:\SSIS Process\Prepweblog\ProcessLoad>preplog ex.log > out.log - the above DOS cmd works 100%.
However, when I use the Execute Process Task I get this error: [Execute Process Task] Error: In Executing "D:\SSIS Process\Prepweblog\ProcessLoad\preplog.exe" "" at "D:\SSIS Process\Prepweblog\ProcessLoad", The process exit code was "-1" while the expected was "0".
There are two package variables: User::gsPreplogInput = ex.log, User::gsPreplogOutput = out.log
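A hedged guess at the cause: the > redirection is a cmd.exe feature, and the Execute Process Task launches preplog.exe directly with no shell, so nothing performs the redirect and preplog fails. One sketch of a workaround, reusing the paths from the post (the cmd.exe path is assumed):

Executable:        C:\Windows\System32\cmd.exe
Arguments:         /c "preplog.exe ex.log > out.log"
WorkingDirectory:  D:\SSIS Process\Prepweblog\ProcessLoad

Alternatively, drop the redirect entirely and capture preplog's output through the task's StandardOutputVariable property into a package variable.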
How do I use the execute process task? I am trying to unzip the file using the freeware PZUnzip.exe and I tried to place the entire command in a batch file and specified the working directory as the location of the batch file, but the task fails with the error:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC0029151 at Unzip download file, Execute Process Task: In Executing "C:\ETL\POSData\IngramWeekly\Unzip.bat" "" at "C:\ETL\POSData\IngramWeekly", The process exit code was "1" while the expected was "0".
Then I tried to specify the exe directly in the Executable property and the arguments as the location of the zip file and the directory to unzip the files into, but this time it fails with the following message:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC002F304 at Unzip download file, Execute Process Task: An error occurred with the following error message: "%1 is not a valid Win32 application".
The command in the batch file runs perfectly from the command line and unzips the file, so there is absolutely no problem with the command itself. I believe it is just the setup of the values on the Execute Process Task editor under Process. Any input on resolving this will be much appreciated.
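One hedged reading of the two failures: "%1 is not a valid Win32 application" is a Windows loader error, typically raised when the Executable property points at something that is not a Win32 .exe (a 16-bit unzipper, or a .bat or .zip path placed in Executable), while the batch-file attempt probably ran but returned exit code 1. A sketch that keeps everything inside a real shell, with path segmentation assumed from the post's garbled text:

Executable:        C:\Windows\System32\cmd.exe
Arguments:         /c "C:\ETL\POSData\IngramWeekly\Unzip.bat"
WorkingDirectory:  C:\ETL\POSData\IngramWeekly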
I am designing a utility which will keep two similar databases in sync, in other words, copying the new data from db1 to db2 and updating the old data from db1 to db2.
For this I am making use of the 'tablediff' utility, which, when provided with server, database, and table info, will generate a .sql file that can be used to keep the target table in sync with the source table.
I am using the Execute Process Task and the process parameters I am providing are:
The customer.bat file will have the following code: tablediff -sourceserver "LV-SQL5" -sourcedatabase "TC_CTI" -sourcetable "CUSTOMER_1" -destinationserver "LV-SQL2" -destinationdatabase "TC_CTI" -destinationtable "CUSTOMER" -f "c:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1"
The .sql file will be generated at: C:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1.
The problem: the Execute Process Task is working fine, i.e., the tables are being compared correctly and the .SQL file is being generated as desired. But the task as such is reporting failure with the following error:
[Execute Process Task] Error: In Executing "C:\SQL_bat_Files\SQL5\TC_CTI\package_occurrence.bat" "" at "C:\Program Files (x86)\Microsoft SQL Server\90\COM", The process exit code was "2" while the expected was "0".
Some of you may suggest just setting ForceExecutionResult = Success (in fact this is what I am doing now just to get the program working), but this is not what I want.
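One hedged alternative: tablediff documents a non-zero return code when it finds differences, and a batch file hands back the exit code of its last command, so you can either set the task's SuccessValue property to 2 or end the batch with an explicit exit code. A sketch of the second option:

tablediff -sourceserver "LV-SQL5" -sourcedatabase "TC_CTI" -sourcetable "CUSTOMER_1" -destinationserver "LV-SQL2" -destinationdatabase "TC_CTI" -destinationtable "CUSTOMER" -f "c:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1"
rem Swallow tablediff's "differences found" exit code so the task sees 0.
exit /b 0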
I'm pulling data from an Oracle DB and loading it into MS SQL 2008. For my data type checks during the load process, what are the options to ensure the data being processed won't fail? That is, I'd verify first-hand against the target data types, and then if a row is in a valid format load it into the destination table, else mark it with an error flag and push it into an errors table - all at the row level. One way I can think of is to load into a staging table, then get the source and destination column data types, compare them, and proceed.
Or should I just try loading the data directly and troubleshoot failures when they happen (which could be a difficult task, as I wouldn't know what caused the error...)?
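If the staging-table route is taken, here is a minimal sketch of row-level validation on SQL Server 2008, where everything lands as varchar first and rows are routed by validity; all table and column names are hypothetical:

-- Route rows whose text fails the target-type checks to an errors table.
-- Caveat: ISNUMERIC accepts strings like '$5' or '1e4' that can still fail
-- a decimal conversion, so treat it as a first-pass filter only.
INSERT INTO dbo.OrdersErrors (OrderId, OrderDate, Amount, ErrorFlag)
SELECT OrderId, OrderDate, Amount, 'bad type'
FROM dbo.OrdersStaging
WHERE ISDATE(OrderDate) = 0 OR ISNUMERIC(Amount) = 0

-- Load the rows that pass, converting to the destination types.
INSERT INTO dbo.Orders (OrderId, OrderDate, Amount)
SELECT OrderId, CONVERT(datetime, OrderDate), CONVERT(decimal(12,2), Amount)
FROM dbo.OrdersStaging
WHERE ISDATE(OrderDate) = 1 AND ISNUMERIC(Amount) = 1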
Hello, if a row is locked in SQL Express and another user wants to update this row, SQL Express waits until the timeout expires, and then I get the error message "timeout".
What I need is an immediate "row is locked" error message!
That way the user doesn't need to wait when the row is locked, and I get the correct error message, so I can check whether there is a database problem or only a row locked because another user is editing the same rows.
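A minimal sketch of one way to get that behaviour, assuming the update runs in its own session: SET LOCK_TIMEOUT 0 makes any blocked statement fail immediately with error 1222 ("Lock request time out period exceeded") instead of waiting. The table and values are hypothetical:

-- Fail immediately instead of waiting on a locked row.
SET LOCK_TIMEOUT 0

BEGIN TRY
    UPDATE dbo.MyTable          -- hypothetical table
    SET Col1 = 'new value'
    WHERE Id = 42
END TRY
BEGIN CATCH
    IF ERROR_NUMBER() = 1222
        PRINT 'Row is locked by another user.'
    ELSE
        PRINT 'Some other database problem: ' + ERROR_MESSAGE()
END CATCH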
I have a master package, which executes child packages that are located on a SQL Server. The Child packages execute other child packages which are also located on the SQL server.
Everything works fine when I execute in process. But when I set the parameter in the master package's ExecutePackageTask to ExecuteOutOfProcess = True, I get the following errors:
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Row Count" (5349).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Custom Split" (6399).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Data Source" (5100).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "DST_SCR Load Data" (6149).
The child packages all run fine when executed directly, and the master package runs fine if Execute Out of Process is False.
Working with SQL 7 and Visual Basic 5.0: we have experienced a lock situation, and we do not know how it happened nor how to solve it.
When trying to update a record we get the following message:
Run time Error 3197: The Microsoft Jet Database engine stopped the process because you and another user are attempting to change the same data at the same time.
And for sure, ONLY one user is connected at the time. Apparently, the record is marked and can be read but CAN NOT be updated. Fortunately the record CAN be deleted !!!
The only way we found was using the VISDATA.exe that comes with Vbasic, to delete the record and add a new record with the same information.
Since this condition makes any updating program abort, it's a big problem. How did this happen? Is there any way to prevent this? Is there a way to detect this in advance? Is there a way to correct this situation automatically, such as rebuilding or checking the database?
Your advice will be greatly appreciated.
TIA Gerardo Alvarez asaca@asaven.com asaca@telcel.net.ve Welcome to our site at http://www.asaven.com
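A hedged suggestion based on a classic cause of Jet error 3197 against SQL Server tables: Jet detects concurrent edits by comparing every column's old value, and floating-point columns or NULL bit columns can make that comparison fail even with a single user. Adding a timestamp (rowversion) column gives Jet one reliable value to check instead; the names below are placeholders:

-- Give Jet a reliable concurrency-check column (table name hypothetical).
ALTER TABLE MyTable ADD RowVer timestamp

-- Also worth checking: bit columns should have defaults and no NULLs.
UPDATE MyTable SET MyBitCol = 0 WHERE MyBitCol IS NULL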
Hi - I've been trying to install SS2K Hot Fix #818095 for one of my clients and it keeps terminating. The log for the install contains an error message that reads "...dbmslpcn.dll is WRITE LOCKED". I stopped all but a few of the services on the box, but still can't seem to get it installed. Any suggestions?? Thanks in advance for any help on this issue, it's been a royal PITA! Pete
Someone told me they closed their window and lost their query, and wanted to know if there was some way to get it back from a buffer or something. Does anyone know if that is possible? I was not aware of anything like that, but given the way SQL Server caches and buffers things, I thought it might be possible to extract a query from a buffer somewhere...?
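A hedged long shot: if the server hasn't been restarted and the plan hasn't aged out, the text of recently executed ad-hoc statements can still be sitting in the procedure cache, which on SQL 2000 is visible through syscacheobjects. The search string is a placeholder:

-- Search the procedure cache for the lost statement text (SQL 2000).
SELECT cacheobjtype, objtype, sql
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%distinctive text from the lost query%'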
Occasionally on my SQL 2K SP3 Standard servers, the servers lock up with the error: LazyWriter: warning, no free buffers found. After that I have to restart the SQL Server service to get things up and running. How can I prevent this from happening? Thanks, Ray
I am looking for an API to flush all data in memory held by SQL Server to disk. Also, is there a tool for SQL Server like eseutil for Exchange that lets you correct a SQL database?
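Two T-SQL commands cover most of this, though neither is an API in the programmatic sense: CHECKPOINT writes the current database's dirty pages to disk, and DBCC CHECKDB is the closest SQL Server analogue to eseutil for checking (and, with its repair options, correcting) a database. The database name is a placeholder:

-- Flush dirty pages of the current database to disk.
USE MyDatabase
CHECKPOINT

-- Verify (and optionally repair) integrity, eseutil-style.
DBCC CHECKDB ('MyDatabase')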
I have corrected the problem with named pipes versus TCP/IP, but I have found a reference to lrustats stating that page flushes should be less than 100 and free avg less than 10, yet my numbers are extremely high. One of the things I noticed upon assuming this job was that many settings were off a bit, but LogLRU buffers were never set. This setting was introduced with Service Pack 2, with a readme.txt on how to configure it. All my references, and the ones I have skimmed through in bookstores, do not contain this configuration setting.
Microsoft no longer offers SP2 for download, so I can't get the readme.txt. If any of you have this file or service pack lying around, please email it to me so that I can configure it. It is supposed to take care of some bufwait errors. I even downloaded the French version to see if it was there, thinking maybe I could find someone with a French background, but it did not contain the file. It was on last year's TechNet CD, but my company tossed it when they got the new one. Can anyone help?
Hi, we use SQL Server 2005 on a 64-bit Windows 2003 server cluster. The SQL instance stopped responding, leaving applications interrupted. In the SQL error log I noticed the following memory errors just before the cluster issued the stop command to SQLsrvres: the lazywriter gave a warning that no free buffers were found.
I would like to have an SSIS package which loops through each XML file (.xml) in a folder on the network, and then for each file pulls out the data and inserts it into a SQL Server table. Please kindly guide me through this, i.e., what task(s) are required, etc. Thanks
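A hedged outline of the usual pattern, with all names as placeholders: a Foreach Loop Container using the Foreach File enumerator to walk the folder, the current file name mapped into a string variable, and a Data Flow inside it whose XML Source reads its location from that variable:

Foreach Loop Container (Foreach File enumerator)
    Folder:           \\server\share\xmlfiles      (placeholder)
    Files:            *.xml
    Variable Mapping: User::XmlFilePath = index 0
    Data Flow Task
        XML Source (data access mode "XML file from variable" = User::XmlFilePath,
                    with an XSD generated or supplied)
        -> OLE DB Destination (the SQL Server table)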
There is a function called "proactive caching" in Analysis Services. It can provide: automatic synchronization with the relational database, and no more explicit "cube processing".
But I cannot get the latest data into the cube even though I set the proactive caching mode to "real time".
Do I need SSIS to process the cube in this case?
Following are the procedures I have done: 1. test the data 1.1 use the BI Dev Studio to browse the cube, ensure no new data are there 1.2 process the cube and browse the data, ensure new data are there 1.3 delete the new data from the source database and reprocess the cube, ensure no new data are there 1.4 add new data again
2. configure the proactive setting of the cube 2.1 use SQL Server Management Studio to open the cube and open the properties window 2.2 in the "proactive caching" option select "low-latency MOLAP" (and even real-time ROLAP later), then click OK
3. configure the proactive setting of the partition 3.1 open the partition's properties window
3.2 in the "proactive caching" option select "low-latency MOLAP" (and even real-time ROLAP later), then click OK
3.3 in the notification tab, select "SQL Server" and specify the tracking table as the "fact table", which is a view that gets data from the real fact table.
4. wait a period of time...
5. test the data again 5.1 use the BI Dev Studio to browse the cube, but no new data are there (even when I selected real-time ROLAP later). I even tried the Reconnect and Refresh options in the toolbar.
So my questions are: 1. Did I do the right things to achieve the target of "automatic synchronization with the relational database"?
2. Can I monitor the synchronization procedure, for example by monitoring the processing log and viewing the schedule settings and status of the process?
OK, I just want to know if I can use SSIS to open one text file and create a table in SQL Server from the info in it, then open another fixed-length file and insert rows into this table. I want to do this from an application in .NET. For example, I have a file that says "Col1 String 20, Col2 String 20". This will create a table "Table1" in database DB1. Then it will open a text file that has 200 rows for Col1 and Col2, each with a fixed length of 20. The table will be filled with the rows in the second text file. I want to give the user the ability to select the above files, and when he clicks Submit, the table will be filled. Is it possible?
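It should be possible, and for this shape of problem the work can even be sketched as plain T-SQL that the .NET application generates after parsing the spec file; all names and paths below are hypothetical, and a fixed-width BULK INSERT needs a bcp format file describing the two 20-character fields:

-- DDL generated from "Col1 String 20, Col2 String 20" (names hypothetical).
CREATE TABLE DB1.dbo.Table1 (
    Col1 char(20),
    Col2 char(20)
)

-- Load the 200 fixed-length rows; the format file maps the 20-byte fields.
BULK INSERT DB1.dbo.Table1
FROM 'C:\data\rows.txt'
WITH (FORMATFILE = 'C:\data\table1.fmt')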
I'm working with SSIS and I would like to add the Integration Services project to the daily build process, but I need to know how to generate the <name>.SSISDeploymentManifest other than by invoking devenv.exe.
I have a command to decrypt a file that I can run from the command line and it works beautifully. However, when I stuff it into an execute process task, it errors out every time or does nothing.
Here is the command I can run from the command line:
I've pointed the Execute Process Task object to the gpg.exe executable on my system and am stuffing the remainder into the Arguments line. I have also tried changing all the timeout settings and success values. I found I can change the success value to 2 and it will show up as green when complete, but the file doesn't decrypt; it just throws an error on the next piece because the required file is not there.
I will probably end up writing a script to get this to work and use a Script Task, but I really want to know why this will not work.
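For comparison, a hedged sketch of how the pieces usually split across the task for GnuPG. The flags are standard gpg options (--batch stops gpg from prompting on a console, which doesn't exist when SSIS runs the process), but the paths are assumptions, not taken from the post:

Executable:  C:\Program Files\GNU\GnuPG\gpg.exe
Arguments:   --batch --yes --output "D:\files\data.txt" --decrypt "D:\files\data.txt.gpg"

If the command needs a passphrase, gpg will hang or die without a console; --passphrase-fd 0, with the passphrase written to standard input, is one common workaround.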
I have an ETL (SSIS) process in which I load around 150 tables in each run (truncate and insert). I have four packages, each from a different source (each package loads different tables and different numbers of them). These run on a weekly basis, one after the other. Each package takes around 60 to 90 minutes. Now I want to track the progress of the ETL in my front-end application.
We want this in two ways.
First way: I need to show the user what percentage of the ETL process is completed.
Second way: I need to show the number of tables completed, and how many rows have been loaded in the ongoing table (the one currently in process).
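One common pattern, sketched with hypothetical names: each package writes progress rows to an audit table through Execute SQL Tasks (plus a Row Count transformation for in-flight counts), and the front end polls that table. Both displays above then become simple queries:

-- Progress table the front end polls (all names hypothetical).
CREATE TABLE dbo.EtlProgress (
    RunId       int,
    PackageName sysname,
    TableName   sysname,
    RowsLoaded  bigint,
    Status      varchar(20),          -- 'Running' or 'Completed'
    UpdatedAt   datetime DEFAULT GETDATE()
)

-- First way: percent of the ~150 tables completed in this run.
SELECT 100.0 * SUM(CASE WHEN Status = 'Completed' THEN 1 ELSE 0 END) / 150
FROM dbo.EtlProgress WHERE RunId = 1

-- Second way: tables done, plus rows so far in the in-flight table.
SELECT TableName, RowsLoaded, Status
FROM dbo.EtlProgress WHERE RunId = 1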
Is it possible to program part of the data-load process within SSIS? The source is a flat file (.txt and .dat) and the destination a SQL Server 2005 database. Not all fields in the source file map to the destination table, and the load also needs data from other tables in the database.
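Yes; within SSIS this is normally a Flat File Source feeding Lookup transformations, but the enrichment step can be sketched in T-SQL once the file has been staged, which may make the shape clearer. All table and column names here are hypothetical:

-- Join the staged flat-file rows to the other tables that hold the
-- missing values, then insert only the columns the destination needs.
INSERT INTO dbo.Destination (CustomerId, ProductId, Amount)
SELECT c.CustomerId, p.ProductId, s.Amount
FROM dbo.FlatFileStaging AS s
JOIN dbo.Customers AS c ON c.CustomerCode = s.CustomerCode
JOIN dbo.Products  AS p ON p.ProductCode  = s.ProductCode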
First of all, sorry for my English. I have a problem with an SSIS project containing an Execute Process Task. If I start the project in Visual Studio, it works with no problems. But when I start it as a job with the SQL Server Agent, it runs indefinitely.
There is no error message; the Server Agent says the job is working, and I can see the .exe file in the Task Manager, but it is never really executed.
Right now I've built an SSIS package to transform data from an external source into a local database server. I schedule it to be processed on that database server (e.g. Server A). Is there any performance difference if I move the SSIS package to be processed on another server (e.g. Server B)? I'd like to separate the processing because I want to reduce the workload on Server A by moving the SSIS process to Server B. Am I correct?