I need a way to automatically submit a SQL query in SQL Server 2000 - the query joins several tables and exports the results to a file. Right now I've been doing it interactively in Query Analyzer and using the export-to option... I can't use DTS (the query is way too complicated!) and the file created is >255 lrecl, so that seems to be a big problem too!
Any suggestions? Save me from this same predicament on New Year's Day! Thanks and Happy Holidays! :(
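In case it helps clarify what I'm after, the sort of thing I'm picturing is a scheduled job step that shells out to bcp with a queryout, something like the sketch below (the server name, output path, and query are just placeholders for my real ones):

-- rough sketch only: a job step shelling out to bcp via xp_cmdshell
-- MYSERVER, the path and the query are placeholders
EXEC master.dbo.xp_cmdshell
    'bcp "SELECT a.col1, b.col2 FROM mydb.dbo.tableA a JOIN mydb.dbo.tableB b ON a.id = b.id" queryout "C:\exports\nightly.txt" -c -t"," -S MYSERVER -T'

Since bcp writes the rows as plain character data, I'd hope the >255 record length wouldn't matter there.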
I was asked this question by someone and I didn't know the answer (hence I'm here and hoping someone else does!). They have an SQL database which is accessed via a web front end (not sure of the language) to return query results. Sometimes the queries take a long time to return, whilst at other times the same query is very much faster. They have heard that it is possible to set up a query in such a way that if it takes more than a certain time to run it will generate an alert, which could then be scheduled as a job to aid diagnostics. Does anyone know of a way to do this - measure the time taken to run a query and generate an alert if it takes longer than a given time?
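If it makes the question clearer, this is roughly what they described to me - time the query yourself and raise a logged error that a SQL Agent alert could be defined against (the 30-second threshold and the message text are only examples):

DECLARE @start datetime
SET @start = GETDATE()

-- the query being monitored runs here
-- SELECT ...

IF DATEDIFF(second, @start, GETDATE()) > 30
    RAISERROR ('Monitored query exceeded 30 seconds', 10, 1) WITH LOG
-- WITH LOG writes the message to the SQL Server error log and application log,
-- which is where SQL Agent alerts pick it up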
I'm using SSMS 2012. I have multiple queries that are run daily (and a few that are run every Monday) that I want to schedule to run at midnight and export to a new xls file on the network drive, to be used the next morning. Is there something I can add to each query to accomplish this, or would I have to set something else up to call the queries?
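To make it concrete, what I'm imagining is one SQL Agent job step per query, using a CmdExec step that runs sqlcmd and writes a comma-separated file (not a true .xls, but Excel opens it); every name and path below is a placeholder:

-- sketch only: assumes a job called 'Nightly exports' has already been created
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Nightly exports',
    @step_name = N'Export DailyReport1',
    @subsystem = N'CmdExec',
    @command   = N'sqlcmd -S MYSERVER -d MyDatabase -E -s"," -W -Q "SET NOCOUNT ON; EXEC dbo.DailyReportQuery1" -o "\\fileserver\reports\DailyReport1.csv"'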
I can easily query multiple servers using the multi-server query function in Central Management Server and write some of the results to logging tables. I would like to be able to do this via a scheduled job. So far I am finding that even with Master/Target Servers set up this may not work, and the only workarounds are SSIS, SQLCMD (basically hard-coding the server name), and possibly PowerShell.
Can anyone tell me if they have been successful just using standard jobs and querying against multiple servers?
If I can't save the results to a 'central' database/table (I can do this when in SSMS) but can still query against multiple servers, I was thinking I could write the results to a CSV file that an SSIS job picks up.
I have attempted using SSIS to iterate through servers and have been plagued with intermittent connection issues when using a For Loop container.
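The fallback I keep coming back to, in case it helps explain what I'm trying to do, is a plain T-SQL job step that loops over the servers with dynamic SQL through linked servers and inserts into the central table; the list table, target table and query below are all made up:

-- sketch only: assumes a linked server is already defined for each target server
DECLARE @srv sysname, @sql nvarchar(4000)

DECLARE srv_cursor CURSOR FOR
    SELECT ServerName FROM CentralDB.dbo.MonitoredServers   -- hypothetical list of servers

OPEN srv_cursor
FETCH NEXT FROM srv_cursor INTO @srv
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'INSERT INTO CentralDB.dbo.DbInventory (ServerName, DbName) ' +
               'SELECT ''' + @srv + ''', name FROM [' + @srv + '].master.dbo.sysdatabases'
    EXEC (@sql)
    FETCH NEXT FROM srv_cursor INTO @srv
END
CLOSE srv_cursor
DEALLOCATE srv_cursor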
Hi! I scheduled a DTS import from MySQL. Whenever I run it manually (right-click on the DTS package) it runs through without any problems, but firing it from a schedule doesn't work. Just to exclude any issues regarding users/roles, I created a DTS package to export files to an Excel sheet on my desktop; both the manual and the scheduled export work fine. My Application Log shows me the following error message:

Event Type: Warning
Event Source: SQLSERVERAGENT
Event Category: Job Engine
Event ID: 208
Date: 6/8/2005
Time: 10:05:02 AM
User: N/A
Computer: *****
Description: SQL Server Scheduled Job 'ImportFromMySQL' (0xC89612CE034F6642BD585B048DBC0F06) - Status: Failed - Invoked on: 2005-06-08 10:05:02 - Message: The job failed. The Job was invoked by Schedule 22 (ImportFromMySQL). The last step to run was step 1 (ImportFromMySQL).

Anybody know what's wrong?
I have designed a DTS package and it can be run successfully from Enterprise Manager. However, when I schedule the DTS package to run as a job, it fails with the error message "Error string: The system cannot find the file specified."
Anyone any idea as to why the job cannot find the DTS package?
Could it be something to do with the SQL Server 7 and 2000 tools? The 2000 client tools have recently been installed onto my PC, and the package and job ran fine when I had designed everything in SQL Server 7.
Hi, I am trying to do an automatic backup of my database and for some reason it does not happen. I have it set to back up daily at 4:00 pm. Please let me know if you know why it is not backing up on its own. The Server Manager is always on, and a manual backup is not a problem for me, only the automatic one. Thanks very much.
I have a number of DTS packages which, when run manually, complete successfully; however, when run as scheduled tasks they always fail. Can anyone offer any advice?
I'm thinking of using the SQL Agent job scheduler as part of a larger application, and I'm wondering if anyone knows of a limit on how many schedules or jobs can exist on a SQL Server at one time.
Hello, I have many DTS packages scheduled as jobs. The job always fails when executed, although it runs fine if I execute the DTS package directly. The following is the error message:

Error: -2147217887 (80040E21); Provider Error: 0 (0)
Error string: Errors occurred
Error source: Microsoft OLE DB Provider for SQL Server
Help file:
Help context: 0
DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_1
DTSRun: Package execution complete.
Process Exit Code 1. The step failed.

Any help would be welcome. Thank you.
When I create a DTS package to import data from Visual FoxPro it works if I run it immediately, but when I schedule it to run at a specific time it fails. Any ideas why?
I am trying to figure out a way, looking at the tables in msdb (sysjobhistory), to tell how long a scheduled job that is running but not yet completed has been running. I expected to look for all the jobs with run_status = 4 (in progress), but what I figured out is that no history is written until the job is finished or cancelled. Can anyone help me with this?
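In case the goal is unclear, this is the sort of result I'm after; on SQL Server 2005 and later msdb.dbo.sysjobactivity does record in-progress executions, though I don't know whether something equivalent exists further back:

SELECT  j.name,
        a.start_execution_date,
        DATEDIFF(minute, a.start_execution_date, GETDATE()) AS minutes_running
FROM    msdb.dbo.sysjobactivity a
        JOIN msdb.dbo.sysjobs j ON j.job_id = a.job_id
WHERE   a.start_execution_date IS NOT NULL
  AND   a.stop_execution_date IS NULL
-- strictly you would also filter on the latest session_id from msdb.dbo.syssessions,
-- otherwise leftovers from an agent session that crashed can show up here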
Hi, we run a nightly job process that depends on the data entered from the front end. Since yesterday we have been entering a lot of data, so the job that started last night at 10pm (09/05/00) is still running now, and its next scheduled time is 10pm today (09/06/00). If the same job is still running at 10pm today, will it start again as per the schedule, or will it not run because the same job has been running since yesterday?
I have a scheduled job that runs once a day at 10 pm. My problem is that if the job fails, I want it to run again 10 minutes later, and keep retrying until it completes successfully.
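The closest built-in thing I've found so far is the retry settings on the job step itself, along these lines (the job name is a placeholder, and the attempts count is finite rather than truly "until it succeeds"):

-- retry step 1 up to 6 times, waiting 10 minutes between attempts
EXEC msdb.dbo.sp_update_jobstep
    @job_name       = N'MyNightlyJob',
    @step_id        = 1,
    @retry_attempts = 6,
    @retry_interval = 10    -- minutes between retries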
In SQL 6.5, when a scheduled job failed, you could see the error message in the history. In SQL 7.0, it simply tells you which was the last step to run. Is there a place which will report the actual error message generated by the task?
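For what it's worth, the per-step text does appear to be kept in msdb..sysjobhistory; something along these lines pulls it out (run_status = 0 meaning a failed step):

SELECT  j.name       AS job_name,
        h.step_id,
        h.step_name,
        h.run_date,
        h.run_time,
        h.message                     -- the actual error text for the step
FROM    msdb.dbo.sysjobhistory h
        JOIN msdb.dbo.sysjobs j ON j.job_id = h.job_id
WHERE   h.run_status = 0
ORDER BY h.run_date DESC, h.run_time DESC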
I have created a DTS package on a development server that connects to our Exchange server, downloads customer service e-mails and inserts them into a table. This is done using a VB script.
When I right-click on the job and execute it, it runs fine and we can see the mails show up in the table.
However, when I schedule the job to run at 15-minute intervals it fails with a VB runtime error. When I copy the job to my personal machine and schedule it there, it runs just fine and again we see the mails in the table.
The server has the latest version of the scrrun.dll as well as IE 5.0.
I have messed around with this for 3 days now and have gotten MSFT involved as well.
Anyone seen this before? Any ideas or help will be greatly appreciated.
I am running a scheduled DTS package which transfers all the rows in a production table to another server every 30 minutes; each time it truncates the table on the second server before the transfer. It has been running fine for several days. Will there be any problem with this kind of backup strategy? Do I have to clear any history logs frequently? Could any other problem happen? Can anyone suggest any precautions, as there will not be any downtime allowed and replication is also not acceptable to the client. Thanks.
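On the history-log part of the question, the only housekeeping I'm aware of is that msdb accumulates job history rows, which can be trimmed with something like this (the job name is a placeholder), unless the agent's own history limits already take care of it:

-- clear accumulated history for the transfer job (placeholder name)
EXEC msdb.dbo.sp_purge_jobhistory @job_name = N'ProductionTableTransfer'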
Why is it that I can run my DTS package locally, however when I try to run it as a job it always fails? I do realize that the sqladmin account is used to run the job, and it has all the permissions needed.
Any suggestions?
Also, if I kick the job off from my local system, it states it cannot find the batch file that I am trying to run. I'm running it on the server, however it treats it like I'm running it locally.
I have just taken away a tremendous amount of rights from our developers, but would like some of the developers to still have rights to manage the scheduled jobs. Short of making them a system administrator, I cannot seem to find a predefined role that will do this. Is there one? And if not, which system stored procedures or extended procedures would you need to grant them rights to, so they can view and run the scheduled jobs in Enterprise Manager?
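The nearest thing I've tried so far is granting execute on the msdb job procedures directly, roughly as below (DevUser is a placeholder user), though as far as I can tell those procedures still restrict non-sysadmins to jobs they own:

USE msdb
GO
-- DevUser is a placeholder; these procedures still enforce job ownership internally
GRANT EXECUTE ON dbo.sp_help_job        TO DevUser
GRANT EXECUTE ON dbo.sp_help_jobhistory TO DevUser
GRANT EXECUTE ON dbo.sp_start_job       TO DevUser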
If I have two scheduled tasks set for the same time (perhaps accidentally), will the SQL Executive start one and queue the other until the first is complete, and then run the second task? Or will they both be started simultaneously?
Hiya, got a little question for future reference. I have a cursor that does a grant for roles, users, etc. for all tables in a DB. I can run it manually in Query Analyzer and it does fine. The issue is that if I try to create it as a SQL Agent job and just paste the script into the command box, it returns the error:
Line 1: Incorrect syntax near 'xxx yyy'. [SQLSTATE 42000] (Error 170)
Associated statement is not prepared [SQLSTATE HY007] (Error 0)
Line 1: Incorrect syntax near 'xxx yyy'. [SQLSTATE 42000] (Error 170). The step failed.
The Cursor is written as follows:
declare @cStatement varchar(255)
declare G_cursor CURSOR for
    select 'grant select on[' + convert(varchar(64),name) + ']to "xxx yyy"'
    from sysobjects
    where (type = 'U' or type = 'V') and uid = 1
set nocount on
OPEN G_cursor
FETCH NEXT FROM G_cursor INTO @cStatement
WHILE (@@FETCH_STATUS <> -1)
begin
    EXEC (@cStatement)
    FETCH NEXT FROM G_cursor INTO @cStatement
end
DEALLOCATE G_cursor
I figure this has something to do with the quoted_identifier option or something simple like that, but I can't put my finger on it... In Query Analyzer this will error out if I don't have double quotes around the DB role "xxx yyy" because of the space in the role name.
I corrected the error by recreating the role name without a space, but I have some other places I'd like to be able to use this where I won't have the luxury of recreating the role if it has a space in the name.
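In case it's useful, the variation I'd like to use where I can't rename the role builds the bracketed name with QUOTENAME instead of relying on double quotes, so the job step no longer depends on the QUOTED_IDENTIFIER setting; the rest of the OPEN/FETCH/EXEC loop stays exactly as before:

declare @cStatement varchar(255)
declare G_cursor CURSOR for
    select 'grant select on ' + QUOTENAME(name) + ' to ' + QUOTENAME('xxx yyy')
    from sysobjects
    where (type = 'U' or type = 'V') and uid = 1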
I have the code below, which works to rebuild indexes on a large DB. I can run it from QA, but not as a job or as a stored procedure called from a job; either way I get the same error:
Executed as user: ADCsqlexec.
Retrieving Table List for DB Development [SQLSTATE 01000] (Message 0)
ReIndexing Table Development..cms_appointments [SQLSTATE 01000] (Message 0)
DBCC execution completed. If DBCC printed error messages, contact your system administrator. [SQLSTATE 01000] (Message 2528)
Updating Statistics on Table Development..cms_appointments [SQLSTATE 01000] (Message 0)
Could not complete cursor operation because the set options have changed since the cursor was declared. [SQLSTATE 42000] (Error 16958). The step failed.
/* Create DB List */
DECLARE DBCursor CURSOR FOR
    SELECT name FROM master..sysdatabases WHERE name = 'development'

OPEN DBCursor
FETCH NEXT FROM DBCursor INTO @DBName

/* Create Database Loop */
WHILE @@FETCH_STATUS = 0
BEGIN
    /* Retrieve Table List */
    PRINT 'Retrieving Table List for DB ' + @DBName
    EXEC ('SELECT name AS TableName INTO ##TableNames FROM [' + @DBName + ']..sysobjects WHERE type = ''U''')

    /* Open Table List */
    DECLARE TableCursor CURSOR FOR SELECT TableName FROM ##TableNames
    OPEN TableCursor
    FETCH NEXT FROM TableCursor INTO @TableName

    /* Create Table Loop */
    WHILE @@FETCH_STATUS = 0
    BEGIN
        /* Add DB Name to Table Name */
        SELECT @FQTableName = QUOTENAME(RTRIM(@DBName)) + '..' + QUOTENAME(RTRIM(@TableName))
        SELECT @TableName = RTRIM(@DBName) + '..' + RTRIM(@TableName)

        /* fix from ms */
        SET ARITHABORT ON
        SET QUOTED_IDENTIFIER ON
        -- DBCC CHECKTABLE(mytable)
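What I'm experimenting with, since error 16958 complains about the options changing after the cursor exists, is setting the options once at the very top of the step, before any cursor is declared, and dropping the mid-loop SET statements - roughly:

-- set the options once, up front, before declaring any cursor
SET ARITHABORT ON
SET QUOTED_IDENTIFIER ON
SET NOCOUNT ON

DECLARE @DBName varchar(128), @TableName varchar(300), @FQTableName varchar(300)

DECLARE DBCursor CURSOR FOR
    SELECT name FROM master..sysdatabases WHERE name = 'development'
-- ...rest of the loops unchanged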
I have been running the following production job successfully for a long time. It now fails, and the Task History Last Error Message displays 'No Message'. The log file (C:\MSSQL\LOG\Maint_TombV50.txt) shows it ran successfully, with a Return Code 0.
I'm new to SQL so this should be an easy question. All I want to do is create a recurring task: I want to copy data from one SQL Server to another. I've created a package to accomplish this, and when I execute it manually it works. What doesn't work is scheduling it to execute at regular intervals. As a test I set it up to execute every minute of every day (this will change to once a day when I prove it works), and for some reason it never executes. Any ideas? By the way, I created the package using the DTS Export Wizard.
To all: if I have a scheduled task that is owned by 'sa', how can I assign permissions to allow another user, even the database dbo, to register the SQL Server and view the scheduled tasks?
I find some scheduled jobs are switching from an 'enabled' state to a 'disabled' state, apparently for no good reason. The job itself still shows as 'enabled'; however, the associated schedule becomes 'disabled'. Does anyone know why this would be? Is the problem associated with a particular service pack or anything?
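To make it concrete, this is the kind of check I've been running to spot them (this is against the SQL Server 2000 schema, where the schedule definition and its enabled flag live in msdb.dbo.sysjobschedules; later versions moved schedules elsewhere):

SELECT  j.name    AS job_name,
        j.enabled AS job_enabled,
        s.name    AS schedule_name,
        s.enabled AS schedule_enabled
FROM    msdb.dbo.sysjobs j
        JOIN msdb.dbo.sysjobschedules s ON s.job_id = j.job_id
WHERE   s.enabled = 0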
I created a scheduled task on SQL Server 6.5 which actually dumps the system DBs.
The problem I have is that the scheduled task does not run, and no error message is returned. I have tried to force it to run in different schedule modes; nothing happened. However, I can dump the system DBs through SEM Backup/Restore, which runs OK!
Does anybody have any idea why the scheduled task does not run?
I am trying to set up a DTS package to transfer logging data from one server to another. A record may already exist at the destination, causing a primary key violation; I do not want this error to cause the entire DTS package to fail.
When I execute the DTS package I created, by right-clicking and selecting "Execute Package", it shows me 2 errors. Although there are 2 errors, the rows that do not have a primary key violation are successfully transferred to the destination database. Here are the 2 errors I see:
Error 1: Error at Destination for Row number 97. Errors encountered so far in this task: 97. The statement has been terminated. Violation of PRIMARY KEY constraint 'PK_event'. Cannot insert duplicate key object 'event'.
Error 2: Error at Destination for Row number 198. Errors encountered so far in this task: 198. The statement has been terminated. Violation of PRIMARY KEY constraint 'PK_eventDetail'. Cannot insert duplicate key object 'eventDetail'.
These errors make sense, there were 97 duplicate lines in the event table and 198 duplicates in the eventDetail table.
This is the behavior I want. New rows are copied to the destination database.
When I schedule the DTS as a job in Enterprise Manager things change. When the DTS is executed as a job (as opposed to me right-clicking and selecting "Execute Package"), the job reports a failure and none of the new rows are transferred to the destination database.
Why does the DTS transfer the rows that do not violate the Primary Key constraint when I manually execute it and not when it is executed as a job?
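In case a comparison helps, the behaviour I want is what a plain T-SQL step with a NOT EXISTS filter gives; the sketch below assumes a linked server to the source and a single-column key called event_id, both of which are simplifications of my real tables:

-- copy only the rows that don't already exist at the destination (placeholder names throughout)
INSERT INTO dbo.event (event_id, event_date, event_text)
SELECT  s.event_id, s.event_date, s.event_text
FROM    [SOURCESERVER].LogDb.dbo.event AS s
WHERE   NOT EXISTS (SELECT 1 FROM dbo.event AS d WHERE d.event_id = s.event_id)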
I've scheduled a job to run on a certain schedule, but the Last Run Status date comes back very oddly, a couple of years out of synch; the other scheduled jobs report back just fine.
Anyone seen this behavior?
Edward R Hunter, Data Application Designer comScore Networks, Inc.