Ok, I'm somewhat familiar with Exchange, but I'm totally new to SQL.
Here's my problem:
Each night I have a batch file shut down the SQL processes, and I then copy the relevant files to another hard drive for backup purposes. I have 2 databases. The first database file and transaction log go just fine and both files copy. The second database file goes just fine as well, but the transaction log returns "Incorrect parameter". It does this even if I simply try to right-click and copy the file in Windows with the SQL processes down, so I've ruled out it being the syntax of my batch file.
The .LDF for this database is huge (to me at least): it weighs in at 50+ GB and is growing 2-4 GB a week. Is it safe to simply delete this transaction log and start a new one, since I have a good backup of its associated database? I'm not exactly sure what caused the "Incorrect parameter" errors either. I inherited this problem, so my background info is limited.
I am getting the error "The parameter is incorrect" when using a parameter within a CASE statement. Here is my SQL:

SELECT J.Commitment, J.Vendor, A.Name, J.Category, J.Transaction_Type, J.Job,
       CASE WHEN J.Transaction_Type = 'AP cost' AND J.Accounting_Date < @MonthEndDate
            THEN J.Amount END AS InvoicedtoDate,
       C.Date, C.Delivery_Date
FROM JCT_CURRENT__TRANSACTION AS J
INNER JOIN JCM_MASTER__COMMITMENT AS C
    ON J.DBID = C.DBID AND J.Commitment = C.Commitment
LEFT OUTER JOIN APM_MASTER__VENDOR AS A
    ON J.DBID = A.DBID AND J.Vendor = A.Vendor
WHERE (J.Commitment <> ' ') AND (J.Job <> ' ')
ORDER BY J.Job, J.Commitment

If I replace @MonthEndDate with a date value it will run, but as soon as I run the SQL with the parameter it gives me the error "The parameter is incorrect". @MonthEndDate is data type datetime. Thanks for your help.
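The T-SQL itself looks valid, so (as an assumption about the environment) this smells like a graphical query designer rejecting a parameter inside the CASE rather than SQL Server rejecting the query. A quick sanity check is to run the statement outside the designer with the parameter declared as a local variable; the sketch below uses a made-up example date:

-- Sketch: stand in for the designer's parameter with a local variable
DECLARE @MonthEndDate datetime
SET @MonthEndDate = '20070630'   -- example value only

SELECT J.Commitment,
       CASE WHEN J.Transaction_Type = 'AP cost' AND J.Accounting_Date < @MonthEndDate
            THEN J.Amount END AS InvoicedtoDate
FROM JCT_CURRENT__TRANSACTION AS J
WHERE J.Commitment <> ' '

If this runs cleanly, the problem is in how the tool passes the parameter, not in the SQL.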
I'm having a problem with my SQL 7.0 (SP2) server running on NT4 (SP6a). When I use Enterprise Manager on my NT4 workstation (SP6) and try to double-click a local DTS package to go into the design view, I get the following error message:
Error Source: Microsoft Data Transformation Services (DTS) Package Error Description: The parameter is incorrect
If I log on to the actual server and run Enterprise Manager, everything is OK. It is not a permissions problem, as I'm logging in with a domain Admin account in both cases. Any ideas?
I have looked through the forum but not found the solution.
I have an OLE DB source connected to the SCD component. I have tried changing the data access mode between reading the table directly and using a 'select * from table' SQL command in the OLE DB source editor, but neither worked.
The SCD component fails with:
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80070057 Description: "The parameter is incorrect.".
Error: 0xC0047022 at Extract revenue codes, DTS.Pipeline: The ProcessInput method on component "Slowly Changing Dimension" (12964) failed with error code 0xC0202009. The identified component returned an error from the ProcessInput method.
I'm running SQL Server 2000 and am trying to write a DTS package. For some reason I keep getting that error for my Dynamic Properties Task and I'm not sure why. I've tried giving my package a password, as was mentioned in another thread on here, but that didn't work either.
I'm trying to dynamically set a Data Source for an Excel connection, using a query.
Query:
DECLARE @directory VARCHAR(1000)
SET @directory = 'C:'
EXEC [Legacy].[dbo].[spr_GetNewestExcel] @directory
The Stored Procedure:
CREATE PROCEDURE spr_GetNewestExcel
(
    @directory VARCHAR(1000)
)
AS
BEGIN
    DECLARE @cmdParam VARCHAR(8000)

    -- Temp table that saves the output of a DOS command.
    CREATE TABLE #output
    (
        output VARCHAR(300)
    )

    -- Temp table that saves the output of a "dir" DOS command, in parts.
    CREATE TABLE #files
    (
        fileDate VARCHAR(20),
        fileSize VARCHAR(20),
        fileName VARCHAR(100)
    )

    -- Sets the command parameter
    SET @cmdParam = 'dir "' + @directory + '"'

    -- Inserts the output of the above DOS command into a table
    INSERT INTO #output
    EXEC master..xp_cmdshell @cmdParam

    -- Parses the #output table for the files with the correct extension
    -- and breaks it up into Date, Size, and Name.
    INSERT INTO #files
    SELECT SUBSTRING(output, 0, 21) AS fileDate,
           LTRIM(SUBSTRING(output, 21, 18)) AS fileSize,
           SUBSTRING(output, 40, LEN(output))
    FROM #output
    WHERE output LIKE '%.xls'

    -- Displays the newest matching file, prefixed with the directory.
    -- (The path separator appears to have been lost in posting; '\' is assumed here.)
    SELECT TOP 1
           CASE WHEN SUBSTRING(@directory, LEN(@directory), 1) = '\'
                THEN @directory + fileName
                ELSE @directory + '\' + fileName
           END AS fileName
    FROM #files
    ORDER BY fileDate DESC

    DROP TABLE #output
    DROP TABLE #files
END
The stored procedure runs great on its own: it displays the filenames with the .xls extension in the desired directory, which is what I want. The task should then use the first one in the list (if there is more than one) as the data source, but it doesn't.
When I go to the "Add/Edit Assignment" window for it and push "Refresh" to preview it, it gives me "The parameter is incorrect".
Any help would be much appreciated, this has had me stumped for quite some time.
Hi, help with the syntax please. I get the error on the line myDA.Fill(ds, "t1"):

Function GetProductsOnDepartmentPromotionPaging(ByVal departmentId As String)
    Dim myConnection As New _
        SqlConnection(ConfigurationSettings.AppSettings("ConnectionString"))
    Dim myDA As New SqlClient.SqlDataAdapter _
        ("MM_SP_GetProductsOnDepartmentPromotion", myConnection)

    ' Add an input parameter and supply a value for it
    ' (departmentId arrives as a String and is assigned to an Int parameter)
    myDA.SelectCommand.Parameters.Add("@DepartmentID", SqlDbType.Int, 4)
    myDA.SelectCommand.Parameters("@DepartmentID").Value = departmentId

    Dim ds As New DataSet
    Dim pageds As New PagedDataSource

    myDA.Fill(ds, "t1")

    ' Set up paging over the filled table
    pageds.DataSource = ds.Tables("t1").DefaultView
    pageds.AllowPaging = True
    pageds.PageSize = 4

    Dim curpage As Integer
    If Not IsNothing(Request.QueryString("Page")) Then
        curpage = Convert.ToInt32(Request.QueryString("Page"))
    Else
        curpage = 1
    End If

    pageds.CurrentPageIndex = curpage - 1
    lblCurrpage.Text = "Page: " + curpage.ToString()

    ' Previous / next navigation links
    If Not pageds.IsFirstPage Then
        lnkPrev.NavigateUrl = Request.CurrentExecutionFilePath + _
            "?Page=" + CStr(curpage - 1)
    End If
    If Not pageds.IsLastPage Then
        lnkNext.NavigateUrl = Request.CurrentExecutionFilePath + _
            "?Page=" + CStr(curpage + 1)
    End If

    list.DataSource = pageds
    list.DataBind()
End Function
I have some DTS schedules on SQL 7.0 SP1. All the schedules were running properly. When I made some changes in an SQL task and saved it, the schedule started to fail. It gives me an error: "DTSRun Loading Error -2147024809 (80070057) Provider Error (0) Error String: The parameter is incorrect", etc. But all my DTS packages run smoothly without error; only the schedule fails. Please help me.
I created a report in Reporting Services 2005 where I added multi-value parameters. When I run my report and try to select more than one value for a parameter, I get an error: Incorrect syntax near ','.
WHERE (Cono = @Company) AND (DATEPART(month, PaymentDate) = @Month) AND (DATEPART(year, PaymentDate) = @Year)
If I run it in preview mode I enter the parameter data as requested and it runs correctly. When I go back to design mode and run the query using the ! button (the parameter box pops up with the data I entered in preview mode) I get an error message: The parameter is incorrect.
I've tried setting the parameters to every combination of string/integer I can think of.
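For what it's worth, Reporting Services expands a multi-value parameter into a comma-separated list of values before the query runs, so an equality test such as Cono = @Company becomes invalid syntax as soon as two or more values are selected. A minimal sketch of the usual rewrite, reusing the column and parameter names from the WHERE clause above:

-- Sketch: compare the multi-value parameter with IN rather than =
-- (@Company expands to something like 'A','B','C' when several values are chosen)
WHERE Cono IN (@Company)
  AND DATEPART(month, PaymentDate) = @Month
  AND DATEPART(year, PaymentDate) = @Year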
Could someone please help me!!! I am using a multi-value parameter in SQL 2005 reports and am getting the following error message:
An error has occurred during report processing: query execution failed for the dataset, "an expression of non-boolean type specified in a context where a condition is expected, near ','", "Incorrect syntax near the keyword 'else'".
The multi-value parameter works when it isn't used in the if/else clause; I checked the where clause with a single parameter and it works OK. I don't understand what is causing this problem, but I really need to fix it. Here is my query.
if @job_SubRep_ProjNo_param = '0'
Begin
    select Job_Job_No as job,
           Job_Job_Name as Job_title,
           cast(Job_Total_Fee as float) as fee,
           employee_first_name + ' ' + employee_surname as jl_name,
           cast(Job_Percent_Complete as float) as percentcomplete,
           cast(Job_Work_Done as float) as workdone,
           cast(Job_Invoicing as float) as job_Invoicing,
           job_WIP,
           Job_Expenditure as timecost,
           job_project_no,
           Job_Profit_Loss,
           cast(Job_Hours as float) as hours,
           job_expenses
    from job_tbl
    inner join project_tbl on job_project_no = project_no
    inner join employee_tbl on job_jl_empno = employee_no
    where project_pl_empno = @pl_employeeNo or job_jl_empno = @jl_subRep
end
else
    select Job_Job_No as job,
           Job_Job_Name as Job_title,
           cast(Job_Total_Fee as float) as fee,
           employee_first_name + ' ' + employee_surname as jl_name,
           cast(Job_Percent_Complete as float) as percentcomplete,
           cast(Job_Work_Done as float) as workdone,
           cast(Job_Invoicing as float) as job_Invoicing,
           job_WIP,
           Job_Expenditure as timecost,
           job_project_no,
           Job_Profit_Loss,
           cast(Job_Hours as float) as hours,
           job_expenses
    from job_tbl
    inner join project_tbl on job_project_no = project_no
    inner join employee_tbl on job_jl_empno = employee_no
    where job_project_no in (@job_SubRep_ProjNo_param) or job_jl_empno = @jl_subRep
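One possible explanation, offered as an assumption about how Reporting Services handles multi-value parameters: the parameter is expanded into a comma-separated list before the batch is parsed, so the comparison on the IF line becomes invalid syntax as soon as more than one value is selected. A common workaround is to fold both branches into a single SELECT so the expanded list only ever appears inside IN (...). A sketch, with the column list abbreviated to keep it short:

-- Sketch: one statement instead of IF/ELSE
select Job_Job_No as job,
       Job_Job_Name as Job_title,
       cast(Job_Total_Fee as float) as fee
from job_tbl
inner join project_tbl on job_project_no = project_no
inner join employee_tbl on job_jl_empno = employee_no
where ( '0' in (@job_SubRep_ProjNo_param)
        and (project_pl_empno = @pl_employeeNo or job_jl_empno = @jl_subRep) )
   or ( '0' not in (@job_SubRep_ProjNo_param)
        and (job_project_no in (@job_SubRep_ProjNo_param) or job_jl_empno = @jl_subRep) )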
We're currently trying to evaluate SQLJDBC 2005 1.1 June CTP's support for database mirroring automatic failover. Unfortunately we're getting unexpected exceptions for calls that work fine with jTDS, which are blocking our ability to perform these evaluations without making substantial changes to our codebase.
The first issue is with the name used when calling a stored procedure -- SP names that start with "dbo." give us the following error:
com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near '.'. src:{call dbo.xyz(?,?,?,?,?,?,?,?)}
The call will work if we change the SQL statement to {call xyz(...)}. I don't understand why we would need to do this, especially given that the documentation for the driver shows call statements with the "dbo." prefix.
We're also having problems using named parameters with stored procedures (for both in and out parameters). Our code has parameter names of the form "@param", as is standard in T-SQL (and is required when using jTDS). However, this won't work with SQLJDBC -- it only seems to accept parameter names without the leading "@". Why is this so?
Finally, we were able to cause a NullPointerException within the driver due to an incorrectly built Properties object that contained an Integer for loginTimeout instead of a String:

java.lang.NullPointerException
    at java.util.Hashtable.put(Hashtable.java:396)
    at java.util.Properties.setProperty(Properties.java:128)
    at com.microsoft.sqlserver.jdbc.SQLServerDriver.fixupProperties(Unknown Source)
    at com.microsoft.sqlserver.jdbc.SQLServerDriver.mergeURLAndSuppliedProperties(Unknown Source)
    at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(Unknown Source)

While this was due to a bug in our code, I would think that such common errors would be better handled.
I have a partitioned view containing 4 tables (example follows at end)
The query plan generated on a select correctly accesses just one of the tables.
The query plan generated on an update always accesses all four of the tables. I thought that it should only access the partition required to satisfy the update. Can anyone please advise: a) Is this expected behaviour? b) Is the partitioned view incorrectly configured in some way? c) Is there a known bug in this area?
Note that the behaviour is the same with SP1 on SQL2000
I would be very grateful for any advice
Thanks
Stefan Bennett
Example follows
--Create the tables and insert the values
CREATE TABLE Sales_West
(
    Ordernum INT,
    total money,
    region char(5) check (region = 'West'),
    primary key (Ordernum, region)
)
CREATE TABLE Sales_North
(
    Ordernum INT,
    total money,
    region char(5) check (region = 'North'),
    primary key (Ordernum, region)
)
CREATE TABLE Sales_East
(
    Ordernum INT,
    total money,
    region char(5) check (region = 'East'),
    primary key (Ordernum, region)
)
CREATE TABLE Sales_South
(
    Ordernum INT,
    total money,
    region char(5) check (region = 'South'),
    primary key (Ordernum, region)
)
GO

--Create the view that combines all sales tables
CREATE VIEW Sales_National
AS
SELECT * FROM Sales_West
UNION ALL
SELECT * FROM Sales_North
UNION ALL
SELECT * FROM Sales_East
UNION ALL
SELECT * FROM Sales_South
GO
--Look at execution plan for this query
--This correctly only accesses the South partition
SELECT * FROM sales_national WHERE region = 'south'

--Look at execution plan for the update
--This accesses all partitions - Why?
update sales_national set total = 100 where ordernum = 23456;
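As a point of comparison rather than a confirmed diagnosis of this case: a member table of a partitioned view can only be eliminated when the predicate constrains the partitioning column, and the UPDATE above filters only on ordernum. A sketch of an update that gives the optimizer that information, assuming the row's region is known to the caller:

-- Sketch: include the partitioning column so only Sales_South needs to be touched
update sales_national
set total = 100
where ordernum = 23456
  and region = 'South';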
Hi, I have a query written in SQL 2000 which returns an incorrect result. The query uses a left outer join and a view. I read about a related issue in a Microsoft bug report, in this article: http://support.microsoft.com/kb/321541.
However, there's a slight difference from the symptom in the second bullet: instead of an expression, the view returns a fixed string for one of the column values.
Although the issue mentioned in the article seems to be fixed, the latter one still seems to be reproducible even with Service Pack 4. However, this issue doesn't appear in SQL Server 2005.
Here's the query to reproduce this error.
create table t1 (pk1 int not null, primary key (pk1))
create table t2 (pk1 int not null, label1 varchar(10) not null, primary key (pk1))
go
insert into t1 values (1)
insert into t2 values (2, 'XXXXX')
go
create view V as
select pk1, 'ZZZZ' as label1 from t2
go
select A.pk1 as A_pk1, B.pk1 as B_pk1, B.label1 as B_label1
from t1 as A
left outer join V as B on A.pk1 = B.pk1
go
This query is similar to the one mentioned in the article, except that in the SELECT clause of the CREATE VIEW statement I am passing a fixed value for column "label1".
I just want to confirm that this is an issue and no fix is available for this so far.
Hello everyone, I've created a report with a simple dataset that is similar to this:
City       RequestID   Request Amount   ClaimID   Claim Amount
El Monte   791         52,982.00        2157      41,143.75
El Monte   3691        11,838.00        3140      8,231.14
El Monte   3691        11,838.00        3141      990.00
El Monte   3691        11,838.00        3142      2,615.00
So I group by City, RequestID. On the first group I specified the expression to be City; in the header I list the city and in the footer I list the sum of Request Amount. The second group is by RequestID, so in the header I placed RequestID and in the footer I placed Request Amount. I set the request information to hide duplicates, and I even added =Sum(Fields!RequestApprovedGrandTotal.Value, "GroupByRequestID") to set the scope of the sum to the group. But this is what I get:
For RequestID = 3691 the Request Amount comes out as 35,514.00, not 11,838.00. All the claim sums are correct, and they are located on the detail row.
I've read that a workaround is to create multiple datasets, but I honestly believe that something as simple as this should work in Reporting Services 2000. So I've come to the conclusion that I must be doing something wrong. Can someone give me a hand with this? Thanks.
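One way to sidestep the over-counting, sketched below under the assumption that the dataset comes from a single query: aggregate the claim rows in the query itself so the request amount appears only once per request, leaving the report footer nothing to multiply. The table and column names here are made-up stand-ins for whatever the report's dataset actually queries:

-- Sketch only: dbo.RequestClaims and these column names are hypothetical
SELECT City,
       RequestID,
       MAX(RequestAmount) AS RequestAmount,   -- repeated on every claim row, so MAX returns the single value
       SUM(ClaimAmount)   AS ClaimTotal
FROM dbo.RequestClaims
GROUP BY City, RequestID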
HELP! I'm running this SQL and get the following error. If I manually start the DTS package it is fine... also, if I use DTSRun from my PC it works OK too.
I'm having an issue shrinking a transaction log, and am hoping that one of you guys could give me some advice....
Currently, this database (DB1) has approximately 20GB of data, and my transaction log has grown to 22GB.
The database is in simple recovery mode (although it may have been created as full, then changed).
I have a maintenance plan backing up the DB daily, but I noticed today that this has not been working for the last week :'(
There are 3 other databases on this same server, which are of similar sizes in GB but have tiny transaction logs - their backups have of course been working fine.
I backed up DB1 this morning - a whopping 42GB file - then issued a "dbcc shrinkfile", followed by a "backup log ... truncate_only", followed by another "shrinkfile", but the log is still 22GB.
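In case it helps to see the moving parts together, here is a sketch of the checks and the shrink sequence usually tried on SQL Server 2000. The logical log file name 'DB1_Log' and the 2000 MB target are assumptions; sp_helpfile shows the real logical name:

USE DB1
GO
-- Is anything keeping the log from being reused?
DBCC OPENTRAN                       -- shows the oldest active transaction, if any
DBCC SQLPERF(LOGSPACE)              -- shows how full each database's log actually is
GO
-- Find the logical name of the log file, then truncate and shrink
EXEC sp_helpfile
GO
BACKUP LOG DB1 WITH TRUNCATE_ONLY   -- marks the inactive portion of the log as reusable
DBCC SHRINKFILE (DB1_Log, 2000)     -- target size in MB; logical name is an assumption
GO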
My database log file is now 53M. Is there a way to minimize it, delete it, or erase it? This log file is too big for nothing. I tried to shrink it and delete it manually, but that corrupted the database. Can someone help me? Thanks.
We have a Microsoft SQL Server 2000 SP3 instance running the database for Microsoft Navision 3.7.
From time to time we encounter problems with the transaction log, especially when running heavy query procedures from Navision. It's actually set up as follows:
We get the following errors (once every 2-3 months so far):
The log file for database 'ME_Prod' is full. Back up the transaction log for the database to free up some log space..
In between numerous of the above-mentioned messages I have the following: Configuration option 'show advanced options' changed from 1 to 1. Run the RECONFIGURE statement to install..
Could not write a CHECKPOINT record in database ID 9 because the log is out of space.
Automatic checkpointing is disabled in database 'ME_Prod' because the log is out of space. It will continue when the database owner successfully checkpoints the database. Free up some space or extend the database and then run the CHECKPOINT statement.
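Not a diagnosis, but for a database in full recovery the usual way to keep the log from filling is to back the log up on a schedule so the space can be reused, and to watch how full it is getting. A minimal sketch, reusing the ME_Prod name from the messages above; the backup path is made up:

-- How full is each log right now?
DBCC SQLPERF(LOGSPACE)
GO
-- A scheduled log backup frees the inactive portion of ME_Prod's log for reuse
-- (the path below is just an example)
BACKUP LOG ME_Prod TO DISK = 'E:\Backups\ME_Prod_log.trn'
GO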
The size of my transaction log file is out of control. I've backed up the database, and the transaction log usage went from 120 GB to 120 MB. Now I can't reduce the size of the transaction log file itself: it's still 120 GB (with almost all of that held as free space). I get errors when I try to manually reduce the file size. Any tips?
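A common reason a shrink appears to do nothing, offered here as a guess rather than a diagnosis: the active portion of the log sits at the physical end of the file, and DBCC SHRINKFILE can only cut the file back to the last active virtual log file. A sketch of how that is usually checked and worked around; the database name, logical file name, and path are placeholders:

-- Status = 2 rows mark active virtual log files; if the last rows are active,
-- the file cannot shrink past them yet.
DBCC LOGINFO
GO
-- Backing up the log again usually moves the active portion toward the front,
-- after which the shrink can be repeated.
BACKUP LOG MyDB TO DISK = 'E:\Backups\MyDB_log.trn'   -- names and path are placeholders
DBCC SHRINKFILE (MyDB_Log, 500)                       -- target size in MB
GO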
Hi All: I am getting an error when trying to open a recordset in SQL Server 2000. The error states that the transaction log is full. Is there any way I can clear out or empty the transaction log, or get rid of it altogether, as it is not really needed? Any help would be appreciated. Thanks and regards, Ryan
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the begin tran and commit it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the Set XACT . . ., and also tried with Set Implicit_Transactions off.
set XACT_ABORT ON
Begin distributed Tran

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1

COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thx
Hey folks! I have a typical requirement from my client. On submitting an Update (Bulk) button, a huge bulk update operation starts against the database. This can take 2-3 minutes at times. The client wants a cancel button in this case, so that he has a way to cancel the database transaction. Please let me know if there is a way out. Thanks in advance. Regards, Uday.D
Hi All: I am getting an error when trying to open a recordset in SQL Server 2000. The error states that the transaction log is full. Is there any way I can clear out or empty the transaction log, or get rid of it altogether, as it is not really needed? Any help would be appreciated. Thanks and regards, Ryan. The error msg is: "tempdb transaction log is full. B/U transaction log to free up space..."
Is it possible to load both the SQL 7 database and transaction log backups into SQL 2000? I assume it will perform the upgrade during the load. Thanks, James
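Offered as a sketch of what such a restore normally looks like (the database name and backup file paths here are invented): SQL Server 2000 can restore a SQL 7.0 full backup and then roll the 7.0 log backups forward on top of it, upgrading the database as it is brought online.

-- Sketch: restore the SQL 7.0 full backup without recovering...
RESTORE DATABASE MyDb
FROM DISK = 'E:\Backups\MyDb_full.bak'      -- path is a placeholder
WITH NORECOVERY
GO
-- ...apply the 7.0 log backups in sequence...
RESTORE LOG MyDb
FROM DISK = 'E:\Backups\MyDb_log1.trn'      -- path is a placeholder
WITH NORECOVERY
GO
-- ...and recover; the database is upgraded to the 2000 format at this point
RESTORE LOG MyDb
FROM DISK = 'E:\Backups\MyDb_log2.trn'
WITH RECOVERY
GO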
(I may be in the wrong forum.) How do I obtain/find the properties of a table using SQL Query Analyzer (SQL Server 2000)? Specifically, I would like to run a query to find the most recent date of any transaction on a table. I have a script that I use for SQL Server 2005, but it doesn't work in 2000. I don't know 2000, but I'm guessing that the syntax is different?
Here's the 2005 SQL Server script (stolen from 2005 Books Online BTW):
Just curious if anyone out there has had success running a Distributed Transaction on Windows 2000 Professional SP4 lately?
I have come to the conclusion that it is virtually impossible. I am running MSDE on a Win2k Pro machine and trying to run a distributed transaction via a linked server, and am getting the
"The operation could not be performed because the OLE DB provider 'SQLOLEDB' was unable to begin a distributed transaction."
error. Yes, I have been scouring the web looking for answers and have implemented pretty much everything I have run across... and yes, the MSDTC service is started on both machines... BTW: if I run the procedure on XP it works great!!!
I am wondering if anyone even uses Win2k Pro anymore, let alone tries to run a distributed transaction on the darn thing.
If anyone out there has it running let me know I would love to chat with you for a minute or two.
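For reference, this is roughly the shape of the call being attempted, sketched with a made-up linked server, database, and table name; data-modifying statements against a linked server inside a distributed transaction also generally need SET XACT_ABORT ON:

-- Sketch: REMOTESRV and RemoteDb.dbo.Orders are placeholders, not names from the post
SET XACT_ABORT ON
BEGIN DISTRIBUTED TRANSACTION

UPDATE REMOTESRV.RemoteDb.dbo.Orders
SET Processed = 1
WHERE Processed = 0

COMMIT TRANSACTION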
I know I can use the statement SET IMPLICIT_TRANSACTIONS ON in a stored procedure to force SQL Server to put the connection into implicit transaction mode.
Is there a statement or configuration setting to force all SQL Server connections into implicit transaction mode?
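One server-wide candidate, sketched below with the caveat that clients and drivers can still override their own session settings: the 'user options' configuration value is a bit mask of default SET options for new connections, and the bit with value 2 corresponds to IMPLICIT_TRANSACTIONS.

-- Sketch: make IMPLICIT_TRANSACTIONS part of the server-wide default user options.
-- If other user-option bits are already in use, add 2 to the existing mask
-- rather than overwriting it with 2.
EXEC sp_configure 'user options', 2
RECONFIGURE
GO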
I have been struggling with a problem for the last couple of days now regarding MSDTC and distributed transactions. The main issue is that 3 servers are each sitting in a different domain, and none of the domains trust each other.
As of yesterday, I could get all the SQL 2005 instances chatting to each other using MSDTC without a problem. However, I have a SQL 2000 box which refuses to work. A normal query across a linked server works fine; a distributed transaction will not. This is the case from A to B and from B to A.
I have tested with DTCPing and it says all is ok.
The error messages are:
SQL 2000 > SQL 2005
OLE DB provider "SQLNCLI" for linked server "T-ServerTServer" returned message "No transaction is active.".
Msg 7391, Level 16, State 2, Line 2
The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "T-ServerTServer" was unable to begin a distributed transaction.
And the other way round:
[OLE/DB provider returned message: New transaction cannot enlist in the specified transaction coordinator. ]
OLE DB error trace [OLE/DB Provider 'SQLOLEDB' ITransactionJoin::JoinTransaction returned 0x8004d00a].
Msg 7391, Level 16, State 1, Line 2
The operation could not be performed because the OLE DB provider 'SQLOLEDB' was unable to begin a distributed transaction.
If watching the DTC monitor, it shows an active transaction then goes back to 0.
All instances are on a 4 node cluster with 3 nodes turned off for the testing.
I have run out of things to try. Most of the symptoms for the above messages point to DTC security, such as the Turn RPC Security Off registry hack. All instances and the OS (W2K3) have the latest updates/patches.
As for the 3rd server, SQL 2005, it is happy to talk
Anybody got a gem which I can try?
Edit: Both SQL 2005 servers are x64 while the 2000 is x86