Recovery :: Backup File Works Fine From Network But Not When Moved Into Sandbox Environment

Dec 3, 2014

We have a valid full backup of a database. We know it is valid because we have restored it twice from the network with no problems, but we do not have access to that network location from our sandbox environment.

The .bak file is sizable, about 9GB, and resides on our internal network on a SAN drive. No problems there. When we copy the file from there into the sandbox environment to attempt the restore, it gets corrupted. We've tried three times and the file was corrupted each time. The first time we copied it over, the restore got up to 50% and failed; the second time we copied the file again and attempted the restore, it failed at 70%.

The third time it failed at 60%. The error message we get during the restore is "...Read on ... failed: 13(The data is invalid) Msg 3013, Level 16, State 1, Line 1 Restore database is terminating abnormally."

Now some background here. To move the file, our network team does this: they have a .vmdk file that they mount in our production environment (which has access to the network location where the .bak file is), copy the file onto that drive, move the .vmdk file into the sandbox environment (which does not have access to the network location), mount the drive in the sandbox environment, and then I have access to the .bak file from within the sandbox environment.

Something in the process of using the .vmdk file to move the .bak file from production into the sandbox is causing the file to get corrupted.
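Before blaming the .vmdk step I want to rule out the backup and each copy hop separately, so my rough plan (just a sketch, the paths here are examples) is to take the backup with checksums and then verify the copied file in the sandbox before attempting the restore:

-- take the backup with page checksums so later corruption is detectable
BACKUP DATABASE MyDatabase
    TO DISK = N'\\san\backups\MyDatabase.bak'   -- hypothetical network path
    WITH CHECKSUM, INIT;

-- after the .vmdk hop, verify the copy in the sandbox before restoring
RESTORE VERIFYONLY
    FROM DISK = N'D:\Sandbox\MyDatabase.bak'    -- hypothetical sandbox path
    WITH CHECKSUM;

If VERIFYONLY already fails in the sandbox but passes on the SAN copy, the corruption is happening in the transfer rather than in the backup itself.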

View 13 Replies



Moved Asp.net App To Network Share, Can't Access SQL

Jan 19, 2007

I've recently moved an asp.net website from my PC to a network share because another tech is going to be working on it.  I finally got the correct permissions on the network share and the correct .NET Framework settings on my PC to be able to run the app.  Now I can't access the SQL Server, which is on a different server. I'm getting the following error:
Request for the permission of type 'System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed.
How do I set up access to my SQL Server for the app from any given PC on my LAN?
 

View 1 Replies View Related

Sql Job Fails But When Run Outside Works Fine

Mar 25, 2008

Hi..

I am stuck at a very awkward place. I have created a package which uses an Oracle view as its source for data transfer. The problem is that when I run the package through dtexec it works fine, but when I try to schedule it I get the following error:


Error: 2008-03-24 13:52:40.22
Code: 0xC0202009
Source: pk_BMR_FEED_oracle Connection manager "Conn_BMR"
Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80004005 Description: "Oracle client and networking components were not found. These components are supplied by Oracle Corporation and are part of the Oracle Version 7.3.3 or later client software installation.

Provider is unable to function until these components are installed.".

I am able to run the package outside the SQL job and can also connect to Oracle.
I have the Oracle 9i client installed on the server, and SQL Server is 2005.

Any help would really be appreciated..

View 4 Replies View Related

Job Doesn't Work But Package Works Fine

Jun 26, 2007

hi,

I have many jobs on sql 05 and all work but one. This one writes to an Access DB on the same server as SQL. The package works fine. But when executed in the context of the SQL Agent job, it fails.

Jobs that write to a text file work fine. The Access DB has no password required. By the way, that job worked fine in SQL 2000.
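My guess is that under the Agent job the package runs as the SQL Agent service account, which may not be able to open the Access .mdb. What I'm thinking of trying (a rough sketch; the account and names are made up) is a credential/proxy so the SSIS job step runs as an account that can reach the file:

-- a Windows account that can open the .mdb (name and password are hypothetical)
CREATE CREDENTIAL AccessFileCred WITH IDENTITY = N'DOMAIN\svc_access', SECRET = N'StrongPassword!';

-- expose it as a proxy and allow SSIS job steps to use it
EXEC msdb.dbo.sp_add_proxy @proxy_name = N'AccessFileProxy', @credential_name = N'AccessFileCred';
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'AccessFileProxy', @subsystem_name = N'SSIS';
-- then set "Run as" on the job step to AccessFileProxy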

Any ideas?

View 4 Replies View Related

32 Bit DTExec Fails While 64 Bit Works Fine On 64 Bit Machine

Mar 18, 2008

Hi,
I am executing a SSIS package using dtexec. The 64-bit version of dtexec works fine, but when I use the 32-bit version it fails. I have local admin rights. The error description follows. Please help.


Microsoft (R) SQL Server Execute Package Utility
Version 9.00.1399.06 for 32-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.

Started: 9:24:30 AM
Error: 2008-03-18 09:24:32.54
Code: 0xC0202009
Source: IMALCRM Connection manager "IMAL SRC"
Description: An OLE DB error has occurred. Error code: 0x800703E6.
An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x800703E6 Description: "Invalid access to memory location.".
End Error
Error: 2008-03-18 09:24:32.54
Code: 0xC020801C
Source: Load Fund Detail V_FUND_DETAIL [16]
Description: The AcquireConnection method call to the connection manager "IMAL SRC" failed with error code 0xC0202009.
End Error
Error: 2008-03-18 09:24:32.54
Code: 0xC0047017
Source: Load Fund Detail DTS.Pipeline
Description: component "V_FUND_DETAIL" (16) failed validation and returned error code 0xC020801C.
End Error
Error: 2008-03-18 09:24:32.54
Code: 0xC004700C
Source: Load Fund Detail DTS.Pipeline
Description: One or more component failed validation.
End Error
Error: 2008-03-18 09:24:32.54
Code: 0xC0024107
Source: Load Fund Detail
Description: There were errors during task validation.
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 9:24:30 AM
Finished: 9:24:32 AM
Elapsed: 2.078 seconds

View 5 Replies View Related

Remote Connection Tests Fine, But Nothing Works On The Page Itself.

Mar 26, 2008

If this post belongs somewhere else I apologize. I have spent several days trying to solve this problem with no luck. My site is online, hosted at NeikoHosting. I can connect to the database remotely when adding a data control, and it tests fine. But when running the page it won't connect. Even if I go in and change the Web.Config connection string to a local Data Source provided to me by Neiko, it still won't connect. Here are the two connection strings in the Web.Config, minus my login info (only the remote string passes testing; neither works on the site):
<add name="yourchurchmychurchDBConnectionString" connectionString="Data Source=MSSQL2K-A;Initial Catalog=yourchurchmychurchDB;Persist Security Info=True;User ID=me;Password=pwd" providerName="System.Data.SqlClient" />
<add name="yourchurchmychurchDBConnectionString2" connectionString="Data Source=66.103.238.206;Initial Catalog=yourchurchmychurchDB;Persist Security Info=True;User ID=me;Password=pwd" providerName="System.Data.SqlClient" />
 Here is the stack trace, if that helps.
[DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Data.OleDb.OleDbException: [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.Source Error:



An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below. Stack Trace:



[OleDbException (0x80004005): [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.]
System.Data.OleDb.OleDbConnectionInternal..ctor(OleDbConnectionString constr, OleDbConnection connection) +1131233
System.Data.OleDb.OleDbConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject) +53
System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup) +27
System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection) +47
System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory) +105
System.Data.OleDb.OleDbConnection.Open() +37
System.Data.Common.DbDataAdapter.FillInternal(DataSet dataset, DataTable[] datatables, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior) +121
System.Data.Common.DbDataAdapter.Fill(DataSet dataSet, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior) +137
System.Data.Common.DbDataAdapter.Fill(DataSet dataSet, String srcTable) +83
System.Web.UI.WebControls.SqlDataSourceView.ExecuteSelect(DataSourceSelectArguments arguments) +1770
System.Web.UI.DataSourceView.Select(DataSourceSelectArguments arguments, DataSourceViewSelectCallback callback) +17
System.Web.UI.WebControls.DataBoundControl.PerformSelect() +149
System.Web.UI.WebControls.BaseDataBoundControl.DataBind() +70
System.Web.UI.WebControls.FormView.DataBind() +4
System.Web.UI.WebControls.BaseDataBoundControl.EnsureDataBound() +82
System.Web.UI.WebControls.FormView.EnsureDataBound() +163
System.Web.UI.WebControls.CompositeDataBoundControl.CreateChildControls() +69
System.Web.UI.Control.EnsureChildControls() +87
System.Web.UI.Control.PreRenderRecursiveInternal() +50
System.Web.UI.Control.PreRenderRecursiveInternal() +170
System.Web.UI.Control.PreRenderRecursiveInternal() +170
System.Web.UI.Control.PreRenderRecursiveInternal() +170
System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +2041

So....... HELP!!!!
 
Thank you!

View 3 Replies View Related

DTS Job Failing Execution When Scheduled, Works Fine Manually.

Feb 6, 2004

My DTS package works fine if I execute it manually, but I need it to run automatically just after midnight. I defined my schedule and made sure the job was present in SQL Server Agent > Jobs, but it fails and the Job History shows the following error:

DTSRun: Loading... DTSRun: Executing... DTSRun OnStart: DTSStep_DTSDataPumpTask_1 DTSRun OnError: DTSStep_DTSDataPumpTask_1, Error = -2147467259 (80004005) Error string: [Microsoft][ODBC Microsoft Access Driver] Cannot start your application. The workgroup information file is missing or opened exclusively by another user. Error source: Microsoft OLE DB Provider for ODBC Drivers Help file: Help context: 0 Error Detail Records: Error: -2147467259 (80004005); Provider Error: 1901 (76D) Error string: [Microsoft][ODBC Microsoft Access Driver] Cannot start your application. The workgroup information file is missing or opened exclusively by another user. Error source: Microsoft OLE DB Provider for ODBC Drivers Help file: Help context: 0 DTSRun OnFinish: DTSStep_DTSDataPumpTask_1 DTSRun: Package execution complete. Process Exit Code 1. The step failed.

Help!!!

View 3 Replies View Related

Query Works Fine Outside Union, But Doesn't Work .. .

Mar 31, 2004

hi all

I have the following query which works fine when it's executed as a single query, but when I union its result with other queries it returns a different set of data.

Anyone know why that might be the case?


select top 100 max(contact._id) "_id", max(old_trans.date) "callback_date", 7 "priority", max(old_trans.date) "recency", count(*) "frequency" --contact._id, contact.callback_date
from topcat.class_contact contact inner join topcat.MMTRANS$ old_trans on contact.phone_num = old_trans.phone
where contact.phone_num is not null
and contact.status = 'New Contact'
group by contact._id
order by "recency" desc, "frequency" desc




I've included the union query here for completeness of the question.



begin
declare @current_date datetime
set @current_date = GETDATE()


select top 100 _id, callback_date, priority, recency, frequency from
(
(
select top 10 _id, callback_date, 10 priority, @current_date recency, 1 frequency --, DATEPART(hour, callback_date) "hour", DATEPART(minute, callback_date) "min"
from topcat.class_contact
where status ='callback'
and (DATEPART(year, callback_date) <= DATEPART(year, @current_date))
and (DATEPART(dayofyear, callback_date) <= DATEPART(dayofyear, @current_date)) -- all call backs within that hour will be returned
and (DATEPART(hour, callback_date) <= DATEPART(hour, @current_date))
and (DATEPART(hour, callback_date) <> 0)
order by callback_date asc
--order by priority desc, DATEPART(hour, callback_date) asc, DATEPART(minute, callback_date) asc, callback_date asc
)
union
(
select top 10 _id, callback_date, 9 priority, @current_date recency, 1 frequency
from topcat.class_contact
where status = 'callback'
and callback_date is not null
and (DATEPART(year, callback_date) <= DATEPART(year, @current_date))
and (DATEPART(dayofyear, callback_date) <= DATEPART(dayofyear, @current_date))
and (DATEPART(hour, callback_date) <= DATEPART(hour, @current_date))
and (DATEPART(hour, callback_date) = 0)
order by callback_date asc
)
union
(
select top 10 _id, callback_date, 8 priority, @current_date recency, 1 frequency
from topcat.class_contact
where status = 'No Connect'
and callback_date is not null
and (DATEPART(year, callback_date) <= DATEPART(year, @current_date))
and (DATEPART(dayofyear, callback_date) <= DATEPART(dayofyear, @current_date))
and (DATEPART(hour, callback_date) <= DATEPART(hour, @current_date))
order by callback_date asc
)
union
(
select top 100 max(contact._id) "_id", max(old_trans.date) "callback_date", 7 "priority", max(old_trans.date) "recency", count(*) "frequency" --contact._id, contact.callback_date
from topcat.class_contact contact inner join topcat.MMTRANS$ old_trans on contact.phone_num = old_trans.phone
where contact.phone_num is not null
and contact.status = 'New Contact'
group by contact._id
order by "recency" desc, "frequency" desc
)
) contact_queue
order by priority desc, recency desc, callback_date asc, frequency desc

end
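I suspect (just a guess) that part of the difference is that UNION removes duplicate rows across the branches, and the outer TOP 100 / ORDER BY is applied only after all the branches are combined, so rows from the last branch can be displaced or deduplicated away. If that's it, switching to UNION ALL should keep each branch's rows intact, e.g. between two of the branches:

-- UNION deduplicates across branches; UNION ALL keeps every row from every branch
select top 10 _id, callback_date, 10 priority, @current_date recency, 1 frequency
from topcat.class_contact where status = 'callback'
union all
select top 10 _id, callback_date, 9 priority, @current_date recency, 1 frequency
from topcat.class_contact where status = 'callback' and DATEPART(hour, callback_date) = 0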

View 1 Replies View Related

Multivalue Works Fine In The Sproc But Not In Bids Or Reportserver

Apr 28, 2008

Hi,

I have a report with multivalue parameters enabled. If I pass NULL it displays everything correctly, but if I pass specific ClientIds the report does not filter accordingly. If I run my sproc in VS2005 and in SSMS it works the way I want. This is my sproc:




Code Snippet
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
ALTER Procedure [dbo].[usp_GetOrdersByOrderDate]

@StartDate datetime,
@EndDate datetime,
@ClientId nvarchar(max)= NULL
AS
Declare @SQLTEXT nvarchar(max)
if @ClientId is NULL
BEGIN
SELECT
o.OrderId,
o.OrderDate,
o.CreatedByUserId,
c.LoginId,
o.Quantity,
o.RequiredDeliveryDate,
cp.PlanId,
cp.ClientPlanId
--cp.ClientId
FROM
[Order] o
Inner Join ClientPlan cp on o.PlanId = cp.PlanId -- and o.CreatedByUserId = cp.UserId
Inner Join ClientUser c on o.CreatedByUserId = c.UserId
WHERE
--cp.ClientId = @ClientId
--AND
o.OrderDate BETWEEN @StartDate AND @EndDate
ORDER BY
o.OrderId DESC
END
ELSE
BEGIN
SELECT @SQLTEXT = 'Select
o.OrderId,
o.OrderDate,
o.CreatedByUserId,
c.LoginId,
o.Quantity,
o.RequiredDeliveryDate,
cp.PlanId,
cp.ClientPlanId
--cp.ClientId
FROM
[Order] o
Inner Join ClientPlan cp on o.PlanId = cp.PlanId --AND cp.ClientId in ('+ convert(Varchar, @ClientId) + ' )
Inner Join ClientUser c on o.CreatedByUserId = c.UserId
WHERE
cp.ClientId in (' + convert(Varchar,@ClientId) + ')
AND
o.OrderDate BETWEEN ''' + Convert(varchar, @StartDate) + ''' AND ''' + convert(varchar, @EndDate) + '''
ORDER BY
o.OrderId DESC'
exec(@SQLTEXT)
END
--return (@SQLTEXT)




I have 2 datasets in this report: one for the above sproc, and another dataset that gives me the client name, as follows:




Code Snippet

ALTER Procedure [dbo].[usp_GetClientsAll]

@ClientId nvarchar(max) = NULL

AS

--Declare @ClientId nvarchar(max)

SELECT

NULL ClientId,

'<All Clients >' ClientName

FROM

Client

Union

SELECT

ClientId,

ClientName

FROM

Client

Where

ClientId = @ClientId

OR

(

ClientId = ClientId

OR

@ClientId IS NULL

)






In the first dataset's parameter list I have omitted ClientId but kept it in the report parameters. So when I choose Select All it works, but when I select particular clients it gives me the same result as Select All.
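Two things I would double-check (guesses on my part): since ClientId was removed from the first dataset's parameter list, that dataset never receives the selected values and always takes the NULL branch, so the multi-value parameter probably needs to be mapped back in (for a multi-value parameter that usually means passing it joined into a comma-separated string, e.g. =Join(Parameters!ClientId.Value, ",")); and in the dynamic SQL, convert(Varchar, @ClientId) with no length silently truncates the list to 30 characters. Since @ClientId is already nvarchar(max), the conversion isn't needed. A rough sketch of the ELSE branch:

SET @SQLTEXT = N'SELECT o.OrderId, o.OrderDate, o.CreatedByUserId, c.LoginId,
       o.Quantity, o.RequiredDeliveryDate, cp.PlanId, cp.ClientPlanId
FROM [Order] o
Inner Join ClientPlan cp on o.PlanId = cp.PlanId
Inner Join ClientUser c on o.CreatedByUserId = c.UserId
WHERE cp.ClientId IN (' + @ClientId + N')
  AND o.OrderDate BETWEEN @StartDate AND @EndDate
ORDER BY o.OrderId DESC'
EXEC sp_executesql @SQLTEXT, N'@StartDate datetime, @EndDate datetime', @StartDate, @EndDate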

any help will be appreciated..
REgards
Karen

View 11 Replies View Related

Report Works Fine Stand Alone, But Fails When Used As Subreport

Jun 4, 2007

I have a report which I have tested and which works fine. Now I'm trying to use it as a subreport. The "outer" or main report is very simple: it just has a company standard banner and some header/footer information, and then a single subreport. There is no passing of parameters between the main report and the subreport. The subreport does have its own parameter to govern its dataset, and provides its own default for that.



The error that I'm getting is this:



[rsErrorExecutingSubreport] An error occurred while executing the subreport 'subreport1': An error has occurred during report processing.

[rsMissingFieldInDataSet] The data set 'WarrantMasterCube' contains a definition for the Field 'Year'. This field is missing from the returned result set from the data source.

[rsErrorReadingDataSetField] The data set 'WarrantMasterCube' contains a definition for the Field 'Year'. The data extension returned an error during reading the field.

[rsMissingFieldInDataSet] The data set 'WarrantMasterCube' contains a definition for the Field 'Month'. This field is missing from the returned result set from the data source.

[rsErrorReadingDataSetField] The data set 'WarrantMasterCube' contains a definition for the Field 'Month'. The data extension returned an error during reading the field.

[rsMissingFieldInDataSet] The data set 'WarrantMasterCube' contains a definition for the Field 'Date'. This field is missing from the returned result set from the data source.

[rsErrorReadingDataSetField] The data set 'WarrantMasterCube' contains a definition for the Field 'Date'. The data extension returned an error during reading the field.

[rsMissingFieldInDataSet] The data set 'WarrantMasterCube' contains a definition for the Field 'Wt_TO_MTD'. This field is missing from the returned result set from the data source.

[rsErrorReadingDataSetField] The data set 'WarrantMasterCube' contains a definition for the Field 'Wt_TO_MTD'. The data extension returned an error during reading the field.

[rsNone] An error has occurred during report processing.



Of course, this doesn't happen when I execute the subreport by itself. What kinds of things should I be looking at to get to the bottom of this? Thanks!

View 3 Replies View Related

Slow Query....drop Index Works Fine!!!!!

May 11, 2007

We are running MS RS and SQL Server 2000 SP3.

We have one LEDGER table, where all the daily activities are stored. The LEDGER table has 4 indexes (1 clustered and 3 non-clustered). We use this table to get AR (accounts receivable).



The problem is that every 1-2 months, any simple AR query starts taking a long time and every other client gets slow responses (queries are very slow or sometimes block).


If we DROP any index on the LEDGER table and put it back again (RECREATE), all our queries run fine and faster. That lasts for another 1-2 months, until we see the same issue again.



A classic case happened today. Queries were running fine until 8 AM. We uploaded some 50 thousand records to the LEDGER table (data conversion). After about 30 minutes, all simple AR queries started taking a long time. We dropped an index on the LEDGER table and everything was faster. Just to be safe we added the same index back again, and everything is still faster.



What is this? Is it our query, the index, huge transactions, or no free space?
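A hedged guess: dropping and recreating the index forces its statistics to be rebuilt, so the real issue may be stale statistics after the big data load rather than the index itself. Refreshing statistics directly (table name from our schema) might give the same effect without the drop/recreate:

-- refresh statistics on the ledger table after large loads
UPDATE STATISTICS dbo.LEDGER WITH FULLSCAN;
-- or refresh everything in the database
EXEC sp_updatestats;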



We are scheduled to apply SP4 next week, but is there any explanation or solution in the meantime?



Also, is there any way to KILL all SQL Server processes that take more than a minute? We just don't want all our clients to slow down because of one query.



Thanks,

View 3 Replies View Related

Using Symmetric Key Problem With Encryption, Decryption Works Fine

May 4, 2006

Hey, I had a table with a column of data encrypted in another format. I was able to decrypt it and then re-encrypt it using symmetric keys, updating the table column with the new data. Now there is a user sp which needs to encrypt the password for a new user and put it in the table. I'm not able to make it work. I have this so far; something somewhere is wrong and I don't know where. Please help. Thanks. I used the same script to do the encryption initially, but that was for the whole column. I need to see the encrypted version of the @inTargetPassword variable, but it's not working. It doesn't give me an error, but it gives me wrong data...

 

 

declare @thePassword as varbinary(128)

,@inTargetPassword as varchar(255)

,@pwd3 as varchar(255)

,@theUserId bigint

set @theUserId= 124564

set @inTargetPassword = 'test'

OPEN SYMMETRIC KEY Key1

DECRYPTION BY CERTIFICATE sqlSecurity;

Select @pwd3=EncryptByKey(Key_GUID('Key1')

, @inTargetPassword, 1, HashBytes('SHA1', CONVERT( varbinary, [UserObjectId])))

from table1 where UserObjectId= @theUserId

close symmetric key Key1
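One thing that looks suspicious to me: EncryptByKey returns varbinary, but @pwd3 is declared varchar(255), so the ciphertext gets mangled by the implicit conversion. A sketch of the same script using a varbinary variable instead:

declare @cipher varbinary(256)

OPEN SYMMETRIC KEY Key1 DECRYPTION BY CERTIFICATE sqlSecurity;

Select @cipher = EncryptByKey(Key_GUID('Key1'), @inTargetPassword, 1,
       HashBytes('SHA1', CONVERT(varbinary, [UserObjectId])))
from table1 where UserObjectId = @theUserId

close symmetric key Key1

select @cipher  -- the encrypted value, kept as varbinary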

View 6 Replies View Related

Backup Of Ldf File In Simple Recovery Model

Mar 20, 2008



Hello,



I have a question regarding backups for a database in the Simple Recovery Model.

In this model, I know we can restore only to the last full backup, or to a differential backup if that is implemented as part of the backup plan.

But my point of confusion is about the backup of the '.ldf' file: should that file be backed up in the Maintenance Plan, and if so, does it help in reducing the size of the log file? Do we need a backup of the '.ldf' file when restoring?

As I mentioned, my database uses the Simple Recovery Model, but the size of the log file is around 20GB. I can't understand why, as in this model the log is normally truncated automatically.
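From what I have read (so treat this as a guess): in Simple recovery the log cannot be backed up at all, and the log still has to hold space for open transactions, so a 20GB file usually means something was or is holding it. On 2005 or later, I would check the reuse reason and then shrink the file back down (the logical log file name below is a guess):

-- see why log space is not being reused
SELECT name, log_reuse_wait_desc FROM sys.databases WHERE name = N'MyDatabase';

-- if it reports NOTHING, shrink the oversized log file back down (target size in MB)
USE MyDatabase;
DBCC SHRINKFILE (MyDatabase_log, 2048);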



Help me to clear my these doubts,



thanks,




View 5 Replies View Related

Problems Publishing My Personal Website - Works Fine Locally!

Jan 24, 2006

People, I'm trying to publish my first website and am having a few problems. I've got Visual Web Developer 2005 Express and am trying to use the Personal Website Starter Kit (my SQL server is SQL Server Express Edition 2005, which is also running on my local machine). It seems to work fine when I run it on my localhost, but as soon as I ftp it up to my web hosting company, I get an error message (see below):

An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)

My hypothesis: when running locally, the starter kit website uses my installation of SQL 2005 Express Edition, but when I upload all the files, the application is still trying to point at a local instance of SQL on my PC, which it now cannot see. I'm guessing I need to somehow upload the SQL database onto my web host (I've purchased 100M of SQL Server 2005 space) and point the application at that SQL instance instead. But I don't know if I'm right about all this, or indeed how to do it if I am. Can anyone help?

Much thanks in advance,
Will

View 1 Replies View Related

Works Fine In Designer But When I Load The Report It Doesn't Work

Oct 23, 2006

It works fine in the designer, but when I load the report in Reporting Services I get the following error. Anybody know what to do? There is one subreport with this report; maybe it's the value being passed, but what could be wrong?

Item has already been added. Key in dictionary: '9' Key being added: '9'

View 2 Replies View Related

Long Running Query In SQL 2005 But Works Fine In SQL 2000

Mar 28, 2008

I have a simple update statement that runs forever in SQL 2005 but works fine in SQL 2000. We have a new server where we installed SQL 2005 and restored the db. On the table in question, WEEKLYSALESHISTORY, I even rebuilt all the indexes and the stats as well. But still no luck; it still runs extremely long: 1 hour 20 minutes.

I'll try to give you some background on these tables. WeeklySalesHistory has approx 30 fields. I have 11 indexes set up, weekending date being one of them, and ReplicationControl has an index on LastTransDateTime as well. So I think my indexes are fine.

/* Update WeekEnding Date for current weeks WeeklySales Records */
Update WeeklySalesHistory set
weekendingdate =
(SELECT LastTransDateTime from ReplicationControl
where TableName = 'WEEKHST')
where weekendingdate is null

Weekly sales has approx 100,000,000 rows
ReplicationControl has 631,000 rows (I think I can delete some from here to bring it down to 100 or 200 records), although I don't think this is the issue since 2000 has the same data and works fine.


I was trying to do this within SSIS and thought that was the issue. I am new to SSIS, but it runs long even if I just run it as a job with this simple update statement, so I think something is wrong with the tables, etc.

One thing I noticed: if I look at the statistics in SQL Server Management Studio there are a ton of stats. Some are statistics on indexes, which makes sense, but then I have a ton named hind_113_9_6 and similar. I must have 90 or so named like this. I'm not sure how to check all the stats on SQL 2000 to see if they moved over from there or what. I checked a few other tables and they don't have all these extra stats. Could this be causing the issue, and do I need to delete all these extras? Any help would be greatly appreciated.
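One thing I am going to try, based on what I've read about restoring 2000 databases onto 2005 (so treat it as a guess): refresh all statistics after the restore, since old statistics and the leftover hypothetical 'hind_...' stats from the Index Tuning Wizard may be throwing the 2005 optimizer off:

USE MyRestoredDB;   -- hypothetical database name
EXEC sp_updatestats;                                      -- refresh every statistic
UPDATE STATISTICS dbo.WeeklySalesHistory WITH FULLSCAN;   -- and do the big table thoroughly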


Stacy

View 5 Replies View Related

Permission Issue With Tempdb Works Fine In SQL2000 But Not SQL2005

Sep 17, 2007

the following SQL works fine in SQL2000 but gets a permissions error when run on SQL2005:




IF not exists (SELECT * FROM tempdb.dbo.sysindexes WHERE NAME = 'PK_tblGuidContractMove')

BEGIN

IF @DEBUG = 1 PRINT 'airsp_CopyContracts.PK_tblGuidContractMove'



EXECUTE('ALTER TABLE #tblGuidContractMove ALTER COLUMN guidSource GUID NOT NULL')

EXECUTE('ALTER TABLE #tblGuidContractMove ALTER COLUMN guidDestination GUID NOT NULL')

EXECUTE('ALTER TABLE #tblGuidContractMove ALTER COLUMN guidContractMove GUID NOT NULL')
EXECUTE('ALTER TABLE #tblGuidContractMove WITH NOCHECK ADD

CONSTRAINT [PK_tblGuidContractMove] PRIMARY KEY CLUSTERED

(

[guidSource],

[guidDestination],

[guidContractMove]

) ON [PRIMARY]')



END

The user permissions are set the same in both 2000 and 2005. Can you please explain what changed, and what minimum permissions the user needs to be able to make these changes to the temporary table the user created?
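I am not sure this is the cause of the permission error, but on 2005 I would also scope the existence check to the session's own temp table through the catalog views, since a name-only lookup in tempdb..sysindexes can hit another session's object. A sketch:

IF NOT EXISTS (SELECT 1
               FROM tempdb.sys.indexes
               WHERE name = 'PK_tblGuidContractMove'
                 AND object_id = OBJECT_ID('tempdb..#tblGuidContractMove'))
BEGIN
    -- the existing ALTER TABLE / ADD CONSTRAINT statements go here
    PRINT 'constraint missing'
END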

View 1 Replies View Related

Havin Trouble Inserting Records To A Table.. Update Works Fine

Mar 6, 2008

Hi..
I am getting an XML stream of data, putting it into an object, and then calling a big sproc to insert or update data in many tables across my database. But there is one table that I am having trouble inserting into, although an update works fine. This is my code for that part of the sproc:
 IF Exists(
SELECT
*
FROM
PlanEligibility
WHERE
PlanId = @PlanId
) BEGIN
UPDATE
PlanEligibility
SET
LengthOfService = Case When @PD_EmployeeContribution = 0 Then @rsLengthOfServicePS
ELSE @rsLengthOfService END,
EligibilityAge = CASE When @PD_EmployeeContribution = 0 Then @EligibilityAgePS Else @EligibilityAge End,
EntryDates = @EntryDates,
EligiDifferentRequirementsMatch = Case When @PD_EmployeeContribution = 0 Then 0
When @PD_EmployeeContribution = 1 and @PD_EmployerContribution = 0 then 0 Else 1 END, --@CompMatchM,
LengthOfServiceMatch = CASE When @MCompanyMatch = 0 Then @rsLengthOfServicePs ELSE @rsLengthOfServiceMatch END,
EligibilityAgeMatch = CASE When @MCompanyMatch = 0 Then @EligibilityAgePS ELSE @EligibilityAgeMatch END,
OtherEmployeeExclusions = @OtherEmployeeExclusions
WHERE
PlanId = @PlanId
END
ELSE BEGIN
INSERT INTO PlanEligibility
(
PlanId,
LengthOfService,
EligibilityAge,
EntryDates,
EligiDifferentRequirementsMatch,
LengthOfServiceMatch,
EligibilityAgeMatch,
OtherEmployeeExclusions
)
VALUES
(
@PlanId,
Case When @PD_EmployeeContribution = 0 Then @rsLengthOfServicePS
ELSE @rsLengthOfService END,--@rsLengthOfService,
CASE When @PD_EmployeeContribution = 0 Then @EligibilityAgePS Else @EligibilityAge End, --@EligibilityAge,
@EntryDates,
Case When @PD_EmployeeContribution = 0 Then 0
When @PD_EmployeeContribution = 1 and @PD_EmployerContribution = 0 then 0 Else 1 END, --having trouble here
CASE When @MCompanyMatch = 0 Then @rsLengthOfServicePs ELSE @rsLengthOfServiceMatch END,
CASE When @MCompanyMatch = 0 Then @EligibilityAgePS ELSE @EligibilityAgeMatch END, --EligibilityAgeMatch,@EligibilityAgeMatch,
@OtherEmployeeExclusions
)
END
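To see why the INSERT fails where the UPDATE succeeds, I might wrap that branch in TRY/CATCH so the sproc reports the real error instead of failing somewhere inside the big batch. Just a diagnostic sketch, with the CASE expressions shortened for brevity:

BEGIN TRY
    INSERT INTO PlanEligibility (PlanId, LengthOfService, EligibilityAge, EntryDates,
        EligiDifferentRequirementsMatch, LengthOfServiceMatch, EligibilityAgeMatch, OtherEmployeeExclusions)
    VALUES (@PlanId, @rsLengthOfService, @EligibilityAge, @EntryDates, 1,
        @rsLengthOfServiceMatch, @EligibilityAgeMatch, @OtherEmployeeExclusions)
END TRY
BEGIN CATCH
    SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage
END CATCH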
 Any help will be appreciated..
Regards,
Karen

View 6 Replies View Related

Works Fine Inside BI Dev. Studio, But Fails When Scheduling It In SQL Server 2005.

Jul 13, 2005

Hi!

View 12 Replies View Related

Intermittent Connectivity Issues Though Network Is Fine

Oct 3, 2006

Hi,

I am getting intermittent connectivity issues while connecting to a remote SQL Server 2000 database through the Enterprise Manager. It gives a time-out error

Initially I guessed it must be the network but then I was able to term-serv successfully into the remote server and work without any connectivity issues.

Any pointers on how this is possible?

Thanks for your help.

Regards,
-Ranjit S Hans.

View 2 Replies View Related

SQL 2000 Partitioned View Works Fine, But CURSOR With FOR UPDATE Fails To Declare

Oct 17, 2006

This one has me stumped.

I created an updateable partioned view of a very large table. Now I get an error when I attempt to declare a CURSOR that SELECTs from the view, and a FOR UPDATE argument is in the declaration.

The error generated is:

Server: Msg 16957, Level 16, State 4, Line 3

FOR UPDATE cannot be specified on a READ ONLY cursor



Here is the cursor declaration:



declare some_cursor CURSOR

for

select *

from part_view

FOR UPDATE



Any ideas, guys? Thanks in advance for knocking your head against this one.

PS: Since I tested the updateability of the view, there are no issues with primary keys, uniqueness, or missing indexes. Also, unfortunately, the dreaded cursor is required, so set-based alternatives are not an option - it's from within PeopleSoft.

View 2 Replies View Related

Table Visibility Not Functioning Correctly On Server, Works Fine In Visual Studio

Feb 13, 2007

I have a report problem. I'm using a parameter to dynamically control visibility for two tables. If the parameter is set to one value, I want to switch one of the tables to invisible; if the parameter is set to another, I want the other table to be invisible instead.

This all works fine in Visual Studio. When I publish it to my report server, the visibility controls no longer function and both tables always display. Any ideas here?

I'm running 2005, SP2 CTP.

View 4 Replies View Related

SQLEXPRESS Backup File Losing NETWORK SERVICE User When Copied

Mar 11, 2008

I'm using the methods of the Microsoft.SqlServer.Management.Smo namespace within a .NET application to create a backup file from a SQLEXPRESS database. I can then restore the database from that backup device using methods in the same namespace. Here is a snippet from the restore code:


srv = New Server("MYPCSQLEXPRESS")

db = srv.Databases("washmaster")

Dim bdi As New BackupDeviceItem(BackupFileName, DeviceType.File)

Dim recoverymod As RecoveryModel

recoverymod = db.DatabaseOptions.RecoveryModel

rs.NoRecovery = False

rs.Devices.Add(bdi)

rs.Database = "washmaster"

rs.ReplaceDatabase = True

srv.KillAllProcesses("washmaster")

rs.SqlRestore(srv)


This works great as long as I use one of the backup files that I created directly on the disk. However, my application has a utility that allows the user to copy the backup files onto another drive, such as a CD or a thumb drive, and when I try to restore from the copy of the backup, I get the following exception:

....Cannot open backup device..[filename]...Operating system error 5(Access is denied.)

The reason I get this error is that the "NETWORK SERVICE" account was removed from the file permissions when the file was copied.

How can I copy a backup to another drive and preserve the "NETWORK SERVICE" account? If I can't do that, is it wise to try to add the account back to the file before using it to restore or is there a better way?

Thanks,
SJonesy

View 3 Replies View Related

Xp_sendmail: Failed With Mail Error 0x80040111 It Works Fine When You Do A Test From Enterprise Manager

Sep 12, 2007



Hello, I am receiving the dreaded mail error listed above. I can send a test e-mail from Enterprise Manager to operators, but I cannot run this Transact-SQL query:

EXEC master.dbo.xp_sendmail @RECIPIENTS = araz***@***.com(removed email address),
@SUBJECT = 'test'

I receive:
Server: Msg 18025, Level 16, State 1, Line 0
xp_sendmail: failed with mail error 0x80040111



I have stopped/restarted the SQL SERVER AGENT but haven't done much else as I haven't been able to.

Should it work through transact SQL if the test email works from Enterprise Manager?
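One difference I've read about, so treat this as a guess: the Enterprise Manager test uses SQL Agent Mail, while xp_sendmail uses SQL Mail and needs a working MAPI profile for the MSSQLServer service account, so one can work while the other fails. Also, the recipient has to be passed as a quoted string:

EXEC master.dbo.xp_sendmail
     @recipients = 'someone@example.com',   -- hypothetical address; must be a quoted varchar
     @subject    = 'test'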

This is SQL 2000 SP4 running on Win2K in the domain. Thank you.

View 8 Replies View Related

Recovery :: Maintenance Task To Copy Bak File To Secondary Drive After Backup Is Complete?

Nov 6, 2015

I need to copy a just-created bak file to another drive after the backup task has completed. I don't see anything in the job toolbox which works with file system operations like this, but it must be a common need. There are ways to script this or use third-party tools, but I am looking for something native to the SQL Server 2012 SSMS toolset, if possible.

An alternate approach would be to run the backup job again, after the main backup, and change the destination to the alternate location. But I was thinking that another backup job would probably invoke more overhead on the server than a simple file copy operation. If I do end up taking this approach I could also use the cleanup task to toss older bak files in the alt dir.
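One possibility I'm considering (a sketch only; the paths are made up and xp_cmdshell has to be enabled): add an Execute T-SQL Statement task after the backup task, or a CmdExec job step, that copies the newest .bak files to the secondary drive:

-- copy any .bak file modified within the last day to the secondary drive
EXEC xp_cmdshell 'robocopy "D:\Backups" "E:\BackupCopies" *.bak /MAXAGE:1';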

View 7 Replies View Related

RESTORE FILELIST Is Terminating Abnormally Error When Running A DTS Package In SQL 2005. Works Fine In SQL 2000

Oct 10, 2007

Currently I receive the following error when executing a script within a DTS package in SQL 2005 (it seems to work in SQL 2000):


Processed 27008 pages for database 'Marketing', file 'Marketing_Data' on file 5.

Processed 1 pages for database 'Marketing', file 'Marketing_Log' on file 5.

BACKUP DATABASE successfully processed 27009 pages in 15.043 seconds (14.708 MB/sec).

(5 row(s) affected)

Msg 213, Level 16, State 7, Line 1

Insert Error: Column name or number of supplied values does not match table definition.

Msg 3013, Level 16, State 1, Line 1

RESTORE FILELIST is terminating abnormally.

The code I am using is:


-- the original database (use 'SET @DB = NULL' to disable backup)

DECLARE @DB varchar(200)

SET @DB = 'Marketing'

-- the backup filename

DECLARE @BackupFile varchar(2000)

SET @BackupFile = 'C:\SQL2005 dbs\Marketing.dat'

-- the new database name

DECLARE @TestDB2 varchar(200)

SET @TestDB2 = datename(month, dateadd(month, -1, getdate())) + convert(varchar(20), year(getdate())) + 'Inst1'

-- the new database files without .mdf/.ldf

DECLARE @RestoreFile varchar(2000)

SET @RestoreFile = 'C:\SQL2005 dbs\' + @TestDB2

DECLARE @RestoreLog varchar (2000)

SET @RestoreLog = 'C:\SQL2005 dbs\' + @TestDB2

-- ****************************************************************

-- no change below this line

-- ****************************************************************

DECLARE @query varchar(2000)

DECLARE @DataFile varchar(2000)

SET @DataFile = @RestoreFile + '.mdf'

DECLARE @LogFile varchar(2000)

SET @LogFile = @RestoreLog + '.ldf'

IF @DB IS NOT NULL

BEGIN

SET @query = 'BACKUP DATABASE ' + @DB + ' TO DISK = ' + QUOTENAME(@BackupFile, '''')

EXEC (@query)

END

-- RESTORE FILELISTONLY FROM DISK = 'C:\temp\backup.dat'

-- RESTORE HEADERONLY FROM DISK = 'C:\temp\backup.dat'

-- RESTORE LABELONLY FROM DISK = 'C:\temp\backup.dat'

-- RESTORE VERIFYONLY FROM DISK = 'C:\temp\backup.dat'







IF EXISTS(SELECT * FROM sysdatabases WHERE name = @TestDB2)

BEGIN

SET @query = 'DROP DATABASE ' + @TestDB2

EXEC (@query)

END

RESTORE HEADERONLY FROM DISK = @BackupFile

DECLARE @File int

SET @File = @@ROWCOUNT

DECLARE @Data varchar(500)

DECLARE @Log varchar(500)

SET @query = 'RESTORE FILELISTONLY FROM DISK = ' + QUOTENAME(@BackupFile , '''')

CREATE TABLE #restoretemp

(

LogicalName varchar(500),

PhysicalName varchar(500),

type varchar(10),

FilegroupName varchar(200),

size int,

maxsize bigint

)

INSERT #restoretemp EXEC (@query)

SELECT @Data = LogicalName FROM #restoretemp WHERE type = 'D'

SELECT @Log = LogicalName FROM #restoretemp WHERE type = 'L'

PRINT @Data

PRINT @Log

TRUNCATE TABLE #restoretemp

DROP TABLE #restoretemp

IF @File > 0

BEGIN

SET @query = 'RESTORE DATABASE ' + @TestDB2 + ' FROM DISK = ' + QUOTENAME(@BackupFile, '''') +

' WITH MOVE ' + QUOTENAME(@Data, '''') + ' TO ' + QUOTENAME(@DataFile, '''') + ', MOVE ' +

QUOTENAME(@Log, '''') + ' TO ' + QUOTENAME(@LogFile, '''') + ', FILE = ' + CONVERT(varchar, @File)

EXEC (@query)

END
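From what I can tell, the "Insert Error: Column name or number of supplied values does not match table definition" comes from INSERT #restoretemp EXEC (@query): RESTORE FILELISTONLY returns more columns in SQL 2005 than it did in SQL 2000, so the six-column temp table no longer matches. Running RESTORE FILELISTONLY by hand shows the exact column list for your build; this sketch widens the temp table with the columns I believe 2005 returns (verify against your instance):

CREATE TABLE #restoretemp
(
LogicalName nvarchar(128), PhysicalName nvarchar(260), Type char(1), FileGroupName nvarchar(128),
Size numeric(20,0), MaxSize numeric(20,0), FileId bigint, CreateLSN numeric(25,0), DropLSN numeric(25,0),
UniqueId uniqueidentifier, ReadOnlyLSN numeric(25,0), ReadWriteLSN numeric(25,0),
BackupSizeInBytes bigint, SourceBlockSize int, FileGroupId int, LogGroupGUID uniqueidentifier,
DifferentialBaseLSN numeric(25,0), DifferentialBaseGUID uniqueidentifier, IsReadOnly bit, IsPresent bit
)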





View 1 Replies View Related

SQL Login Control Works From Dev Environment But When I Deploy The Website It Errors

Mar 27, 2006

An error has occurred while establishing a connection to the server.  When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)
 
-This is the error that is being generated once the site is hosted. If I run my code from my machine and access my remote sql 2005 database it works fine. It is when I try to login from my URL directly that I get the error. Any thoughts?
Thanks.
 
John

View 2 Replies View Related

Sharing Website(asp.net) Mdf Database On A Network In Multiuser Environment

Sep 19, 2007

simple question:

I have created a web site in Visual Studio with forms user authentication. That creates an mdf database at this path:

c:\inetpub\wwwroot\website\app_data\aspnetdb.mdf
which uses SQL Server.

I would like to access this database from a C# WinForms application on a network with multiple users.
If I change the drive letter of the path above to f: (a mapped drive) or \\server\c, I get this message:


System.Data.SqlClient.SqlException: The file "h:\Inetpub\wwwroot\website\App_Data\ASPNETDB.MDF" is on a network path that is not supported for database files.
An attempt to attach an auto-named database for file h:\Inetpub\wwwroot\website\App_Data\ASPNETDB.MDF failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.

I read that SQL Server does not support database files on a mapped drive or UNC share.

So how can I share the website database with the C# application?

Note: of course, if I develop the website and the application on the same machine it works. I want this application to run on several workstations, all accessing the same mdf database along with the web site's users.
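From what I've read, the usual answer is not to share the .mdf as a file at all, but to attach it to one SQL Server (Express) instance on the server machine and have both the web site and the C# application connect to that instance over the network. A sketch (paths and names are examples):

-- run once on the server that will own the database
CREATE DATABASE aspnetdb
    ON (FILENAME = N'C:\Data\ASPNETDB.MDF')   -- local path on that server, not a share
    FOR ATTACH;

Both applications would then use a server-based connection string (Data Source=SERVERNAME\SQLEXPRESS;Initial Catalog=aspnetdb;...) instead of AttachDbFilename pointing at the file.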

View 7 Replies View Related

Sql Report Works Fine On Internal Servers - Hosed On External Servers - Need Some Help

Nov 21, 2007

I have a report that was designed using SQL Reporting Services that sits on a SQL reporting server. It's nothing too exciting, it is essentially a three page application with legal jumbo on pages 2 and 3 and applicant data in fields on page 1.

We use rectangles to force page breaks to page 2 and to page 3.

When running the report on the report server, it shows and prints fine.

When running the report from the QA website internally, it shows and prints just fine.

When running the report from the production website from a machine internally, it shows and prints just fine.

When running the report from outside of the company network, the report is jacked. It obliterates large chunks of text, crams text together, and creates blank pages.

I need help figuring out where to even begin troubleshooting this!

View 1 Replies View Related

Login Fails For Network SQL Server But Works For Localhost

Jul 18, 2005

I have an ASP.NET webform. This connection works:   "Server=localhost;uid=sa;pwd=;database=pubs"
but this connection DOES NOT work:  "Server=dnrsqlt1;uid=sa;pwd=;database=pubs"
dnrsqlt1 is a SQL Server on my network. Do I have to do something to the ASPNET or IUSER_Machinename users on the remote machine?
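Since the string uses the sa login, one thing I'd check (just a guess) is whether the remote instance allows SQL authentication at all, plus whether its network protocols (TCP/IP) are enabled. For example, run this on dnrsqlt1:

-- returns 1 if the instance is Windows-authentication only, 0 if mixed mode
SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS WindowsAuthOnly;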

View 6 Replies View Related

Recovery :: Cannot Create Listener For High Availability Group Of AlwaysOn On Cluster Environment

May 27, 2015

I am getting issues when creating a listener for AlwaysOn. The errors are shown below.

Can not bring  the Windows server fail over cluster (WSFC) resources online. (Error Code 5942). The WSFC service may not be running or may not be accessible in its currents states, or the WSFC resources may not be in a state that could accept the request.

For information about this error code see "system error code" in windows development documentation 

The attempt to create network name and IP address for the listener is failed. The WSFC service may not be running or may not be accessible in its currents states or the value provide for the network name and IP address may be incorrect. Check the state of the WSFC cluster and validate network name and IP address with network administrator. (Microsoft SQL Server error 41066) ...

View 2 Replies View Related

Recovery :: Need A Third Network / Subnet For AG Listener

May 20, 2015

I have set up a two-node SQL 2012 AlwaysOn cluster with a file share witness. I have configured two networks, one for heartbeat and a second for public traffic. My question is: do I need a third network/subnet for the AG listener? Which network do the SQL user databases in the AG group use to replicate between the replica servers? When I do the SQL backups, do I need to back up the primary and secondary replicas, or just back up the databases on the secondary replica?
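On the backup question, what I've seen suggested (a sketch, not gospel) is to set the availability group's backup preference and let each replica's backup job ask whether it is the preferred backup target, so the same job can be deployed on both nodes:

-- run inside the backup job on every replica; only the preferred one actually backs up
IF sys.fn_hadr_backup_is_preferred_replica(N'MyAgDatabase') = 1
    BACKUP DATABASE MyAgDatabase
        TO DISK = N'\\backupshare\MyAgDatabase.bak'   -- hypothetical share
        WITH COPY_ONLY, COMPRESSION;                  -- full backups on a secondary must be COPY_ONLY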

View 4 Replies View Related

Recovery :: Differential Backup Much Larger Than Database / Full Backup

Nov 16, 2015

I have a database that is just over 1.5GB, yet the full backup is 13GB. I'm not sure how that can be, since we have compression on for full backups and my other full backups are much smaller than their respective databases. The full backup is taken every Sunday night and the differentials are taken every 6 hours after the full backup. I have been thrown into this DBA role with little to no experience, just what I have picked up and read, so my understanding of backups is limited. What I think I understand is that we take a full backup and the differential only captures what changes in the database, so my question is: why is my database 1.5GB but my differential 15.4GB? I have other databases on the same instance that don't seem to have this problem. I also just noticed that we do not rebuild the index before a full backup like we do on other instances.
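One guess about the 13GB file: by default BACKUP appends to an existing backup file (NOINIT), so the .bak may contain many accumulated backup sets rather than one compressed backup, and the differentials then pile up the same way. Also, an index rebuild between the full and the differential touches almost every extent, which alone can make a differential approach the size of a full backup. Something like this would show what is in the file (path is an example):

-- list every backup set stored inside the file
RESTORE HEADERONLY FROM DISK = N'D:\Backups\MyDb.bak';

-- overwrite instead of append on the next full backup
BACKUP DATABASE MyDb TO DISK = N'D:\Backups\MyDb.bak' WITH INIT, COMPRESSION;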

View 13 Replies View Related






