I have developed a database on a local machine (Windows 98) and would like to move it to another machine running NT 4, where it will function as a production database. I have a mapped drive to the other server (called "labnt"). The drive on "labnt" is partitioned, with the C drive holding programs and the E drive designated to hold the data.
I would like to move the database (called SNB01) to the MSSQLData subdirectory on the E drive. I have tried the following to detach:
EXEC sp_detach_db 'SNB01', 'true'
And the following to reattach:
EXEC sp_attach_db @dbname = 'SNB01',
@filename1 = '\\labnt\new\mssqldata\SNB01_Data.mdf',
@filename2 = '\\labnt\new\mssqldata\SNB01_log.ldf'
The above path was given to me by our network admin.
Query Analyzer gives the following error:
Server: Msg 5105, Level 16, State 4, Line 1
Device activation error. The physical file name '\\labnt\new\mssqldata\SNB01_Data.mdf' may be incorrect.
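If the attach statement is run on labnt itself, the file names must be paths that server can open locally; SQL Server generally will not attach database files across a UNC path. A sketch using the E-drive layout described above (the exact directory name is an assumption):

EXEC sp_attach_db @dbname = 'SNB01',
@filename1 = 'e:\MSSQLData\SNB01_Data.mdf',
@filename2 = 'e:\MSSQLData\SNB01_log.ldf'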
MSSQL Server version: 2000. I am using attach and detach to move a database from one server to another. I detached the database from production and attached it to my test server, and everything was OK.
Questions about the syslogins and sysdevices data dictionary tables:
1. When I SELECT * FROM sysdevices on production, it shows every database's name, size, phyname, etc. After attaching to my test server I cannot see these entries. I know that's because it is a different master database, but how do I fix this?
2. syslogins: is there a way to move all the syslogins after attaching to the test server?
Please let me know if any of you have done anything like this.
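For what it's worth, a minimal sketch of re-linking orphaned database users after an attach on SQL Server 2000, assuming the matching logins already exist on the test server (the user name is illustrative):

-- list database users whose SIDs no longer match any server login
EXEC sp_change_users_login 'Report'
-- re-link one user to the login of the same name
EXEC sp_change_users_login 'Auto_Fix', 'someuser'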
I've just created an ASPNETDB database with ASP.NET security. Now I want to send this database to another computer. First I detached the database; then, when I used attach database on that computer, I got an error: Error 602: Could not find row in sysindexes for database ID 8, object ID 1, index ID 1. Run DBCC CHECKTABLE on sysindexes. Please help me. Thanks.
In SQL Server, when I right-click on Databases, the option to use Attach/Detach is greyed out, and I don't understand why. I'm running SQL 7 on a Win2000 server, with Win2K at SP2 and SQL at SP2.
Is it possible to detach a database from SQL Server 2000 and attach it to SQL Server 7? I know it's fine the other way around. Also, are there any adverse effects I should be aware of?
I want to move a database instance (about 50 user databases) to another server; both are running SQL Server 2000. Which method is better, restore from backup files or detach/attach? Some papers said restore may cause incorrect login and password transfer. Is that true?
An original database (e.g. A.mdf) was imported/exported to a different server as B.mdf. The detach of A.mdf was unintentionally not done. It seems that both databases were being updated.
I created a database in SSMS; I detached it and moved it into a subdirectory of a VB app. Now SQL Server says the database does not exist. If I recreate the database in SSMS, I can run an INSERT INTO using that database as the destination. However, this only works if I keep it in the MSSQL data directory. How can I move it to the app's subdirectory so that it can be deployed with the app and still work? BTW, I also tried creating the databases in VS2005, with the same results.
This is the connection string (I tried this and the non-Express version with the same results):
What problems could be caused by stopping SQL Server and copying a user database's data and log files over the existing files on another machine?
Example: MyDB exists on Server1 and on Server2.
1. Stop SQL Server on Server1 and Server2.
2. Copy Server1's MyDB files over Server2's MyDB files.
3. Start SQL Server on Server1 and Server2.
Make sense? This scenario seems to work, but I wonder what side effects could result.
I have a question about detach/attach. When we detach/attach a database, we have to copy both the .mdf and .ldf files to the target file paths, and when the database is large it takes quite a while to copy the physical files. In that case I don't really see the point of detach/attach; wouldn't backup/restore be faster? (Also, while the database is detached, the files are offline.)
I'm just confused about the way we transfer database files with detach/attach. I look forward to hearing any advice on this.
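For comparison, a minimal backup/restore sketch that leaves the source database online the whole time (database name, logical file names, and paths are illustrative):

BACKUP DATABASE MyDB TO DISK = 'e:\backup\MyDB.bak'
RESTORE DATABASE MyDB FROM DISK = 'e:\backup\MyDB.bak'
WITH MOVE 'MyDB_Data' TO 'e:\mssqldata\MyDB_Data.mdf',
MOVE 'MyDB_Log' TO 'e:\mssqldata\MyDB_Log.ldf'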
We are planning to change the location of the log files on my production server. We have also set up mirroring for those databases. I know that for the databases without mirroring I could detach the database, copy the log file to the target location, and attach the database with the new log file location, but for the mirrored databases I don't have any idea how to do that. Could somebody guide me through the process of how to do this task? Thank you very much for all your help!
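One possible approach (a sketch only; it assumes SQL Server 2005, that mirroring is removed for the database while the file is moved and then re-established, and that the names and paths are illustrative) is to repoint the log file with ALTER DATABASE rather than detaching:

ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Log, FILENAME = 'f:\newlogs\MyDB_Log.ldf')
ALTER DATABASE MyDB SET OFFLINE
-- copy the .ldf to f:\newlogs here, then:
ALTER DATABASE MyDB SET ONLINE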
We have a hardware problem and will be getting a new RAID HDD system.
I've asked the on-site tech to back up and detach the MSDB and then reattach it.
Can this be done? I expect to have him detach the new MSDB once MSSQL is re-installed, then replace that file with the previously detached version and run the reattach from QA(?)
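A sketch of the sequence, for reference: system databases normally refuse to detach, and Microsoft's guidance for SQL 2000 (KB 224071) is to start the server with trace flag 3608 first. The default file paths below are assumptions:

-- with the server started using -T3608:
EXEC sp_detach_db 'msdb'
-- replace the files with the previously detached copies, then:
EXEC sp_attach_db @dbname = 'msdb',
@filename1 = 'c:\Program Files\Microsoft SQL Server\MSSQL\Data\msdbdata.mdf',
@filename2 = 'c:\Program Files\Microsoft SQL Server\MSSQL\Data\msdblog.ldf'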
This might sound dumb, but BOL says you can use detach/attach to "MOVE" a database. Can you detach, copy the files to a new server (leaving the files on the original), and run the attach on both the new server and the old (effectively a way to copy)? I know how to use data transformations/import-export; I was just curious.
I have a similar question to the one Tim asked, but from a different angle, so I'll pose my question here too; perhaps somebody can help answer it.
I tried to use the detach and attach method to make a copy of a user database (not master; maybe we can do master too) on another server, but it looks like the login and permission credentials do not go along with it. I got various error messages when I tried to log in as a non-sa user after attaching the database on the other machine; I would be denied with a SQL timeout. It looks like the login and user do not exist on the new server because I did not move the master database too. But even if I recreate the login, when I try to associate the user with the login I get a message saying the user or role already exists in the database, and if I go back to check, the user does not show up in Enterprise Manager for that database.
So my questions would be: 1) How do I move logins and users together with a user database? 2) Do we have to move the master database, where the login info resides? 3) If so, how do we move the master database from one server to another when the other server already has a master database?
I'm trying to see the differences between backup/restore and attach/detach. I backed up and detached a database from SQL Server 2000 SP3, then attached and restored it to SQL Server 2005 SP1.
The differences I noticed are:
1. The restored database has a much larger initial size for data and log (the database size is the same).
2. The attached database has a last backup date.
3. If the backup is restored over an existing database, the restored database shows as owned by the owner of the database that was restored over, but syslogins and sysusers do not match.
I don't understand why #1 happens.
Are there any other differences between Backup/Restore and Attach/Detach?
A database move was accomplished by detaching the database from the legacy server and attaching it to the new server. The servers ran NT 4, SP6a, SQL 7 SP2. The new server had the same name and IP as the old server.
However, that process (detach/attach) does not bring over the users (who had access to the database on the legacy system) correctly.
Under Databases > Roles, the users existed. Under Security > Logins, they did not.
We were obliged to drop the roles and recreate the logins and assign new roles. In the case of an NT user group we were unable to drop it. We had to create a new one and add the users to that group under User Manager.
So far I have not found anything that deals with the user account/role problem.
We have to move a similar database tonight. Do any of you know of an alternative "move" that brings the user roles/logins across correctly?
Hi! I did:
alter database mydb set single_user with rollback immediate;
exec sp_detach_db @dbname='mydb', @keepfulltextindexfile='true';
Then I tried to copy the files to a new location on other drives on the same server, but got: "Cannot copy <myfile>: Access is denied. Make sure the disk is not full or write-protected and that the file is not currently in use."
I also tried renaming the files, without success. I also tried with the database service stopped (not preferred), also without success.
How can I find out which process is locking the files? Best regards
If I detach a database on a server, then attach the database with a single file, using only the data file... will I lose any transactions that were in the original log file? There is a debate going on here at work, where some people think that during the detach, transactions in the log file are saved to the data file. Thanks!
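For reference, a single-file attach on SQL 2000 looks like the sketch below (name and path illustrative). My understanding is that a clean detach checkpoints the database first, and the single-file attach then builds a brand-new log rather than recovering the old one, which is worth verifying before relying on it:

EXEC sp_attach_single_file_db @dbname = 'MyDB',
@physname = 'e:\mssqldata\MyDB_Data.mdf'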
An attempt to attach an auto-named database for file C:\Web\App_Data\aspnetdb.mdf failed. A database with the same name exists, or the specified file cannot be opened, or it is located on a UNC share. My web.config file has these connection strings:
<connectionStrings>
<add name="POAdatabaseConnectionString1" connectionString="Data Source=sss;Persist Security Info=True;Initial Catalog=POAdatabase.mdf;Integrated Security=SSPI" providerName="System.Data.SqlClient" />
</connectionStrings>
I can't really find the statement that attaches aspnetdb.mdf, and I don't know where to look for it. Is there another web.config file on the C drive? Thanks for the help!
I have a DTS setup that copies a couple of databases from one server to another in order to have a hot backup server. When I execute the DTS package from the DTS portion of the Enterprise Manager tree it runs fine. When I schedule it, it fails with the following error:
Much of the data in the Comments field already has carriage returns in it. When I use bcp out with -c, it uses the newline character to figure out where a new row starts. However, the data in that column has carriage returns! Hence, when it detects a carriage return in the user data (the comments) itself, it pushes the remainder of that text onto a second line. Then, when I try to BCP IN, it tries to push that text into the ID column!
The text file looks something like this:
ID  COMMENTS
--  --------
1   This is a very long comment with a line return now. User hits return here.
    This should be part of the previous record but gets detected as a new row
2   Some Text
3   Some Text
So, on the BCP IN, it tries to put 'This should be part of...' into the ID column and I get an error.
Is there any way I can get BCP to NOT treat a newline in user data as the end of a record?
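One common workaround (a sketch; the table, server, and terminator strings are illustrative) is to export and import with field and row terminators that cannot occur in the data, so embedded carriage returns are no longer taken as row breaks:

-- run from T-SQL for convenience; -t sets the field terminator, -r the row terminator
EXEC master..xp_cmdshell 'bcp MyDB.dbo.Comments out c:\comments.dat -c -t "|~|" -r "|~~|" -S MyServer -T'
EXEC master..xp_cmdshell 'bcp MyDB.dbo.Comments in c:\comments.dat -c -t "|~|" -r "|~~|" -S MyServer -T'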
We've had RS running on an internal network for a while now and initially installed it with no SSL certificate. We recently decided that we would like to make some reports available through the web, and that we should secure the data with SSL. So, we registered and set up the subdomain, bought and installed the certificate, closed off all but port 443 on the box in question, and opened it up to the world. We tweaked a few options in the RS config files (after reading around):
RSReportServer.config:
- SecureConnectionLevel changed from 0 to 2 (recommended); this changed all but the help file links to https://.
- UrlRoot changed from HTTP to HTTPS, and also changed from the internal to the external name (to match the SSL certificate).
RSWebApplication.config:
- ReportServerUrl changed from HTTP to HTTPS, and changed from the internal to the external name.
After this was changed, along with some IIS tweaks, we were able to get to Report Manager through the web and force SSL only. The problem right now is that we are intermittently getting "Underlying connection closed" errors (described here: http://support.microsoft.com/kb/915599). The odd thing is that we first get the Windows login prompt, wait about a minute, and then get the error (which is encapsulated by the Report Manager page style). After a quick browser refresh, Report Manager appears almost instantly, with no second request for a Windows login.
My question: are there any IIS or RS config settings I can look at, or any log file entries I should look for, in order to determine the cause of this problem? My guess at this point is that the error has to do with some sort of timeout, but to be honest, I have no clue.
I want to be able to intersect many tables. I am building my query from VB code in ASP.NET, based on key fields entered in a search engine box. My query should look like the one below, which should return the rows that have the values var, var_2, and var_3 in any columns. All three must be in a row for it to be a hit. I can't get this to work in MS SQL, and I don't know if it supports this feature.

select * from t where column1 like '%var%' or column2 like '%var%' or column3 like '%var%'
intersect
select * from t where column1 like '%var_2%' or column2 like '%var_2%' or column3 like '%var_2%'
intersect
select * from t where column1 like '%var_3%' or column2 like '%var_3%' or column3 like '%var_3%'

I also googled around and found WHERE EXISTS, but can't seem to figure out how to do multiple tables:

select * from t where exists (select * from t where column1 like '%var_3%' or column2 like '%var_3%' or column3 like '%var_3%')

I would like to add multiple conditions to the WHERE EXISTS table. Could anyone please tell me what I am doing wrong?
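A sketch of an equivalent that avoids INTERSECT (which SQL Server did not support before 2005): AND together one OR-group per search term, so a row qualifies only when every term matches some column. Note that _ is itself a LIKE wildcard, so var_2 and var_3 may need an ESCAPE clause if the underscore is meant literally:

SELECT *
FROM t
WHERE (column1 LIKE '%var%' OR column2 LIKE '%var%' OR column3 LIKE '%var%')
AND (column1 LIKE '%var_2%' OR column2 LIKE '%var_2%' OR column3 LIKE '%var_2%')
AND (column1 LIKE '%var_3%' OR column2 LIKE '%var_3%' OR column3 LIKE '%var_3%')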
Taking the advice from one of the postings, I removed the 'truncate log on checkpoint' setting. I have set up a batch job that does a DBCC CHECKTABLE(syslogs) and a DUMP TRANSACTION WITH NO_LOG every 15 minutes. We are a development shop and do not need the transaction log; I would have liked to use 'truncate log on checkpoint'. By viewing DBCC SQLPERF and sp_spaceused, I found that the transaction log seems to grow indefinitely if I do not first issue the CHECKTABLE command. Why is this needed? This is the same problem I had with the 'truncate log on checkpoint' option. Has anyone else encountered a similar problem? We have an application that does multithreaded single-row inserts at a rate of 2500 rows a second. Without performing the DBCC CHECKTABLE, the log filled to over 80% in a 45-minute period. Running the stored procedure with the two commands, the log stays under 40%.
I'm trying to get my jobs to send mail when a job fails. This should be easy, but it's giving me a headache.
I've had a whole slew of issues. Outlook is installed, with an Outlook mail profile set up that can send mail from Outlook, and I can create a SendMail DTS package and execute it to send mail.
Email works in these scenarios:
1. I create a DTS package in SQL Server with just a SendMail task using the same profile "ABC", click Execute, and it sends.
2. I can send using Outlook with the same profile "ABC".
3. I can run DTSRun from the command prompt calling the DTS package, and it sends fine.
However, it won't send in these three scenarios (where I need it!):
1. I go to Operator, put my profile "ABC" in the Email Name, and click Test, and I get this error: "Error 22022: SQLServerAgent Error: The SQLServerAgent mail session is not running; check the mail profile and/or the SQLServerAgent service startup account in the SQLServerAgent Properties." SQL Server Agent is running.
2. If I add the DTS package "EmailTest" to one of my jobs as a step to go to if the first step fails, nothing gets sent.
3. In Jobs > Notifications, if I set the E-mail operator to my operator, nothing gets sent.
I set the Mail Profile (in SQL Server Agent's properties) to that mail profile. However, when I click Test, I get this error: "Error 0: Cannot load the DLL sqlmap70.dll, or one of the DLLs it references."
I am using SQL Server version 8.00.194 on Windows 2003.
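For anyone comparing notes: SQLAgentMail (used by operators and job notifications) starts its own MAPI session, separate from both SQL Mail and DTS SendMail, which is why the DTS test can succeed while the Agent test fails. A quick server-side sanity check of the profile (the recipient address is illustrative):

-- start a SQL Mail session with the same profile the Agent is configured to use
EXEC master.dbo.xp_startmail @user = 'ABC'
EXEC master.dbo.xp_sendmail @recipients = 'me@example.com',
@subject = 'SQL Mail test', @message = 'test'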
I don't know if anyone can help me, but I have two issues with SQL Server 2000 SP4 (version 8.00.2039, Desktop Engine) running on W2K and W2K3. I'm also running SQLXML 3.0 (the msxml2.dll version is 8.30.9530.0).
Is it me or is sp_xml_preparedocument a crippled fat dog that is blind?...not that I have anything against crippled fat dogs that are blind :)
In all the stored procs I have developed, I pass a text variable as an input parameter and return an IStream to ADO (using the SQLXML provider) in COM+. All has been very well and fine... until the passed text parameter resembles a data object of any decent size.
The first error I noticed was "XML Parsing Error: not enough storage is available to complete this operation". I thought I would debug logically in a stepwise fashion: just prepare the doc first and return, then return after the next segment of code, and so on, to find out where the issue is. I was amazed to find that sp_xml_preparedocument takes 7 seconds to load a simple 1 MByte text input variable and around three minutes to load a 7 MByte file.
I believe these long load times are causing issues with transaction timeouts, etc., so I thought I would try to solve the speed issue with sp_xml_preparedocument first and then see if the "XML Parsing Error" continues.
So, my first question is:
Should sp_xml_preparedocument take 7 seconds to load a 1 MByte text variable and nearly three minutes to load a 7 MByte file? Surely there is something wrong somewhere?
I'm also running these tests on two machines: one a 2 GHz P4 and the other a 2.4 GHz P4.
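For reference, the pattern the procs presumably follow is sketched below (the proc name, OPENXML path, and columns are illustrative). One thing worth checking: if sp_xml_removedocument is not called after every parse, MSXML document memory accumulates in the server, which could plausibly contribute to the "not enough storage" error:

CREATE PROCEDURE dbo.LoadXml @xmltext ntext AS
BEGIN
    DECLARE @hdoc int
    -- parse the document once
    EXEC sp_xml_preparedocument @hdoc OUTPUT, @xmltext
    SELECT id, name
    FROM OPENXML(@hdoc, '/root/row', 2) WITH (id int, name varchar(50))
    -- always release the handle, or MSXML memory accumulates
    EXEC sp_xml_removedocument @hdoc
END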
SELECT T1.*
FROM Cust_Table T1
INNER JOIN (
    SELECT Family_Name
    FROM Cust_Table
    WHERE Cust_Name IN ('Billy', 'John') AND RowNum < 100
    GROUP BY Family_Name
    HAVING COUNT(*) > 1
) T2 ON T1.Family_Name = T2.Family_Name
WHERE RowNum < 100
(The code above finds all the families that contain multiple Billys and/or multiple Johns and displays all the duplicates, ordered by Family_Name.)
The problem is that what I want it to do is search through the whole table and find, within each family (Family_Name), who has both a sibling called Billy AND one called John (Cust_Name). Whether they have multiple Johns or multiple Billys I don't mind, as long as they have a minimum of one of each. I then want to output only those rows and nothing else. An example of the table is below; I hope this helps. Thanks for your advice.
BEFORE
Family_Name  Cust_Name
Bruce        Billy
Bruce        John
Bruce        Mike
Bruce        Oli
Smith        Billy
Smith        Billy
Harold       John
AFTER
Family_Name  Cust_Name
Bruce        Billy
Bruce        John
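A sketch of one way to get exactly that output (column names as in the sample): restrict to the two names of interest, and keep only families where both distinct names appear.

SELECT T1.Family_Name, T1.Cust_Name
FROM Cust_Table T1
INNER JOIN (
    SELECT Family_Name
    FROM Cust_Table
    WHERE Cust_Name IN ('Billy', 'John')
    GROUP BY Family_Name
    HAVING COUNT(DISTINCT Cust_Name) = 2
) T2 ON T1.Family_Name = T2.Family_Name
WHERE T1.Cust_Name IN ('Billy', 'John')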