I have performed several restores due to poor equipment (in the past where I used to work; now I have a brand new HP); this database has never crashed. I read an article in which the author said a good DBA should be doing a test restore at least once a month. I can see how that would be extremely important: since I test my UPS and my genset weekly, why not ensure that the data is good as well?
I need to restore a test DB from a production backup, but once it is restored I need all of the permissions of the SQL logins and Windows AD accounts to be intact in the test DB, just as they were before.
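Permissions granted inside the database come across with the restore itself; what usually breaks are SQL logins whose database users become "orphaned" because the SIDs differ between servers (Windows AD accounts keep the same SID, so they normally keep working as long as the login exists on the test instance). A hedged sketch, assuming the test database is called TestDB and the matching SQL logins already exist on the test server; the user name in the ALTER USER line is a placeholder:

USE TestDB;

-- List database users whose SID no longer matches any server login (orphaned users)
SELECT dp.name AS orphaned_user
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp ON sp.sid = dp.sid
WHERE dp.type IN ('S', 'U')   -- SQL users and Windows users
  AND dp.name NOT IN ('dbo', 'guest', 'sys', 'INFORMATION_SCHEMA')
  AND sp.sid IS NULL;

-- Re-link one orphaned user to the login of the same name
ALTER USER SomeAppUser WITH LOGIN = SomeAppUser;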
I have one maintenance plan for a full backup that runs at midnight daily, but somehow another one runs at 11:40 PM that I have no plan for. I can see it has happened twice by opening the job history. They all use the same maintenance plan.
The only difference I can see is in the message: one says "The job succeeded. The job was invoked by Schedule 112 (Daily Backup.subplan_1)", while the one I did not expect says "The job succeeded.
The job was invoked by user sa". How do I find the job that was invoked by user sa and delete it? Again, I can only see one job for the full backup, but I can see it ran twice in the job history.
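To track it down, the job history in msdb can be searched directly for outcome rows that mention being invoked by user sa; the job name returned tells you which job to look at (or delete, if it really is a stray duplicate). A hedged sketch:

SELECT  j.name AS job_name,
        h.run_date,
        h.run_time,
        h.message
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE   h.step_id = 0                           -- job outcome rows only
  AND   h.message LIKE '%invoked by user sa%'
ORDER BY h.run_date DESC, h.run_time DESC;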
I administer about 100 databases. I back them up to files on the server hard drive every night. Once a month I would like to test restore the backups. Due to the huge number of databases it is almost impossible to manually test restore them one by one, so I came up with an automated script to do it. I have a database called testrestore; I restore each backup file to it, get the count of certain crucial tables, throw the counts into a different table for later comparison, and then replace the database with the next backup file. I need to run this script on production. Do you think it is okay to test restore 100 databases one after the other using the REPLACE option? Can it cause any memory issues? Is there any other way to test restore such a huge number of databases? Suggestions are welcome.
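For reference, here is a hedged sketch of what one iteration of that loop might look like; the backup path, the logical file names (which would come from RESTORE FILELISTONLY), the crucial table and the dbo.RestoreCheckResults logging table are all placeholders:

-- Overwrite the scratch database with the next backup in the list
RESTORE DATABASE testrestore
FROM DISK = N'D:\Backups\SomeDb_full.bak'
WITH REPLACE,
     MOVE N'SomeDb_Data' TO N'D:\TestRestore\testrestore.mdf',
     MOVE N'SomeDb_Log'  TO N'D:\TestRestore\testrestore_log.ldf';

-- Capture a row count from a crucial table for later comparison
INSERT INTO dbo.RestoreCheckResults (DatabaseName, TableName, RowCnt, CheckedAt)
SELECT N'SomeDb', N'dbo.ImportantTable', COUNT(*), GETDATE()
FROM testrestore.dbo.ImportantTable;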
I am trying to copy a production DB (26.5 GB) with a 3 GB log from production to a test server. The prod DB name is EDD_Cat; the data (.mdf) resides on one logical drive and the log (.ldf) on another. The test server does not have the same physical RAID allocation, and the only way I can get that much space is to spread the data across 3 logical drives. I have preallocated a database called EDD_CatT with the same total physical DB size. I have not been successful in restoring from a SQL backup device (copied from production) to the new test DB. Here are my T-SQL statements and the error:
Restore Database EDD_Catt from Iloc01bkp with File = 2,
    Move 'EDD_Cat_dat' to 'D:\Mssql7\Data\EDD_Cat.mdf',  Replace,
    Move 'EDD_Cat_dat' to 'E:\Mssql7\Data\EDD_Cat2.ndf', Replace,
    Move 'EDD_Cat_dat' to 'F:\Mssql7\Data\EDD_Cat3.ndf', Replace,
    Move 'EDD_Cat_log' to 'G:\Mssql7\Data\EDD_Log1.ldf', Replace
start db restore --------------------------- 2001-01-02 12:23:31.610
(1 row(s) affected)
Server: Msg 3257, Level 16, State 1, Line 0
There is insufficient free space on disk volume 'E:' to create the database. The database requires 20447232000 additional free bytes, while only 1732972544 bytes are available.
Server: Msg 3013, Level 16, State 1, Line 0
Backup or restore operation terminating abnormally.
I also tried using EM but basically got the same type of error.
I could do this with SQL 6.5 as long as the db size was the same or larger.
Any advice or suggestions would be greatly appreciated. BOL and the manuals I have only seem to give examples with one file for the data and another for the log; I could not find an example of what I am trying to do.
Thanks much for your time. Calvin Matsumoto - State of California
Can anybody give me an idea, or a script, that can be used to restore a production database to a test database on another server? I need to do this 3 days a week, so I would like to automate it.
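A hedged sketch of the restore step, which could then be wrapped in a SQL Agent job step and scheduled for the three days; the UNC path, database name and logical file names are placeholders, and the SQL Server service account on the test server needs read access to the share:

RESTORE DATABASE TestCopy
FROM DISK = N'\\ProdServer\Backups\ProdDb_full.bak'
WITH REPLACE,
     MOVE N'ProdDb_Data' TO N'E:\SQLData\TestCopy.mdf',
     MOVE N'ProdDb_Log'  TO N'F:\SQLLogs\TestCopy_log.ldf',
     STATS = 10;   -- progress message every 10 percent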
Please advise me on restoring a master database from production to a test server.
The reason for doing this is that I need to test and evaluate some logins in the master database.
I tried to restore the master database to the test server, but I got errors about user databases that do not exist on the test server. I don't want to restore the user databases; I only need the master database to evaluate the user logins.
I am trying to imitate a DR situation where the primary DB is down and I need to recover the secondary DB on another server. They are a log shipping pair, so to imitate a DR I remove the log shipping in the primary server maintenance plan. Then I go to the secondary server, disable the log shipping jobs there, and attempt to do the following:
RESTORE DATABASE database_name WITH RECOVERY
but I can't get exclusive use because the database is in use. But I don't see any other users... Am I wrong in thinking that the log shipping was completely deleted? Is there anything I can do to force exclusive access?
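As a hedged sketch, the query below shows who is still holding the database open (a standby secondary accepts read-only connections, so there may be sessions you have not noticed); after killing those sessions the recovery should go through. StandbyDb is a placeholder and the commented KILL line is only an example of what you would run for each spid returned:

SELECT spid, loginame, program_name
FROM master.dbo.sysprocesses
WHERE dbid = DB_ID('StandbyDb');

-- KILL 53;   -- repeat for each spid returned above, then recover:

RESTORE DATABASE StandbyDb WITH RECOVERY;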
I have two SQL databases on separate servers, live and test. I have been asked to copy the data from the live system and put it into test. Both are administered with SQL Server Management Studio 2008 and run on Windows Server 2008 R2.
Would a simple backup of the database, copying that file to the test system, and restoring the database from there work, or is there more to it?
In production I have a database maintenance plan that runs the nightly backups. I want to restore one of these backups to a completely different server I am setting up for testing. What is the best way to restore a backup taken on one server (using the maintenance plan) onto another server? I tried just mapping a drive and then forcing a restore, but I got an error message about an ID or something. Maybe I need to do it through Query Analyzer, and there is an easier way to force it? Thanks!
I am trying to restore from a file that was created using the Database Maintenance Wizard on a server that has now totally crashed.
I want to use T-SQL to restore the physical file to a new name and location. Using the RESTORE command, I get an error saying that I must use the WITH MOVE option.
What are the required steps to restore from a physical file (with a .bak extension)?
I do not know the original database/log file names. Any advice?
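A hedged sketch of the two steps: read the logical file names out of the backup first, then plug them into the MOVE clauses; all paths and names below are placeholders:

-- Step 1: find the logical data/log file names stored in the backup
RESTORE FILELISTONLY FROM DISK = N'D:\Backups\OldServerDb.bak';

-- Step 2: restore to a new name/location using those logical names
RESTORE DATABASE NewDbName
FROM DISK = N'D:\Backups\OldServerDb.bak'
WITH MOVE N'OldDb_Data' TO N'E:\SQLData\NewDbName.mdf',
     MOVE N'OldDb_Log'  TO N'F:\SQLLogs\NewDbName_log.ldf';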
I am working towards automating the process of testing our backups. For the meantime I do it all manually: I copy the backup files (full + transaction logs) to our test server and then run the restore script. Once the database is restored I run DBCC CHECKDB. I manually upload the CHECKDB results to our SharePoint portal as proof that the backup file is intact with no errors.
Here are some ideas I have but have not yet tested:
Create a maintenance plan with 3 jobs:
--> PowerShell script to copy the files from the Prod server to the Test server - add this script to Job1
--> PowerShell script to restore the database files - add this script to Job2
--> Run the DBCC in PowerShell (yet to find out whether this is possible in PS) - add this script to Job3
I would like to use separate jobs so as to get a report on the duration and status of each job.
I would also like to get the results of the DBCC CHECKDB, as proof that no errors were found, for upload to our SharePoint portal. I don't know if that is possible via the job (a T-SQL sketch for that step follows).
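Here is a hedged T-SQL sketch for the Job3 idea: run DBCC CHECKDB and record a pass/fail row in a logging table that could later be published to SharePoint. The database name, the dbo.CheckDbLog table and its columns are hypothetical, and whether every corruption error transfers control to the CATCH block should be verified on your version before relying on it.

BEGIN TRY
    -- NO_INFOMSGS keeps the output to errors only; an error here means corruption
    DBCC CHECKDB (N'testrestore_target') WITH NO_INFOMSGS;
    INSERT INTO dbo.CheckDbLog (DatabaseName, CheckedAt, Outcome)
    VALUES (N'testrestore_target', GETDATE(), 'OK');
END TRY
BEGIN CATCH
    -- Record the error message so the report shows what failed
    INSERT INTO dbo.CheckDbLog (DatabaseName, CheckedAt, Outcome)
    VALUES (N'testrestore_target', GETDATE(), ERROR_MESSAGE());
END CATCH;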
Howdy; I've tried this in the 'Tools' area, but that didn't work too well. I suspect I will have to generate T-SQL code and then schedule it as a job. Why I can't just drag and drop with basic desires is beyond me, but that capability probably does exist somewhere.
Anyway, here is the problem (this server has many databases, on SQL 2000 SP2):
1. The user only wants me to use Monday morning's full backup, which is good in that it doesn't include transaction logs.
2. Restore that data over top of / into the Development DB = good, no data to worry about damaging.
3. The user does NOT want me to do this by hand, but wants it scheduled.
OK: a. I must do a RESTORE WITH FILELISTONLY from... what? master? And if I use the *.bak of the production backup, it has a date coded into the file name, so I would, I guess, have to generate all sorts of wonderful code to find the date and build a file name, because using FROM DISK = 'F:MSSQLBACKUPDBPRODUCTION_yyyyddmm.BAK' is not going to work with a wildcard. Can I do a file lookup using a 'PRODUCTION' prefix into a variable and then use that, or should I look for the latest file date (remember there are several database backups here), or...?
Then, how does one schedule such a piece of T-SQL? Do I save it to some text file and invoke it using a job scheduler?
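Below is a hedged sketch of one way to do both steps on SQL 2000: use xp_cmdshell to find the newest PRODUCTION_*.BAK file, then build and execute the RESTORE dynamically. The folder F:\Backups, the Development database name and the logical file names (Production_Data / Production_Log, which you would confirm with RESTORE FILELISTONLY) are all placeholders. Once it works, paste the script into a SQL Agent job step and schedule it for Monday mornings.

DECLARE @file nvarchar(260), @sql nvarchar(4000)

-- Capture the directory listing, newest file first (/o-d = reverse date order)
CREATE TABLE #dir (id int IDENTITY(1,1), fname nvarchar(260) NULL)
INSERT INTO #dir (fname)
EXEC master.dbo.xp_cmdshell 'dir /b /o-d F:\Backups\PRODUCTION_*.BAK'

SELECT TOP 1 @file = fname
FROM #dir
WHERE fname LIKE 'PRODUCTION_%.BAK'
ORDER BY id

-- Build the restore over the top of the Development database
SET @sql = N'RESTORE DATABASE Development '
         + N'FROM DISK = N''F:\Backups\' + @file + N''' '
         + N'WITH REPLACE, '
         + N'MOVE N''Production_Data'' TO N''D:\MSSQL\Data\Development.mdf'', '
         + N'MOVE N''Production_Log'' TO N''D:\MSSQL\Data\Development_log.ldf'''

EXEC (@sql)

DROP TABLE #dir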
We are setting up a test lab environment with 100 machines. We want one master testing DB that gets replicated to each machine to run scripted application tests nightly.
My goal is to minimize the amount of work needed to move this thing to each of the 100 test machines. I am wondering whether we even need to have SQL local on each machine, or whether we should invest in a monster DB server holding 100 copies of the restored DB with each test machine pointing to its own copy, or whether I should use DB mirroring or something to get the master test DB out to each of those machines instead.
Now that we have a good programming model in SSIS, the question is whether to write automated unit tests for your packages, and whether that is generally a good idea for packages.
Also, if the answer is yes, where can I find more information on how to accomplish that?
Hi everyone, I need to test an SSIS package which will import data from a different database where the record count is around 5 million. I am planning to test it through C# code as well as manually. SSIS source: consists of 7 tables. SSIS destination: consists of 7 tables. Using C# code I am trying to run the SSIS package through a batch file. I am putting the expected row count and column count in an Excel file and comparing them with the destination tables by writing queries using ADO.NET. Am I going the right way? Can anyone suggest the best and most productive way to test the SSIS package? What are the other things I need to test? Can anyone add test cases to the list below? (A row-count comparison sketch follows the list.)
Test cases:
1. Verify all the tables have been imported.
2. Verify all the rows in each table have been imported.
3. Verify all the columns specified in the source query for each table have been imported.
4. Verify all the data has been received without any truncation in each column.
5. Verify the schema at source and destination.
6. Verify the time taken / speed of the data transfer.
7. Check for fields truncated due to a difference in field length at the destination.

Regards, Arif Shareef
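For test cases 1 and 2 above, here is a hedged T-SQL sketch of a per-table row-count comparison. SRC is an assumed linked server pointing at the source system; SourceDb, DestDb and the table names are placeholders for the seven tables in the package. If a linked server is not available, the two halves can be run separately and compared in the C# harness instead.

SELECT  s.tbl,
        s.src_rows,
        d.dest_rows,
        CASE WHEN s.src_rows = d.dest_rows THEN 'PASS' ELSE 'FAIL' END AS result
FROM (
        SELECT 'dbo.Customers' AS tbl, COUNT(*) AS src_rows FROM SRC.SourceDb.dbo.Customers
        UNION ALL
        SELECT 'dbo.Orders', COUNT(*) FROM SRC.SourceDb.dbo.Orders   -- repeat for all 7 tables
     ) AS s
JOIN (
        SELECT 'dbo.Customers' AS tbl, COUNT(*) AS dest_rows FROM DestDb.dbo.Customers
        UNION ALL
        SELECT 'dbo.Orders', COUNT(*) FROM DestDb.dbo.Orders         -- repeat for all 7 tables
     ) AS d ON d.tbl = s.tbl;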
I have a question that I hope someone can clear up for me. I have come across a number of different suggestions on DB maintenance, for example reindexing with the following script:
USE DatabaseName  -- Enter the name of the database you want to reindex

DECLARE @TableName varchar(255)

-- Cursor over every user (base) table in the database
DECLARE TableCursor CURSOR FOR
    SELECT table_name FROM information_schema.tables WHERE table_type = 'BASE TABLE'

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Rebuild all indexes on the table with a fill factor of 90
    DBCC DBREINDEX (@TableName, ' ', 90)
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor
My question is, doesn't the maintenance plan have this functionality inherent in it when you create the maintenance jobs to reindex? Is there a benefit to scripting things out vs just using the maintenance plan wizard for this sort of thing and any of the items it covers? I came from an Oracle background where this was a no-brainer but I am a bit confused on the choices with SQL Server.
I am testing some maintenance-task SQL commands such as index rebuild, index reorg, update statistics, and DB integrity check on a SQL Server 2014 database. This is a new non-production vendor database (DB size 500 GB, log size 25 GB) which will eventually be created in production. Currently it is in the full recovery model and without log backups. The database has a whole lot of indexes. I am trying to rebuild and reorganize all the indexes (that need it), and also to get an idea of how long these maintenance tasks will take and the space needed in the log file to complete them. I would like to execute these tasks manually (the first time) to gather the duration and space information. Eventually I would probably schedule a weekly job to perform this maintenance.
I ran the index rebuild task on the database and noticed that the log file grew by over 50 GBs. I killed the process and truncated and shrunk the log file back down.
1. Do the index rebuild, index reorg, update statistics, and DB integrity check commands all use the log file?
2. Does an index reorg have less impact on the log file than an index rebuild?
3. Should a truncate log and shrink log file be performed after these maintenance commands?
4. Should a full database backup be performed after these maintenance commands, or before them?
I have read and understand that shrinking is not good for the database (it can lead to more fragmentation and more data file growth when data is added), and I know about rebuilding indexes when fragmentation is greater than 30% and reorganizing indexes when fragmentation is between 5% and 30%.
Since this is a non-production database maybe I should set the recovery model to simple, run the maintenance commands and leave the database in simple recovery model unless the vendor needs it in full recovery model for some unknown reason.
5. With the simple recovery model the log file should be reused in a circular manner and not grow during these maintenance tasks. Is this correct?
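As a starting point for deciding between rebuild and reorganize, here is a hedged sketch that lists every index in the current database with its fragmentation and the action the 5% / 30% rule of thumb above would suggest; the 1000-page floor for ignoring tiny indexes is an assumption you can adjust.

SELECT  OBJECT_SCHEMA_NAME(ips.object_id) + '.' + OBJECT_NAME(ips.object_id) AS table_name,
        i.name AS index_name,
        ips.avg_fragmentation_in_percent,
        ips.page_count,
        CASE WHEN ips.avg_fragmentation_in_percent > 30 THEN 'REBUILD'
             WHEN ips.avg_fragmentation_in_percent > 5  THEN 'REORGANIZE'
             ELSE 'NONE'
        END AS suggested_action
FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN    sys.indexes AS i
          ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE   ips.index_id > 0          -- skip heaps
  AND   ips.page_count > 1000     -- ignore tiny indexes
ORDER BY ips.avg_fragmentation_in_percent DESC;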
While migrating Reporting Services from SQL Server 2005 to 2014, I am trying to restore the encryption key in the Reporting Services Configuration Manager on the 2014 side, but I cannot click the 'Restore' button. Should I be granted more rights to do so, or is some other action needed?
In Windows Server 2012, how do I do a System Restore to a previous restore point? I need to install the 64-bit and 32-bit Oracle client for connections in SSIS and to create Oracle linked servers.
If you make a mistake it is not fun removing it. Sometimes it corrupts the machine and it is difficult to uninstall, since there is no Oracle Universal Installer for Oracle 11g. If you install the 32-bit before the 64-bit you mess up the machine. Also, how do I create a restore point?
I am looking for a SQL backup/restore tool which can restore multiple environments. Here are the high-level requirements:
1. We have 4 DBs, ranging from 1 TB to 1.5 TB each. When we restore to QA, DEV, or Staging, we usually restore all 4 of them.
2. I am looking for the restore of all 4 DBs to complete within 1 - 2 hours.
I am evaluating the Dephix software, but the setup is very complex and it has given us a lot of issues with Windows authentication and failures in the middle of the backup. I used Guess Software many years ago but can't find it on their web site any more. Speed is very important for us, meaning we need to complete the restore as fast as possible. We are on SQL 2012 and SQL 2008 R2. We are currently using NetApp technology and I have the Redgate backup tool, but I am mainly looking for a fast restore process.
Would someone please help me with the syntax for running RESTORE FILELISTONLY or RESTORE VERIFYONLY on a SQL backup which has multiple file sets? My backup locations are as follows:
RESTORE VERIFYONLY From disk = 'E:syndicated_databank__bkup_01.bak', 'E:syndicated_databank__bkup_02.bak', 'E:syndicated_databank__bkup_03.bak', 'E:syndicated_databank__bkup_04.bak', 'E:syndicated_databank__bkup_05.bak'
When I tried a restore with the above, I got the error: The label 'E' has already been declared. Label names must be unique within a query batch or stored procedure.
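A hedged reading of that error: the last three file names were wrapped in curly "smart" quotes (probably picked up from copy/paste), so the parser does not treat them as strings, and each additional file in a striped backup also needs its own DISK = keyword. A sketch of the syntax, assuming the files sit in the root of E::

RESTORE VERIFYONLY
FROM DISK = 'E:\syndicated_databank__bkup_01.bak',
     DISK = 'E:\syndicated_databank__bkup_02.bak',
     DISK = 'E:\syndicated_databank__bkup_03.bak',
     DISK = 'E:\syndicated_databank__bkup_04.bak',
     DISK = 'E:\syndicated_databank__bkup_05.bak';

-- RESTORE FILELISTONLY takes exactly the same FROM list.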
I have seen this before: a SQL 2000 restore fails, leaving the database thinking it is still being restored, and the restore job fails and errors when it is restarted. EM is clueless. I believe there is a proc to reset some flag. Can you share it with me?
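If the data portion of the restore actually completed and only recovery is pending, the usual way to clear the loading/restoring state is simply to recover the database; if the restore died partway through, it generally has to be run again from a good backup. A hedged one-liner (DbName is a placeholder):

RESTORE DATABASE DbName WITH RECOVERY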
I am trying to test whether my code is returning rows. If it's not, I want to display an error saying "Nothing Found". Please review and give me your thoughts on the best way to accomplish this.

Protected Sub btnLogin_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles btnLogin.Click
    Dim SubEmail As String
    Dim SubPassword As String
    SubEmail = txtNewsEmails.Text
    SubPassword = txtNewsPassword.Text
    Session("NewsEmail") = SubEmail
    Session("NewsPassword") = SubPassword
    Dim sID As Integer

    Dim cs As String = ConfigurationManager.ConnectionStrings("csTiPs3").ConnectionString
    Dim cn As SqlClient.SqlConnection = New SqlClient.SqlConnection(cs)
    cn.Open()
    Dim selectString As String = "Select SubscriberID from NewsletterSubscribers WHERE SubscriberEmail = '" + SubEmail + "' AND SubscriberPassword = '" + SubPassword + "'"

    Dim cmd As SqlClient.SqlCommand = New SqlCommand(selectString, cn)

    Dim reader As SqlDataReader
    reader = cmd.ExecuteReader
    While reader.Read()
        sID = reader("SubscriberID")
    End While
    Session("SubscriberID") = sID

    reader.Close()

    rtsNewsletters.SelectedIndex = 1
    rtsNewsletters.FindTabByText("Subscribe").Enabled = True
    rmpNewsletters.SelectedIndex = 1

End Sub
I recently lost my job and want to do some test development to keep my skills up to date. The problem is I don't have access to any data sources. Is there such a thing around? I tried installing Microsoft's trial of SQL 2005 but can't get it to run on my laptop, and the SQL 2000 trial no longer exists. I simply need to create a SQL DB, tables, etc. Thanks in advance.
Hi! I am currently using the CHECKSUM function to generate a hash that I later compare to detect changes in a row: CHECKSUM(field1, field2, field3, field4). Now I'd like to use the HASHBYTES function instead, over the same fields, but HASHBYTES accepts only one data value. What is the most effective and reliable way of getting an MD5 over several fields? Thanks
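Since HASHBYTES takes a single value, the usual approach is to concatenate the columns, cast to character data, with a delimiter so that ('ab', 'c') and ('a', 'bc') do not produce the same input, and with an explicit stand-in for NULLs. A hedged sketch with placeholder column names and sizes (note that before SQL Server 2016 the input to HASHBYTES is limited to 8000 bytes):

SELECT HASHBYTES('MD5',
           ISNULL(CAST(field1 AS nvarchar(100)), N'') + N'|' +
           ISNULL(CAST(field2 AS nvarchar(100)), N'') + N'|' +
           ISNULL(CAST(field3 AS nvarchar(100)), N'') + N'|' +
           ISNULL(CAST(field4 AS nvarchar(100)), N'')) AS row_hash
FROM dbo.SomeTable;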
Hi. I have an application where I allow users to type in their own SQL queries. Before I store those queries I have to make sure that they are correct, both syntax-wise and data-type-wise. For that I execute the query against the database and trap any errors that may be returned; that's how I judge whether the query is OK. To keep this test as quick as possible, I tried adding a WHERE clause like WHERE 1=2, so no results are returned. But then I discovered that this addition keeps errors from happening if they are of a data-type nature. For instance, "select orderid + 'test' from orders where 1=2", run against the Northwind database, returns no errors, even though OrderID is numeric and I'm adding a string to it! Next I tried returning only one row: "select top 1 orderid + 'test' from orders". This time the error is thrown, but the query still takes a long time when run on a huge table. I don't know why that is, but it seems that the engine runs the query for the entire table and then gets the first row. Does anyone have an idea of what's happening, or a better suggestion on how I can perform my test without killing the database? Thanks.
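One alternative worth testing is SET FMTONLY ON, which asks the engine to return only result-set metadata without processing any rows, so it stays fast even on huge tables; it catches syntax and many binding problems, though whether it surfaces every data-type error for the kinds of queries your users write should be verified (on SQL Server 2012 and later, sp_describe_first_result_set is another option). A minimal sketch using the query from the post:

SET FMTONLY ON;
EXEC (N'select orderid + ''test'' from orders');
SET FMTONLY OFF;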