SQL Server 2008 :: Locating MDF / Log And Backup Files?
Aug 21, 2015
I am trying to locate the .mdf, log and .bak files, but I am not seeing the folder (MSSQL10_50.MSSQLSERVER); the only folders I see there are 90, 100, 110 and Report Builder.
I found the path below by right-clicking one of the databases in SSMS and opening Properties, but when I physically go there, I don't find that folder.
C:\Program Files (x86)\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA
I gave myself all the permissions under the Security tab for the folder Microsoft SQL Server, but still nothing.
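In case it is useful, a quick way to confirm where the engine actually keeps its data and log files is to query sys.master_files; the paths it returns are the real locations regardless of which folders exist under Program Files. A minimal sketch (the registry check for the default backup directory uses an undocumented but commonly used procedure):
-- List the physical location of every data and log file on the instance
SELECT DB_NAME(database_id) AS database_name,
       type_desc,
       name AS logical_name,
       physical_name
FROM sys.master_files
ORDER BY database_name, type_desc;
-- Default backup directory, read from the instance registry hive
EXEC master.dbo.xp_instance_regread
     N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer',
     N'BackupDirectory';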
View 9 Replies
Mar 13, 2015
I've written a custom script to delete backup files from a location, but I'm unable to modify it to count the number of files that were deleted. How do I modify the script?
/* Script to delete older than N days backup from a specific directory */
USE [db_admin]
GO
IF OBJECT_ID('usp_DeleteBackup', 'P') IS NOT NULL
DROP PROC usp_DeleteBackup
GO
[Code] .....
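One hedged approach (not the original proc, which is truncated above): list the matching files first via xp_cmdshell so they can be counted, then issue the delete. The path and extension are placeholders:
DECLARE @dir varchar(260) = 'D:\Backups\';   -- placeholder path
DECLARE @days varchar(10) = '30';
DECLARE @out TABLE (line varchar(4000));
DECLARE @list varchar(500), @del varchar(500);
-- First list the matching files so they can be counted, then delete them
SET @list = 'FORFILES /p "' + @dir + '" /s /m *.bak /d -' + @days + ' /c "cmd /c echo @path"';
SET @del  = 'FORFILES /p "' + @dir + '" /s /m *.bak /d -' + @days + ' /c "cmd /c del /q @path"';
INSERT INTO @out EXEC master.dbo.xp_cmdshell @list;
SELECT COUNT(*) AS files_to_delete
FROM @out
WHERE line IS NOT NULL AND line NOT LIKE 'ERROR%';
EXEC master.dbo.xp_cmdshell @del;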
View 2 Replies
View Related
Feb 11, 2015
I am trying to create a job using a PowerShell script to move backup files from one folder to another. I am using Ola Hallengren's script for backups. Ola Hallengren's script creates a common backup folder with sub-folders per database and further sub-folders for Full and Log backups. My goal is to move full backups that are older than a month and save them on a different drive with the same folder structure. I was able to move the first set of backups without any problem, but I can't move any more files and keep getting this error, even when I try to overwrite the previous file with the -Force switch:
Move-Item : Cannot create a file when that file already exists.
At line:5 char:9
+ Move-Item $i.FullName C:\Test -force
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (C:\Backup\VALIDATION\gcommon:DirectoryInfo) [Move-Item], I
+ FullyQualifiedErrorId : MoveDirectoryItemIOError,Microsoft.PowerShell.Commands.MoveItemCommand
Here's the script that I used to move the first set of files:
foreach ($i in Get-ChildItem C:\Backup\VALIDATION)
{
if ($i.CreationTime -lt ($(Get-Date).AddMonths(-1)))
{
Move-Item $i.FullName C:\Test
}
}
View 0 Replies
View Related
Jul 29, 2015
I want to delete all backup files from a folder older than a specific date. But if I use the below query, I need to pass how many days' worth of older backup files to delete, whereas in my case I don't know how many days/months/years of old backup files are in the backup folder.
EXEC xp_cmdshell 'FORFILES /p c:\BACKUP /s /m *.sql /d -30 /c "CMD /C del /Q /F @FILE"'
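If the cutoff is a specific date rather than an age, FORFILES also accepts an absolute date with /d -MM/DD/YYYY (the expected date format follows the server's regional settings), so the day count does not need to be known. A hedged sketch that builds the command from a date parameter:
DECLARE @cutoff date = '2015-01-01';   -- delete everything older than this date
DECLARE @cmd varchar(500);
-- /d with a leading minus and an explicit date means "older than that date";
-- MM/DD/YYYY (style 101) is assumed here
SET @cmd = 'FORFILES /p c:\BACKUP /s /m *.sql /d -'
         + CONVERT(varchar(10), @cutoff, 101)
         + ' /c "CMD /C del /Q /F @FILE"';
EXEC master.dbo.xp_cmdshell @cmd;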
View 2 Replies
View Related
Feb 24, 2015
I have the need to delete old backup files via TSQL job. Found this solution online:
PushD "
emoteservershareDIFF" &&(
forfiles -m *DIFF*.sqb -d -1 -c "cmd /c del /q @path"
) & PopD
It works if I run it remotely via a command prompt, but when I add it to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
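When something works from a command prompt but not from an Agent job, the usual suspect is that the job step runs under the SQL Server Agent (or proxy) service account, which may have no rights to the remote share. A hedged way to see who the step really runs as and whether it can even see the share (the UNC path is the one from the post):
-- Run these from the job step to diagnose access
EXEC master.dbo.xp_cmdshell 'whoami';
EXEC master.dbo.xp_cmdshell 'dir "\\remoteserver\share\DIFF\*.sqb"';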
View 6 Replies
View Related
Feb 18, 2015
SQL Server 2008 R2, 6 GB memory... I attempted a backup of a 500 GB database but it was taking way too long. I checked the resources on the box and saw the CPU at 100%. I checked the SQL Server activity log and saw a hung query (the user was not even logged on) that had multiple threads, so I killed it, and now the CPU utilization is back to normal.
Trouble is, now all of the threads in the activity monitor for the backup show 'suspended' and the backup appears to be not doing anything.
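One thing worth checking before killing anything else is whether the backup is actually making progress; the DMVs report percent complete and the current wait type for BACKUP commands. A minimal check:
-- Shows progress and current wait for any running backup or restore
SELECT r.session_id,
       r.command,
       r.percent_complete,
       r.wait_type,
       r.estimated_completion_time / 1000 / 60 AS est_minutes_remaining
FROM sys.dm_exec_requests AS r
WHERE r.command LIKE 'BACKUP%' OR r.command LIKE 'RESTORE%';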
View 3 Replies
View Related
Feb 12, 2015
I have a table with lots of XML documents in one column (more than 1,000), like this:
1. <content xmlns:xsi="http://www.w3.org/2001/XMLSchema...
2. <content xmlns:xsi="http://www.w3.org/2001/XMLSchema...
3. <content xmlns:xsi="http://www.w3.org/2001/XMLSchema... (each one is big)
I want to query some values from all of them to see the duration, but at the moment I can only query one of them:
declare @bp xml
select @bp=xml
from bloodpressureohneschema
;WITH XMLNAMESPACES('http://schemas.openehr.org/v1' as bp,'http://www.w3.org/2001/XMLSchema-instance' as xsi,'OBSERVATION' as type)
select * from(
select
m.c.value('(./bp:time/bp:value)[1]','date') as time,
m.c.value('(./bp:data/bp:items[1]/bp:value[1]/bp:magnitude)[1]','int') as value
from @bp.nodes('/bp:content/bp:data/bp:events') as m(c)
)m
Is there somewhere I can make this better?
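If the goal is to shred every row rather than a single @bp variable, one hedged rewrite is to CROSS APPLY .nodes() directly against the xml column of the table; the table name comes from the post, and the column being called [xml] is an assumption based on the SELECT @bp=xml above:
;WITH XMLNAMESPACES('http://schemas.openehr.org/v1' AS bp,
                    'http://www.w3.org/2001/XMLSchema-instance' AS xsi,
                    'OBSERVATION' AS type)
SELECT m.c.value('(./bp:time/bp:value)[1]', 'date') AS [time],
       m.c.value('(./bp:data/bp:items[1]/bp:value[1]/bp:magnitude)[1]', 'int') AS [value]
FROM bloodpressureohneschema AS t
CROSS APPLY t.[xml].nodes('/bp:content/bp:data/bp:events') AS m(c);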
View 5 Replies
View Related
Jun 18, 2015
SQL SERVER 2008R2
The number of files to retain for SQL Server Logs is set to 99. When I expand the SQL Server Logs node in SQL Server Mgt Studio it shows the current log file through Archive#49. The oldest archive file is dated 2015/05/26. If I select that archive in SQL Server Mgt Studio it shows me the details and entries of that archive file. Yet when I go to the directory on SQL Server for the log files there are only the 5 most recent files. I have searched for '.trc' files on the entire drive and have found no other files.
How can SQL Server Mgt Studio show archive files that have no corresponding archive file in the directory that is supposed to contain the log files?
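For what it's worth, the engine's error logs are plain files named ERRORLOG, ERRORLOG.1, ERRORLOG.2 and so on (not .trc, which are Profiler trace files), and both the archive list SSMS uses and the on-disk location can be confirmed from T-SQL. A hedged check using two undocumented but widely used procedures:
-- Lists every archive number SSMS knows about, with dates and sizes
EXEC master.dbo.xp_enumerrorlogs;
-- The first lines of the current log state where the files are written
EXEC master.dbo.xp_readerrorlog 0, 1, N'Logging SQL Server messages in file';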
View 1 Replies
View Related
Feb 4, 2015
I need to load the latest CSV files from a file server. The files are placed in a folder named with the posting date, for example:
Posted 02022015 --> CSV files.
I am able to copy the CSV files from the file server using BULK INSERT (manually), giving the file location.
I am having difficulty picking up the latest folder posted on the server and importing it into the database using a stored proc.
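A hedged sketch of one way to do it from a stored proc: ask the OS for the newest sub-folder, then build the BULK INSERT dynamically. The root folder, file name and staging table below are placeholders:
DECLARE @root varchar(260) = 'D:\Import\';   -- placeholder root that holds the Posted* folders
DECLARE @dirs TABLE (id int IDENTITY(1,1), name varchar(260));
DECLARE @latest varchar(260), @cmd varchar(500), @sql nvarchar(max);
-- /b = bare names, /ad = directories only, /o-d = newest first
SET @cmd = 'dir "' + @root + '" /b /ad /o-d';
INSERT INTO @dirs (name) EXEC master.dbo.xp_cmdshell @cmd;
SELECT TOP (1) @latest = name FROM @dirs WHERE name IS NOT NULL ORDER BY id;
SET @sql = N'BULK INSERT dbo.StagingTable FROM ''' + @root + @latest + N'\data.csv''
            WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
EXEC (@sql);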
View 2 Replies
View Related
Feb 8, 2015
I need to use SSIS to connect to an FTP server. From there I need to download files to a local folder. I need to download only today's files, and also only those files starting with Training or Recruitment. I have managed to set up tasks that copy everything, but I am having such a hard time writing a C# script that will download using those filters.
View 2 Replies
View Related
Feb 26, 2015
I am developing the SSIS and stuck on copying specific files.
1. We receive CSV file to our drive on a daily basis.
2. The csv file name has the last 8 digits formatted with the yyyymmdd. For example, the file name might be abcdef_20150226.csv This means it will be our CSV file for today, Thursday, February 26, 2015.
3. There are a lot of files in this directory.
[URL]
What we would like to do:
Add the constraints (or expression) that will copy the files from this directory to another directory only when the file date falls on a Monday. For example, the file abcdef_20150226.csv will not be copied because it is a Thursday file, but abcdef_20150223.csv will be copied to the new directory because it is a Monday file.
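The date test itself is simple once the yyyymmdd part is isolated; here it is expressed in T-SQL just to illustrate the check (in the package it would live in an expression or Script Task, so treat this as a sketch rather than SSIS syntax):
DECLARE @file varchar(100) = 'abcdef_20150223.csv';   -- example name from the post
DECLARE @stamp char(8) = LEFT(RIGHT(@file, 12), 8);   -- pulls out 20150223
-- Style 112 converts yyyymmdd; copy the file only when the date falls on a Monday
IF DATENAME(WEEKDAY, CONVERT(date, @stamp, 112)) = 'Monday'
    PRINT 'copy ' + @file;
ELSE
    PRINT 'skip ' + @file;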
View 1 Replies
View Related
Apr 30, 2015
I want to query a column of xml files in a table,
use mysql1
declare @bp xml
select @bp=xml
;WITH XMLNAMESPACES('http://schemas.openehr.org/v1' as bp,'http://www.w3.org/2001/XMLSchema-instance' as xsi,'OBSERVATION' as type)
select * from (
select
m.c.value('(./bp:data/bp:items[1]/bp:value[1]/bp:magnitude)[1]','int') as systolisch
from
BloodpressureMitSchema cross apply
@bp.nodes('/bp:content/bp:data/bp:events') as m(c))m
But with this "cross apply" I can only query all the values in one xml and repeat them. Is there something wrong at "declear"
View 2 Replies
View Related
Aug 18, 2015
I have a client that has POS software called Restaurant Pro Express (RPE) from www.pcamerica.com
Their old POS computer had a hardware failure, but I was able to attach the hard-drive to another computer and recover the data. RPE uses a MSSQL database system. However, my client doesn't seem to make backups very often - the last one is dated January 5, 2015.
I was able to copy the C:\Program Files\Microsoft SQL Server folder over, which contained the instance as well as all the data files - and has up-to-date information. The instance in the recovered Microsoft SQL Server folder was called MSSQL.1. I installed the RPE software on their new computer, and it too now has an instance, called MSSQL10_50.PCAMERICA. The new computer is using MSSQL 2008 R2, while I believe the old computer would have been using MSSQL 2005.
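Rather than copying files over the new instance's folders, one hedged option is to attach the recovered data files to the new instance; attaching a 2005 database to 2008 R2 upgrades it in place (the reverse is not possible, so keep spare copies of the files first). The file names and folder below are placeholders for whatever is in the recovered DATA folder:
-- Attach the recovered files under a new name on the 2008 R2 instance
CREATE DATABASE RPE_Recovered
ON (FILENAME = 'C:\Recovered\MSSQL.1\MSSQL\Data\RPE.mdf'),
   (FILENAME = 'C:\Recovered\MSSQL.1\MSSQL\Data\RPE_log.ldf')
FOR ATTACH;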
View 4 Replies
View Related
Sep 14, 2015
I want to move all database log files from drive E to F.
There are more than 10 databases, so I don't want to move them one by one.
At the moment I use the detach/attach method:
-Detach db
-Move log file
-Attach db
How do I do this for all databases in one go?
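One hedged way to do it in a single pass without detaching: generate an ALTER DATABASE ... MODIFY FILE statement for every log file, run the generated statements, take the databases offline, move the .ldf files, then bring them back online (the new path only takes effect then). The F:\Logs\ target is a placeholder. A sketch of the generator:
-- Produces one MODIFY FILE statement per log file, pointing it at the F: drive
SELECT 'ALTER DATABASE [' + DB_NAME(mf.database_id) + '] MODIFY FILE (NAME = [' + mf.name
     + '], FILENAME = ''F:\Logs\' + REVERSE(LEFT(REVERSE(mf.physical_name),
           CHARINDEX('\', REVERSE(mf.physical_name)) - 1)) + ''');'
FROM sys.master_files AS mf
WHERE mf.type_desc = 'LOG'
  AND mf.physical_name LIKE 'E:\%';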
View 5 Replies
View Related
Jan 10, 2012
I am trying to reorganise the log files on a server (long story short, they are fragmented, so I want to shrink them and reset the initial size and growth) and I am unable to shrink them. When I run the following:
use test
DBCC SHRINKFILE(test_log, TRUNCATEONLY)
--or
use
DBCC SHRINKFILE(test_log,2, TRUNCATEONLY)
I get the following message:
Msg 8985, Level 16, State 1, Line 1
Could not locate file 'test_log' for database 'test' in sys.database_files. The file either does not exist, or was dropped.
I get this message for every database on the server. I got the logical name of the file using sp_helpfile and have checked it against sys.masterfiles, sys.database_files and sys.sysaltfiles, all match up and confirm the name 'test_log'.
I rebooted the server last night and was able to shrink the first couple of .ldf files I tried, so I presumed it was fixed. This morning when I try again I get the same error, and I don't see anything in the SQL Server or system logs that indicates a change.
I am able to add new log files and remove log files, however if I add a new log file (test_log2) and then try and truncate that file I get the same error.
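Msg 8985 means the name passed to DBCC SHRINKFILE was not found in sys.database_files of the current database, which can happen with stray spaces or a case-sensitive collation even when the name looks right. A hedged work-around is to resolve the file_id inside the database and shrink by number instead of by name:
USE test;
GO
-- Confirm what the database itself thinks its log file is called
SELECT file_id, name, physical_name FROM sys.database_files WHERE type_desc = 'LOG';
-- Shrink by file_id (2 is typical for the first log file) to bypass any name mismatch
DBCC SHRINKFILE (2, TRUNCATEONLY);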
View 9 Replies
View Related
Mar 18, 2015
I am new to SQL and I haven't written any scripts in the past, so I thought I would give it a go. Basically, I am trying to write a script that will check whether a database has more than one log file, free the VLFs that belong to the secondary log files, and then remove them. I created a database named rDb as this link suggests and followed the steps.
[URL] ....
It works. However, I want to have to run just 1 script that will do the entire job. This is what I have gotten so far and it doesn't work:
create table #tempsysdatabase(
File_id int,
file_guid varchar(50),
type_desc varchar (20),
data_space_id int,
name nvarchar (50),
state int,
[Code] ....
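As a rough sketch of the end-to-end version (the script above is truncated, so this is only an assumption about the intent, with the database name hard-coded for clarity): empty each secondary log file so its VLFs stop being used, then remove it. EMPTYFILE stops new allocations to the file; if active VLFs remain, a log backup or CHECKPOINT may be needed before REMOVE FILE succeeds.
USE rDb;
GO
DECLARE @fname sysname, @sql nvarchar(max);
-- Every log file except the first one (file_id 2) is a candidate for removal
DECLARE c CURSOR FOR
    SELECT name FROM sys.database_files
    WHERE type_desc = 'LOG' AND file_id > 2;
OPEN c;
FETCH NEXT FROM c INTO @fname;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'DBCC SHRINKFILE ([' + @fname + N'], EMPTYFILE);
                 ALTER DATABASE rDb REMOVE FILE [' + @fname + N'];';
    EXEC (@sql);
    FETCH NEXT FROM c INTO @fname;
END
CLOSE c;
DEALLOCATE c;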
View 0 Replies
View Related
Sep 15, 2015
We have a large database with a small number of large tables in it (and a larger number of SMALLER tables), and it is a publisher for a transactional replication scenario. When I create a snapshot to initialize a new subscription, I notice with the larger tables that sometimes it generates multiple files in the snapshot folder, usually in multiples of 16, and numbers them like this:
MyTable_3#1.bcp
MyTable_3#2.bcp
...
MyTable_3#16.bcp
With other tables, I'll get just one LARGE snapshot file, named:
MyOtherTable_4.bcp
In the latter case, the file can be very large (most recent is 38GB).
In both cases, the subscription will eventually be initialized, but the smaller files will generate separate log entries every few minutes in the Replication Monitor, showing 'Bulk Copied data into 'MyTable' (34231221 rows)', whereas the larger table will generate only ONE log entry, showing 'Bulk copying data into table 'MyOtherTable'', and it may take a couple of hours before there is anything else showing... except for an entry saying, 'The process is running and is waiting for a response from the server.'
My question is: what would be the difference between the two tables that would result in one generating MULTIPLE snapshot files, the other only a single, much larger one? The only difference I can see in the table definition is that the one generating multiple files has a clustered index, whereas the others do not.
View 0 Replies
View Related
Oct 14, 2015
Is there a good starting point for understanding, for a specific database, what maximum number of VLFs is acceptable so that it does not cause long startup or backup times?
Also, I need some calculation so that I can identify the best growth parameter to set for each database.
I'm seeing the below msg in errorlog and curious to know the changes (right sizing/growth) to be done? As of now 100 MB of log file growth value is set (refer: [URL] ....)
Database BizTalkMsgBoxDb has more than 1000 virtual log files which is excessive. Too many virtual log files can cause long startup and backup times. Consider shrinking the log and using a different growth increment to reduce the number of virtual log files.
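There is no single right number, but the VLF count per database is easy to measure with DBCC LOGINFO (one row per VLF), which makes it possible to compare growth settings. A hedged sketch that counts VLFs for every online database (DBCC LOGINFO is undocumented; on SQL Server 2012+ it gains a leading RecoveryUnitId column, which would need to be added to the table variable):
DECLARE @vlf TABLE (FileId int, FileSize bigint, StartOffset bigint, FSeqNo bigint,
                    Status int, Parity int, CreateLSN numeric(38));
DECLARE @counts TABLE (db sysname, vlf_count int);
DECLARE @db sysname, @cmd nvarchar(300);
DECLARE c CURSOR FOR SELECT name FROM sys.databases WHERE state_desc = 'ONLINE';
OPEN c;
FETCH NEXT FROM c INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    DELETE FROM @vlf;
    SET @cmd = N'DBCC LOGINFO (''' + @db + N''') WITH NO_INFOMSGS';
    INSERT INTO @vlf EXEC (@cmd);
    INSERT INTO @counts SELECT @db, COUNT(*) FROM @vlf;
    FETCH NEXT FROM c INTO @db;
END
CLOSE c;
DEALLOCATE c;
SELECT db, vlf_count FROM @counts ORDER BY vlf_count DESC;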
View 3 Replies
View Related
Jan 30, 2015
I have a backup that comes to me nightly, named in the format XXXX_YY_MM_DD.bak, where the date is incremented each night; the previous night's backup is deleted when the next day's is added, so in general I have only one file in that folder. I want to set up a restore to run nightly:
RESTORE DATABASE [XXXData] FROM DISK = 'C:\folder\ExtractedData\app_XXX_backup_15_01_28.bak' --location of .bak file
WITH REPLACE
I would like to do something like
RESTORE DATABASE [XXXData] FROM DISK = 'C:\folder\ExtractedData\app_XXX_backup*.bak' --location of .bak file
WITH REPLACE
While I'm asking: in case there is an older one left behind (which shouldn't happen), I saw something about "LATESTFULL", but I think that's a Red Gate command?
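RESTORE cannot take a wildcard, but the newest file name can be resolved first and the statement built dynamically (and yes, LATESTFULL is Red Gate SQL Backup syntax, not native T-SQL). A hedged sketch using the folder from the post:
DECLARE @files TABLE (id int IDENTITY(1,1), name varchar(260));
DECLARE @latest varchar(260), @sql nvarchar(max);
-- /b = bare name, /o-d = newest first, so the first row is the most recent .bak
INSERT INTO @files (name)
EXEC master.dbo.xp_cmdshell 'dir "C:\folder\ExtractedData\app_XXX_backup_*.bak" /b /o-d';
SELECT TOP (1) @latest = name FROM @files WHERE name LIKE '%.bak' ORDER BY id;
SET @sql = N'RESTORE DATABASE [XXXData]
             FROM DISK = ''C:\folder\ExtractedData\' + @latest + N'''
             WITH REPLACE;';
EXEC (@sql);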
View 5 Replies
View Related
Mar 2, 2015
A job runs every morning at 3.00 am to back up a database. Many of the tables have triggers on them to write updates, deletes and inserts to audit tables. A typical trigger looks like this:
USE [ProjectDB_Live]
GO
/****** Object: Trigger [dbo].[trgStakeHolders] Script Date: 03/02/2015 10:23:38 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
[code]....
I added a trigger to a table over the weekend, structured the same as the one above, and when the backup ran last night it copied every row in the table into the audit table, as if every row in the table had been updated.
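That symptom usually means the audit trigger writes from the base table instead of the inserted/deleted pseudo-tables, so any statement or job that touches the table logs everything. A hedged skeleton of the shape such a trigger normally takes, with hypothetical table and column names (not the trigger from the post, which is truncated above):
CREATE TRIGGER dbo.trgStakeHolders_Audit
ON dbo.StakeHolders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Only the rows actually affected by the statement exist in inserted/deleted
    INSERT INTO dbo.StakeHolders_Audit (StakeHolderID, ChangedOn, ChangeType)
    SELECT COALESCE(i.StakeHolderID, d.StakeHolderID),
           GETDATE(),
           CASE WHEN i.StakeHolderID IS NULL THEN 'DELETE'
                WHEN d.StakeHolderID IS NULL THEN 'INSERT'
                ELSE 'UPDATE' END
    FROM inserted AS i
    FULL OUTER JOIN deleted AS d ON d.StakeHolderID = i.StakeHolderID;
END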
View 4 Replies
View Related
Aug 5, 2015
This is about SQL transactional replication, specifically initializing transactional replication from a SQL backup and restore. The scenario:
S1 = Primary Server 1
R1 = T - Replication Server 1
R2 = T - Replication Server 2
So we have S1 replicating to R1, and we want to build another subscriber which is R2.
Can I take the Replicated Database from R1, backup it up, then restore it to R2, and create the publication/subscription?
Will that work? If not, is there an easier way to avoid the snapshot? The reason I ask is that we do have replication snapshots, but they take a long time. One of my colleagues stated that he tried this, however replication created duplicate rows on R2, which is why he had to use the replication snapshot.
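Initializing a subscriber from a backup without a snapshot is supported, but as far as I know the backup has to be of the publication database on S1, not of the subscription database on R1; the publication also needs allow_initialize_from_backup enabled before the subscription is added with the backup details. A hedged sketch run at the publisher (publication name, database and path are placeholders):
-- On the publisher (S1): allow backup initialization for the publication
EXEC sp_changepublication
     @publication = N'MyPublication',
     @property    = N'allow_initialize_from_backup',
     @value       = N'true';
-- Back up the published database, restore it on R2, then register the subscription
EXEC sp_addsubscription
     @publication      = N'MyPublication',
     @subscriber       = N'R2',
     @destination_db   = N'MyDatabase',
     @sync_type        = N'initialize with backup',
     @backupdevicetype = N'disk',
     @backupdevicename = N'\\backupshare\MyDatabase_full.bak';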
View 0 Replies
View Related
Sep 17, 2015
2008 R2 Instance
SQL Server Logs show that a full backup is occurring on each database at 10:30pm every night.
We have a Maintenance Plan the does a full backup on Friday at 8pm and another Plan that does a differential backup on every other day at 8pm.
There are no other Maintenance Plans or Agent Jobs.
Nothing related shows in the Windows Task Scheduler.
There are no related files that have a 10:30ish time stamp (including no backup files).
Where should I look next to find these backups and/or the script/process that is creating them?
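The backup history in msdb records who or what took each backup and the device it went to, which usually identifies a third-party or VSS-based tool (those typically show a GUID-style virtual device name rather than a file path). A hedged query for the 10:30pm backups:
SELECT bs.database_name,
       bs.backup_start_date,
       bs.type,                     -- D = full, I = differential, L = log
       bs.user_name,                -- login that ran the backup
       bs.is_copy_only,
       bmf.physical_device_name     -- a GUID here usually means a VDI/VSS tool, not a file
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
     ON bmf.media_set_id = bs.media_set_id
WHERE CAST(bs.backup_start_date AS time) BETWEEN '22:00' AND '23:00'
ORDER BY bs.backup_start_date DESC;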
View 5 Replies
View Related
Feb 9, 2015
I need to import multiple CSV files and load them into tables, and each time a new database has to be created.
I was able to create new databases using a stored proc.
How do I do a bulk insert for all the files at once to insert into the tables?
I want to load all the files at once.
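A single BULK INSERT only reads one file, so "all at once" in practice means looping over the file list. A hedged sketch that enumerates the folder and loads each CSV in turn (folder, file pattern and staging table are placeholders):
DECLARE @folder varchar(260) = 'D:\Import\csv\';   -- placeholder folder
DECLARE @files TABLE (name varchar(260));
DECLARE @f varchar(260), @sql nvarchar(max);
INSERT INTO @files (name)
EXEC master.dbo.xp_cmdshell 'dir "D:\Import\csv\*.csv" /b';
DECLARE c CURSOR FOR SELECT name FROM @files WHERE name LIKE '%.csv';
OPEN c;
FETCH NEXT FROM c INTO @f;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BULK INSERT dbo.StagingTable FROM ''' + @folder + @f
             + N''' WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
    EXEC (@sql);
    FETCH NEXT FROM c INTO @f;
END
CLOSE c;
DEALLOCATE c;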
View 6 Replies
View Related
Apr 23, 2015
SET NOCOUNT ON
Declare @daysOld int,@deletedate nvarchar(19) ,@strDir varchar(250)
declare @cmd11 nvarchar(2000)
declare @mainBackupDir varchar(2000),
@result1 nvarchar(max);
[Code] ....
View 1 Replies
View Related
Jun 3, 2015
I wrote the below script to print all folders and files located in the share path. How do I extend my script to add another column indicating whether each item is a folder or a file, as a 0 or 1?
declare @chkdirectory1 varchar(4000) = '\\shared_path\folder';
declare @finalserver3 varchar(4000);
create table #tmp (directory_name varchar(4000))
SET @finalserver3 = '''"DIR ' + @chkdirectory1 + ' /B"''';
--select @finalserver3
--SELECT @finalServer
DECLARE @ExecCmd varchar(100)
--SELECT @ExecCmd = 'EXEC master.dbo.xp_cmdshell ' + char(50) + 'mkdir D:\' + CONVERT(varchar(8), getdate(), 112) + '\' + char(50)
SET @ExecCmd = 'EXEC master.dbo.xp_cmdshell ' + @finalserver3
--SELECT @ExecCmd
exec(@ExecCmd)
drop table #tmp
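If an undocumented procedure is acceptable, xp_dirtree already returns the flag being asked for: called with depth 1 and the third argument set to 1, it outputs each item with a [file] column that is 1 for files and 0 for folders. A hedged sketch against the same share:
declare @chkdirectory1 varchar(4000) = '\\shared_path\folder';   -- same share as above
create table #dirtree (subdirectory varchar(4000), depth int, [file] int);
-- xp_dirtree is undocumented; the third parameter = 1 includes files and adds the [file] column
insert into #dirtree (subdirectory, depth, [file])
exec master.sys.xp_dirtree @chkdirectory1, 1, 1;
select subdirectory as name, [file] as is_file   -- 1 = file, 0 = folder
from #dirtree;
drop table #dirtree;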
View 0 Replies
View Related
Jun 11, 2015
I have a few complex queries, and I want to extract the output of each to different date-formatted output files.
How to write the queries?
I know BCP is one solution, but is there any other effective way to implement it?
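Sticking with BCP but generating the date-stamped file name in T-SQL is probably the option with the fewest moving parts; a hedged sketch for one query, where the server name, object and output path are placeholders:
DECLARE @stamp char(8) = CONVERT(char(8), GETDATE(), 112);   -- yyyymmdd
DECLARE @cmd varchar(1000);
SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.MyComplexView" queryout '
         + '"D:\Output\report_' + @stamp + '.csv" -c -t, -T -S MYSERVER';
EXEC master.dbo.xp_cmdshell @cmd;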
View 2 Replies
View Related
Jul 17, 2015
I've been struggling with this issue,
1) Test--FolderName (This Test folder name should use as a database name for below sub folders)
a)Create--Sub Foldername
i)create.sql
b)Alter---Sub FolderName
i)Alter.sql
c)Insert---Sub FolderName
i)Insert.sql
[Code] .....
The scripts need to be run in order: the script in the first sub-folder runs first, then the next sub-folder, and so on.
Is there a way to create a bat file that automatically runs all these scripts, in order, against the databases they need to?
The databases that they need to run against have the name of the database at the beginning of the name of the folder.
View 0 Replies
View Related
Jul 19, 2015
I'm trying to upload 1000 txt files into one table in SQL. I'm using the following query, to upload one txt file at a time:
bulk insert [dbo].AAA_2013_2015
from '\\dataserver\SQL Data Files\SQL_EMELIZ\FC x Bloque Detallada\201308 Detalle Facturas\FACT_BLOQ_AGO13 (4).txt'
with (firstrow = 2,
lastrow = ???,
fieldterminator = ';',
rowterminator = '0x0A')
I'm trying to make the query skip the last row, because it gives me the following error:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 17. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Is there a command to skip the last row, something like lastrow = all-1, or something like that?
I also executed using MAXERRORS command...like this:
bulk insert [dbo].AAA_2013_2015
from '\\dataserver\SQL Data Files\SQL_EMELIZ\FC x Bloque Detallada\201308 Detalle Facturas\FACT_BLOQ_AGO13 (15).txt'
with (firstrow = 2,
fieldterminator = ';',
MAXERRORS = max_errors,
rowterminator = '0x0A')
It does not recognize the MAXERRORS option; I also tried to put a number of errors instead of max_errors.
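LASTROW and MAXERRORS both have to be integer literals, which is why the max_errors placeholder is rejected, and there is no built-in "all minus one". One hedged workaround is to tolerate a small number of errors so the bad trailer row is simply skipped (or load into a staging table and discard the trailer there):
-- MAXERRORS must be a number; 10 here just tolerates the bad trailer row
bulk insert [dbo].AAA_2013_2015
from '\\dataserver\SQL Data Files\SQL_EMELIZ\FC x Bloque Detallada\201308 Detalle Facturas\FACT_BLOQ_AGO13 (15).txt'
with (firstrow = 2,
fieldterminator = ';',
maxerrors = 10,
rowterminator = '0x0A')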
View 0 Replies
View Related
Oct 22, 2015
There is a SQL Server 2008 R2 SP3 Clustered Instance that has Transactional Replication. It is by no means a large replication setup in terms of data/article count. SQL Server was recently patched to SP3 and is current on Windows 2008 R2 Patches.
When I added a new article to replication (via 2014 SSMS GUI) it seems to add everything correctly (replication tables/procs show the new article as part of the publication).
The Publication is set to allow the snapshot to generate for just new articles (setting immediate_sync & allow_anonymous to false).
When the snapshot agent is run, it runs without error and claims to have generated a snapshot of 1 article. However the snapshot folder only contains a folder for the instance (that does have the modified time of the snapshot agent execution) and none of the regular bcp/schema files.
The tables never make it to the subscribers and replication continues on without error for the existing articles. No agents produce any errors and running the snapshot agent w/ verbose output provides no errors or insight into any possible issues.
I have tried:
- dropping/re-adding the article in question.
- Setting up a new Snapshot Folder
- Validated all the settings and configurations
I'm hesitant to reinitialize a subscriber since I am not confident a snapshot can be generated. I am also wondering whether this is related to the SP3 upgrade; every few months new articles are added to the publication, and this is the first time since the upgrade to SP3 that it has been done.
View 0 Replies
View Related
Sep 23, 2014
I have an emergency requirement to copy source server database backup files to a destination server through the xcopy command. The backup job on the source server runs daily, so once this job completes, all database backups need to be moved to the destination server. The main concern is that the backup files on the destination server shouldn't be overwritten; they should be placed separately, as the source server job runs daily.
We have a command which overwrites backups on the destination server, but we need to keep backups on the destination for at least 4 weeks (meaning the retention should be 4 weeks).
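One hedged way to keep each day's copy separate is to copy into a date-stamped sub-folder on the destination and purge anything older than 28 days; the same idea works from a cmd job step or via xp_cmdshell. The server names and paths below are placeholders:
DECLARE @stamp char(8) = CONVERT(char(8), GETDATE(), 112);   -- yyyymmdd
DECLARE @copy varchar(500), @purge varchar(500);
-- /I treats the target as a directory and creates it; /Y suppresses prompts
SET @copy = 'xcopy "D:\Backups\*.bak" "\\destserver\SQLBackups\' + @stamp + '\" /I /Y';
-- Remove anything on the destination older than 28 days (4-week retention)
SET @purge = 'FORFILES /p "\\destserver\SQLBackups" /s /m *.bak /d -28 /c "cmd /c del /q @path"';
EXEC master.dbo.xp_cmdshell @copy;
EXEC master.dbo.xp_cmdshell @purge;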
View 5 Replies
View Related
Nov 3, 2010
We use NetBackup for our SQL servers to back up and restore databases. I would like the service account used by NetBackup to have as limited permissions as possible. The account should be able to back up and restore a db without being able to read any of the content. Right now the jobs fail if the service account is not in the sysadmin role.
I removed the account from sysadmin and limited it to dbcreator and public, but the job fails.
How do I set up an account so that people who know the service account password can't log in with that account and read db information?
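Native backups only require membership in the per-database db_backupoperator role (restores that create a new database additionally need the server-level dbcreator role); whether that is enough for NetBackup's VDI-based agent is worth testing, so treat the sketch below as a hedged starting point rather than the vendor's requirement. The login name is a placeholder for an existing login, and sp_MSforeachdb is undocumented:
-- Grant backup rights in every database without sysadmin
EXEC sp_MSforeachdb N'
    USE [?];
    IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = N''nbu_svc'')
        CREATE USER [nbu_svc] FOR LOGIN [nbu_svc];
    EXEC sp_addrolemember N''db_backupoperator'', N''nbu_svc'';';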
View 9 Replies
View Related
Mar 18, 2015
I am getting following error:
"The log in this backup set terminates at LSN 9566000024284900001, which is too early to apply to the database. A more recent log backup that includes LSN 986000002731000001 can be restored."
If I am missing a previous log backup file to restore, how do I find which LSN maps to which log backup file (.trn file name)? In other words, by looking at the LSN, can we tell which log backup file is missing?
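msdb keeps the first and last LSN of every log backup along with the file it was written to, so the chain can be lined up against the LSN quoted in the error; the missing file is the one whose first_lsn/last_lsn range contains that LSN. A hedged query (database name is a placeholder):
SELECT bs.database_name,
       bs.backup_start_date,
       bs.first_lsn,
       bs.last_lsn,
       bmf.physical_device_name      -- the .trn file that covers this LSN range
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
     ON bmf.media_set_id = bs.media_set_id
WHERE bs.type = 'L'                  -- log backups only
  AND bs.database_name = N'MyDatabase'
ORDER BY bs.first_lsn;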
View 2 Replies
View Related
Jul 15, 2015
Is there an easy way to compress a backup file? I use SQL Server 2008 R2.
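Backup compression is built into 2008 R2 Standard edition and up; it can be requested per backup or set as the instance default. A minimal sketch (database name and path are placeholders):
-- Per-backup
BACKUP DATABASE [MyDatabase]
TO DISK = 'D:\Backups\MyDatabase_full.bak'
WITH COMPRESSION, INIT;
-- Or make compression the instance-wide default
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;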
View 4 Replies
View Related