SQL 2012 :: Deleting Already Duplicated Filestream Files?

Feb 8, 2015

I have three FileStream containers (FS1 on the F drive, FS2 on the H drive, FS3 on the E drive) belonging to the same FileStream filegroup of one particular database (DB), which is in SIMPLE recovery mode on SQL Server 2012.

FS1 contains a huge number of files, due to which the F drive is completely full.

So, I am trying to move some of the extra files from one FileStream container (FS1 on the F drive) to the other FileStream containers (FS2 on the H drive and FS3 on the E drive) using the command:

dbcc shrinkfile('FS1', emptyfile)

Then I take a full and a differential backup of the database, issue a CHECKPOINT, and try to delete the already duplicated files from the FileStream container FS1 to free some space on the F drive using the command:

sp_filestream_force_garbage_collection @dbname = 'DB' , @filename = 'FS1'

But still no files get deleted, and I receive output like this:

file_name: DB_FS1
num_collected_items: 0
num_marked_for_collection_items: 0
num_unprocessed_items: 0
last_collected_lsn: 25000001749500000

How can I delete these already duplicated files?
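For reference, a minimal sketch of the whole sequence using the names from the post: FILESTREAM garbage collection is deferred and removes files in two phases, so it usually takes more than one CHECKPOINT / garbage-collection pass before the duplicated files actually disappear from the F: drive.

USE DB;
GO
-- push the data out of the full container into the other containers in the same filegroup
DBCC SHRINKFILE (N'FS1', EMPTYFILE);
GO
-- under SIMPLE recovery a checkpoint truncates the log, which lets the garbage collector progress
CHECKPOINT;
EXEC sp_filestream_force_garbage_collection @dbname = N'DB', @filename = N'FS1';
GO
-- a second pass is often required because files are tombstoned first and physically deleted later
CHECKPOINT;
EXEC sp_filestream_force_garbage_collection @dbname = N'DB', @filename = N'FS1';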

View 0 Replies



Deleting Duplicated Rows

May 23, 2004

Hi,
I have a table named "std_attn" where, because of some bad coding, lots of duplicated rows have been created, and the table doesn't have any PK. How can I remove the duplicates?
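One common approach, sketched below, numbers the rows within each duplicate group and deletes everything past the first; col1, col2, col3 are placeholders for the table's real columns (without a PK, partition by every column that defines a duplicate):

;WITH d AS (
    SELECT col1, col2, col3,
           ROW_NUMBER() OVER (PARTITION BY col1, col2, col3 ORDER BY (SELECT 0)) AS rn
    FROM dbo.std_attn
)
DELETE FROM d
WHERE rn > 1;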


thnx

View 14 Replies View Related

SQL 2012 :: Database With Filestream Enabled?

Jul 16, 2015

We have a server with a database with filestream enabled. The filestream data is in a filegroup with three files spread across three LUNs (F:, G:, and H:), each with a capacity of 1.8 TB.

The file stream containers in those three LUNs reference the same column in the same table.

The F: drive has only 64 GB of free space left; the H: drive, however, has around 700 GB free.

We are looking to move some filestream content from the container in F: to the container in H:.
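Two options commonly come up; a sketch only, with logical file names assumed rather than taken from the server:

-- 1) stop new allocations going to the full container by capping it near its current size
ALTER DATABASE MyDB MODIFY FILE (NAME = N'FS_Container_F', MAXSIZE = 1740GB);

-- 2) or migrate the existing data out of it into the other containers in the same filegroup
DBCC SHRINKFILE (N'FS_Container_F', EMPTYFILE);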

View 2 Replies View Related

SQL 2012 :: Moving Filestream To Another Database?

Oct 26, 2015

Our development organization has created an application employing FILESTREAM, which through the pilot has been incorporated into a schema within our Data Warehouse Staging database. Going to production, the development and BI teams have determined that they want it separated out into a separate database, and they'd like to separate it in the current pilot environment (DEV).

How can I best move (or at least copy) the existing FILESTREAM data from the current database into a new one?
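One straightforward option, sketched here with entirely made-up names and paths: create the new database with its own FILESTREAM filegroup, create a matching table, and copy the rows across with INSERT...SELECT (the FILESTREAM blobs move with the rows). Backing up the source, restoring it as the new database, and dropping everything else is the other common route.

CREATE DATABASE FileStore
ON PRIMARY (NAME = N'FileStore_data', FILENAME = N'D:\Data\FileStore.mdf'),
FILEGROUP FS_Group CONTAINS FILESTREAM (NAME = N'FileStore_fs', FILENAME = N'D:\Data\FileStore_fs')
LOG ON (NAME = N'FileStore_log', FILENAME = N'L:\Logs\FileStore.ldf');
GO
-- after creating a table in FileStore with the same FILESTREAM column definition:
INSERT INTO FileStore.dbo.Documents (DocId, DocName, DocContent)
SELECT DocId, DocName, DocContent
FROM StagingDW.dbo.Documents;   -- assumed source table in the staging database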

View 0 Replies View Related

SQL 2012 :: Filestream To Store Office Documents?

Jun 26, 2014

I have been asked to look into using Filestream for centralising MS Office documents (mostly Excel). I am worried about the "user interface" aspect. I read that there are "this and that" APIs to read/write data to the filestream, but surely one would need to write a specific interface to Word/Excel, which feels like far too hard. I am a great admirer of SQL Server, but is it the right tool for document management?

We use SQL Server 2012 and have offices round the world with varying internet connection quality. Our main aim is to stop the current "spreadsheet nightmare" so common with Excel.

View 9 Replies View Related

SQL 2012 :: FILESTREAM DATA File Is Corrupted

Jul 21, 2014

Our SQL Server databases are on a Windows cluster server. How can I correct this issue, and why would it happen? The error is: FILESTREAM data container 'M:TessituraDB DataDocuments' is corrupted. Database cannot recover.

View 0 Replies View Related

SQL Server 2012 :: Create Filestream For Filegroup For A DB?

Dec 5, 2014

I have a DB in SQL Server 2008, and I want to store files in the DB.

How can I create a FILESTREAM filegroup for a DB in SQL Server 2008?
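A minimal sketch (database, filegroup, path, and table names are placeholders); FILESTREAM must also be enabled for the instance in Configuration Manager before the access level below takes effect:

-- allow T-SQL (and Win32 streaming) access at the instance level, if not already set
EXEC sp_configure 'filestream access level', 2;
RECONFIGURE;
GO
USE MyDB;
GO
-- add a FILESTREAM filegroup and a container (the FILENAME folder must not already exist)
ALTER DATABASE MyDB ADD FILEGROUP FS_Group CONTAINS FILESTREAM;
ALTER DATABASE MyDB ADD FILE (NAME = N'MyDB_fs', FILENAME = N'D:\Data\MyDB_fs') TO FILEGROUP FS_Group;
GO
-- a table that stores its varbinary(max) column in the FILESTREAM filegroup
CREATE TABLE dbo.Files (
    FileId   UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    FileName NVARCHAR(260) NOT NULL,
    Content  VARBINARY(MAX) FILESTREAM NULL
) FILESTREAM_ON FS_Group;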

View 1 Replies View Related

SQL 2012 :: Piecemeal Restore With Filestream Database?

Sep 9, 2015

We have one database with Filestream enabled. There is one table "dbo.files" which uses Filestream.

We created a filestream filegroup Filegroup1 and added 3 data containers to it. (3 filestream data containers within the same filegroup.)

We have three LUNs F:, G:, H: each with a capacity of 2TB (That is the limitation). F: and G: are almost full. So, I restricted their growth so inserts do not happen into these data containers. Inserts are now going into H: drive which has lots of free space. Our application code prevents any sort of deletes or updates to this table. So data in the growth restricted containers will never change.

Now the database is around 6 TB in size and backups are a challenge. We are contemplating migrating storage to NetApp and using their SnapManager console, which is much faster.

However, until then, we need a solution with native SQL backups. We tried partial backups and piecemeal restore.

We tried this on a test server:

1) Partial backup of only the read-only data containers first (F: and G:). (The plan is to back these up just once a month, as this data never changes.)

2) Partial backup of the primary filegroup plus the third data container in the Filestream filegroup, which is subject to inserts (H:).

While restoring, we tried an online restore. First, I restored the backup obtained in step 2 above with the RECOVERY option. Then I restored the backup obtained in step 1 with RECOVERY. I see that the database was brought online. However, when I try to query the dbo.files table, I get an error stating that some files of the filestream filegroup are offline.
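For comparison, a bare-bones piecemeal restore sketch; the backup paths and logical file names here are placeholders (not the poster's), and it assumes the growth-restricted containers can be restored without further log backups (e.g. their data is in a read-only filegroup). The first restore must use WITH PARTIAL, and anything not yet restored stays offline, which is the error seen when querying dbo.files.

RESTORE DATABASE DB FILEGROUP = N'PRIMARY', FILE = N'FS_H'
    FROM DISK = N'D:\Backups\DB_active_partial.bak'
    WITH PARTIAL, RECOVERY;
-- the primary filegroup and the active container are online; F: and G: are still offline
RESTORE DATABASE DB FILE = N'FS_F', FILE = N'FS_G'
    FROM DISK = N'D:\Backups\DB_static.bak'
    WITH RECOVERY;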

View 0 Replies View Related

SQL 2012 :: Does Mount Points Support Filestream Feature

Apr 9, 2014

Does SQL Server 2012 support the FILESTREAM feature on mount points?

View 1 Replies View Related

SQL 2012 :: Change Properties Of A Table Which Have Filestream When Column Added To It

Dec 6, 2014

I store files in a SQL Server 2008 database using FILESTREAM. But when a column is added to the table that has FILESTREAM, the table's properties change. After any change to the table, retrieving files starts to fail, although storing them still works properly.

Also, the FILESTREAM filegroup shown at the following location becomes empty. Why?

Right click on table --> properties --> storage --> filestream filegroup

View 0 Replies View Related

SQL Server 2012 :: Get First Occurrence Value When Value Is Duplicated

Oct 16, 2015

I need to generate output as specified below: get the first value when a record is duplicated in the hierarchyval column.

IF OBJECT_ID('tempdb..#test') IS NOT NULL
drop table #test
create table #test
(
hierarchyid int
,hierarchyval int

[code]...
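A sketch of one way to produce that output, assuming "first" means the lowest hierarchyid within each duplicated hierarchyval (the columns hidden behind the [code] marker are unknown):

;WITH ranked AS (
    SELECT hierarchyid, hierarchyval,
           ROW_NUMBER() OVER (PARTITION BY hierarchyval ORDER BY hierarchyid) AS rn
    FROM #test
)
SELECT hierarchyid, hierarchyval
FROM ranked
WHERE rn = 1;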

View 3 Replies View Related

SQL 2012 :: Link To Download File From Filestream Without Access To Save Or Navigate In Share Folders?

Aug 19, 2014

Get a filestream download link with read-only access and without folder navigation.

I need a link with the path to the filestream blob. That path could be used to download a document using any Windows app, like Windows Explorer, etc. The requirement is that the path does not allow the customer to navigate the filestream share folders or see other files; they can only read the file at that path.

Checking :

[file_stream].GetFileNamespacePath(2)

This allows you to navigate the folders.

NON_TRANSACTED_ACCESS = READ_ONLY resolves the requirement to disable saving into the FileTable, but it still allows you to navigate and see other files.
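For reference, the per-file UNC path that would be handed out as the download link is usually built like this (a sketch; dbo.Documents is an assumed FileTable name):

SELECT name,
       FileTableRootPath(N'dbo.Documents') + file_stream.GetFileNamespacePath() AS unc_path
FROM dbo.Documents
WHERE is_directory = 0;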

View 0 Replies View Related

Deleting TRN Files

May 26, 2006

I have a database and I see that there are a lot of TRN files behind it, taking up more than 82 GB of disk space. I have TRN files from January until today. I plan to delete every one of them up until April to recover 59 GB of disk space. Would that be OK?

View 4 Replies View Related

Deleting Files In A Directory More Than 200

Dec 1, 2003

http://forums.databasejournal.com/showthread.php?s=&threadid=29895

View 2 Replies View Related

Maintenance - Not Deleting Old Files

Apr 3, 2006

We have our maintenance routines set up to delete files older than one week, but it is not working...

...Consequently, we forget to go out and remove the files manually; thus we lose backups due to the drive being full, and we have to manually remove files, then back up the databases and start all over again.

Has anybody had this problem and can you tell me where to look to try and figure out what the problem is ??

Thanks in advance,
Nancy

View 6 Replies View Related

Deleting Older Files

Aug 23, 2004

I have a backup job that has failed.

The database size is 20.6 GB and the Transaction logs are 135MB

The amount of disk space I have left is 3.65 MB, which I know is not going to work.

However, the maintenance plan is supposed to remove files older than 1 day.

I am wondering if a job works like this:

Step 1 create backup file
Step 2 Create Transaction Log Back up
Step 3 Delete old backup file
Step 4 Delete old Transaction log back up

That tells me I would need double the amount of disk space, to accommodate two 20 GB backup files and two 135 MB transaction log files.

Is this correct??

Also, is there a way that I can have steps 3 and 4 done first?

Lystra

View 5 Replies View Related

Deleting Old Backups/trn Files.

Nov 21, 2005

I have a maintenance plan running on my database in which I told the wizard, on creation, to "remove files older than 4 weeks", and yet it doesn't seem to be doing so; on checking this morning, disk space was getting low due to over 300 GB of backups and TRNs dating back to September.

Has anyone had problems with maintenance plans not cleaning up when told?

a

View 4 Replies View Related

Deleting Backup Files Question

Aug 8, 2000

I have created some backups where expiration dates or days to be
retained were not specified.

Using EM, how do I find the old backups?

Using EM, how do I delete the old backups (hoping this will clean up the MSDB tables and physically delete the files from disk)?
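If EM does not expose it directly, the msdb history side can be pruned with T-SQL; a sketch with an arbitrary cutoff date, noting that this only cleans up the msdb tables and does not physically delete the backup files from disk:

EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = '2000-06-01';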

Thanks!!!!

View 1 Replies View Related

Deleting Files In A Directory More Than 200MB

Dec 1, 2003

Hi,
I have posted a request regarding deleting files more than 1 month old, which we discussed at the below URL.

http://forums.databasejournal.com/showthread.php?s=&threadid=29895

My new request: I have enabled C2 audit mode and want to delete old files that are more than 200 MB.
Could you please give me a query?
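A sketch of one possibility (the folder, file mask, and 200 MB threshold in bytes are assumptions, and xp_cmdshell must be enabled); forfiles exposes each file's size as @fsize, so oversized files can be deleted like this:

EXEC master..xp_cmdshell
    'forfiles /p "C:\SQLAudit" /m *.trc /c "cmd /c if @fsize GEQ 209715200 del @path"';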
Thanks,
Ravi

View 2 Replies View Related

Backup Jobs Not Deleting Old Files!

Sep 12, 2007

Such a simple task. Not doing as it should!!

This Maintenance Cleanup Task is set to delete all BAK and TRN files it has made (in separate maintenance plans) in the given path, with the given file extension, deleting files based on the age of the file at task run time: delete files older than 4 days.

The backup files have now been piling up for months and months. I shouldn't have to take care of this myself; I've got a computer to do it for me every time it runs its jobs, every 4 hours and once overnight.

This is ignoring commands and refusing to do as it's told. I've checked the settings in here over and over. It's so simple - what could be wrong? I've checked the path, the file age, the extensions... The disks are getting full!

Has anyone seen anything like this?

View 17 Replies View Related

Deleting Files With The FTP Client On A UNIX Box

May 2, 2008



I get the following error when I try to delete files using the FTP client in a SSIS package. This is the error I get.

Error: 0xC002918E at FTP Task, FTP Task: Unable to delete remote files using "FTP Connection Manager".

Task failed: FTP Task


This is a UNIX server. I'm able to delete the files using other FTP clients, but the FTP client in the SSIS package cannot delete the files. I've read in many places on the internet that this is a known MS bug. Let me know if there is some sort of workaround for this. I'm using SQL Server 2005 SSIS packages to accomplish this task.

Thanks

View 3 Replies View Related

Deleting Backup Files Older Than 2 Days

Jun 23, 2008

Friends -
I am looking for a Windows script (bat file) to delete backup files that are older than 2 days.

Please provide scripts on this.
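A minimal sketch (the backup folder and extension are assumptions): the quoted forfiles line is what would go into the .bat file; wrapping it in xp_cmdshell, as here, also lets it run from a SQL Agent T-SQL step.

EXEC master..xp_cmdshell
    'forfiles /p "D:\SQLBackups" /m *.bak /d -2 /c "cmd /c del /q @path"';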

Appreciate your support

Cheers :)
Satish

View 2 Replies View Related

2005 Maint Plans .txt Files Not Deleting?

Aug 16, 2007

I am running 2005 SP2 (it is the SP2 refresh, as I downloaded SP2 on 3/15/07) and I can't get the maintenance plan .txt report files to delete. The job completes successfully; however, the old .txt files are still there. I found this technote: http://support.microsoft.com/kb/938085, although I don't receive the error message in the KB. The report does have "New Component Output" as the 1st line. I still don't see how deleting the 1st line of an already-run job's report will affect a new run of the job.

Also, I checked the version of the Microsoft.SqlServer.MaintenancePlanTasks.dll and it is at 9.00.3043.00 (which is the SP2 Refresh) in KB 933508 ( http://support.microsoft.com/kb/933508 ).

I have several SQL 2005 servers and they all have this problem. Any help would be appreciated...

Thanks,
DeWayne

View 3 Replies View Related

Deleting Extra Tempdb Log And Data Files

Jul 20, 2005

We had someone create an extra data file and log file for tempdb. So we currently have two data files and two log files. Is it possible to delete the newly created data and log files? If I just delete the physical files, I assume they'll get created as soon as SQL Server gets started back up. Any help would be great, since a single data and log file for tempdb is my goal.

Thanks much.
Sean
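A sketch of the usual approach (the logical names tempdev2 and templog2 are assumptions; sp_helpfile run in tempdb shows the real ones): empty the extra data file, then remove both extra files. If the files are still in use, the REMOVE FILE steps may only succeed right after a service restart.

USE tempdb;
GO
DBCC SHRINKFILE (N'tempdev2', EMPTYFILE);   -- move any pages off the extra data file
ALTER DATABASE tempdb REMOVE FILE tempdev2;
ALTER DATABASE tempdb REMOVE FILE templog2;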

View 3 Replies View Related

Deleting Files Using SSIS Scripting Object

Jun 5, 2006

I am utilizing a Script Task in my SSIS package to combine two text files into one final file, and then I want to delete the original files. To do this I am using the FileInfo class from the System.IO namespace, associating the file names and then calling its Delete method.

The creation of the final file works perfectly... Unfortunately, my base files do not delete, and I do not get a failure message or indicator.

Here is my code:


' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports System.IO
Imports System.IO.File
Imports System.IO.FileSystemInfo
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain


' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.

Public Sub Main()

Dim strCurrentMonth As String
Dim strCurrentYear As String
Dim strWriteFileName As String
Dim strReadHeaderFileName As String
Dim strReadBodyFileName As String

'Utilizing a case statement, determine the monthname & year and set the appropriate variables
Select Case Month(Now())
Case 1
strCurrentMonth = "January"
Case 2
strCurrentMonth = "February"
Case 3
strCurrentMonth = "March"
Case 4
strCurrentMonth = "April"
Case 5
strCurrentMonth = "May"
Case 6
strCurrentMonth = "June"
Case 7
strCurrentMonth = "July"
Case 8
strCurrentMonth = "August"
Case 9
strCurrentMonth = "September"
Case 10
strCurrentMonth = "October"
Case 11
strCurrentMonth = "November"
Case 12
strCurrentMonth = "December"
End Select

strCurrentYear = Year(Now()).ToString

'Set variables with file names (reader files and write file) for ease in readability and to
'set final (write file) with appropriate nameing convention utilized by Matria HealthCare.

strWriteFileName = "\CUPSRV05SHAREDISPublicData ExportMatriaFiles TO Matriacup_ref_cup_" & strCurrentMonth & strCurrentYear & "_ftp_ReferralFormat.txt"

strReadHeaderFileName = "\CUPSRV05SHAREDISPublicData ExportMatriaFiles TO MatriaMatria_Referral_Control.txt"

strReadBodyFileName = "\CUPSRV05SHAREDISPublicData ExportMatriaFiles TO MatriaMatria_Referral.txt"

'create stream reader/writer objects

Dim sr As New StreamReader(strReadHeaderFileName)
Dim sr2 As New StreamReader(strReadBodyFileName)
Dim sw As New StreamWriter(strWriteFileName)

'feed the header record into the final file

Do Until sr.Peek = -1
'write the header record
sw.WriteLine(sr.ReadLine)
Loop

'close the read stream for the header record file
sr.Close()

'Feed the body records into the final file
Do Until sr2.Peek = -1
'write all base records
sw.WriteLine(sr2.ReadLine)
Loop

'close the read stream for the body records
sr2.Close()

'close the write stream for the final distribution file
sw.Close()

'dispose of all stream objects
sr.Dispose()
sr2.Dispose()
sw.Dispose()

Dim EligBaseFile As New FileInfo(strReadBodyFileName)
Dim EligHeaderFile As New FileInfo(strReadHeaderFileName)

'Note: the original code passed the literal strings "strReadBodyFileName" and
'"strReadHeaderFileName" to FileInfo instead of the variables, so Delete() pointed at
'files that do not exist and silently did nothing (FileInfo.Delete does not throw when
'the file is missing). Passing the variables makes the deletes work.
EligBaseFile.Delete()
EligHeaderFile.Delete()

'final statement for SSIS package to determine script result

Dts.TaskResult = Dts.Results.Success

End Sub

End Class

I would appreciate any light you can shed on this. Thanks!

View 5 Replies View Related

Deleting Backup Files Older Than 5 Days Old.

Mar 14, 2007

I am using the backup task to back up a database, but I want to delete all backup files older than 5 days. I am using the file task for this and have built the path in a variable, but I am trying to use a wildcard for the time and am getting "illegal character in path". How can I go about this?

I currently have E:MSSQL.1MSSQLBackupdatabasename_backup_20070309*.bak in my input variable and am trying to delete the file databasename_backup_200703091532.bak

View 4 Replies View Related

SQL 2012 :: FOR FILES Command To Delete Old Backup Files On Remote Server?

Feb 24, 2015

I have the need to delete old backup files via TSQL job. Found this solution online:

PushD "
emoteservershareDIFF" &&(
forfiles -m *DIFF*.sqb -d -1 -c "cmd /c del /q @path"
) & PopD

It works remotely if I run it via the command prompt. But when I add this to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?

View 6 Replies View Related

SQL 2005 Maintenance Cleanup Task Is Not Deleting Files On Remote File Share

Jan 16, 2008

I have the following issue with maintenance plan backups, which work for BAK, DIF, and TRN files going to a remote server share.
When I try to remove the old files with a cleanup task, I get an error and the files don't get deleted.


The version is as follows
Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50
Copyright (c) 1988-2005 Microsoft Corporation Standard Edition (64-bit) on
Windows NT 5.2 (Build 3790: Service Pack 2)

The error result is as follows,

Failed (-1073548784): Executing the query "EXECUTE master.dbo.xp_delete_file 0,N'\\ABCD-A1\BACKUPS\ABCD_BACKUP\ABC_DAILY\ABCD',N'trn',N'2008-01-13T12:52:49'" failed with the following error: "xp_delete_file() returned error 2, 'The system cannot find the file specified.'". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

The maintenance plan seems to be adding extra "" characters, though when I enter the code directly in a query I get the same error.

Query:

EXECUTE master.dbo.xp_delete_file 0,N'\\ABCD-A1\BACKUPS\ABCD_BACKUP\ABC_DAILY\ABCD',N'trn',N'2008-01-13T12:52:49'

Error:

xp_delete_file() returned error 2, 'The system cannot find the file specified.'

The servers belong to the same domain and are using the same service account, which has all the necessary rights to the share and the file directory location. The backups work, but I get the error on the cleanup task.

I am trying to figure out how to get the cleanup task to delete old files. The same happens for all file extensions, and I have tried other locations with simpler file paths; same error.


Regards,
Scott


View 6 Replies View Related

SQL 2012 :: SP For Deleting All Data From A Table

Mar 13, 2014

Is this close to the correct syntax for a stored procedure for deleting all the data from a particular table... or is there a better way?

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE TruncateTmpBank

[Code] ....
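For comparison, a minimal sketch of what such a procedure usually looks like (dbo.TmpBank is an assumed table name based on the procedure name); TRUNCATE TABLE is generally preferable to DELETE here because it is minimally logged and resets any identity column:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE dbo.TruncateTmpBank
AS
BEGIN
    SET NOCOUNT ON;
    TRUNCATE TABLE dbo.TmpBank;   -- assumed target table
END
GO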

View 3 Replies View Related

SQL 2012 :: Deleting / Archiving From 100 GB Table

May 20, 2014

I would like to archive/delete data from a 100 GB table. I have to delete on the basis of a date column. The date column has been added to the clustered index, but it does not have an individual nonclustered index.

My estimated execution plan shows a index scan.

Should I create a nonclustered index on the date column and then try to archive/delete after confirming that an index seek is used in the estimated execution plan, or is there any other method to do this?
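One commonly used alternative, sketched below with assumed table and column names, is to delete in small batches so each pass stays cheap and the transaction log does not balloon, whether the plan uses a scan or a seek:

WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.BigTable
    WHERE DateCol < DATEADD(YEAR, -1, GETDATE());   -- example cutoff

    IF @@ROWCOUNT = 0 BREAK;   -- nothing left to delete
END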

View 3 Replies View Related

SQL 2012 :: Capture Deleting Job Details?

May 13, 2015

One of our applications creates a run-time SQL job to run a batch process, and the application itself deletes this run-time job. But when doing the deletion it doesn't check whether the job is running or not; it just does a direct delete. Our challenge is how to capture the deleted job's details; after an incident happens we can see error details only in this log.

We could run Profiler to trace it, but we don't know when it will trigger, and we can't possibly keep an active Profiler trace running, as it will kill the server. So we can't run the trace for a long time, since we don't know when the issue happens.
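One lightweight option, sketched below, is an audit table plus a DELETE trigger on msdb.dbo.sysjobs, so every job deletion is recorded without keeping a trace running (this assumes creating objects in msdb is acceptable in your environment):

USE msdb;
GO
CREATE TABLE dbo.DeletedJobAudit (
    job_id     UNIQUEIDENTIFIER NOT NULL,
    job_name   SYSNAME          NOT NULL,
    deleted_at DATETIME         NOT NULL DEFAULT GETDATE(),
    deleted_by SYSNAME          NOT NULL DEFAULT SUSER_SNAME()
);
GO
CREATE TRIGGER dbo.trg_sysjobs_capture_delete
ON dbo.sysjobs
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.DeletedJobAudit (job_id, job_name)
    SELECT job_id, name FROM deleted;
END
GO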

View 2 Replies View Related

SQL 2012 :: Deleting Doubled Records

May 26, 2015

I have this table:

CREATE TABLE [dbo].[ACT_SECUNDARIA](
[CODACTIVIDADE] [int] IDENTITY(1,1) NOT NULL,
[CODCTB] [int] NOT NULL,
[CODCAE] [int] NULL,
[CODSECTOR] [int] NOT NULL,

[Code] ....

I want to delete every duplicate record that has more than one entry for the same (codctb, codcae) pair.

For example: if there are three records with the same codctb and codcae, I want to delete two so that only one remains.

How can I achieve this using T-SQL?
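A sketch of the usual pattern for this table: keep the row with the lowest CODACTIVIDADE in each (CODCTB, CODCAE) group and delete the rest.

;WITH d AS (
    SELECT CODACTIVIDADE,
           ROW_NUMBER() OVER (PARTITION BY CODCTB, CODCAE ORDER BY CODACTIVIDADE) AS rn
    FROM dbo.ACT_SECUNDARIA
)
DELETE FROM d
WHERE rn > 1;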

View 2 Replies View Related

SQL 2012 :: Deleting Repeated Records In Table

Sep 24, 2015

I have one table with three columns. This table contains lots of repeated records. I want to delete these records.

In the example below, I want to delete all the records whose id and no columns contain the same values.

id    no      sequence
----  ------  --------
35    35432   1
35    35432   2
35    35432   3
36    35432   1
35    45623   1

In the first three records, the id and no columns contain the same values. I want to delete these three records.

But the last record, with id = 35 and no = 45623, is not repeated, so it should not be deleted.
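A sketch of one way to do it (dbo.MyTable is an assumed table name): count the rows per (id, no) pair with a window function and delete every row belonging to a pair that appears more than once.

;WITH counted AS (
    SELECT id, no, sequence,
           COUNT(*) OVER (PARTITION BY id, no) AS cnt
    FROM dbo.MyTable
)
DELETE FROM counted
WHERE cnt > 1;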

View 8 Replies View Related






