Hello, I have a problem scheduling an Object Transfer task on a server that is in a different domain than the server it is getting the data from. I can set up the Object Transfer through the Tools/Database/Object Transfer... option and run it manually (works great), but if I schedule it to run, it stops after about 30 seconds and gives the error "Process Exit Code 1. Cannot connect to source server." Any ideas would be much appreciated. Thank you, Ryan
Hi All, I have multiple text files, say a1.txt, b1.txt, and c1.txt. I have to load each text file's data into a SQL Server table with the same structure, i.e. x1, y2, and z3. So I need to transfer a1.txt into x1, b1.txt into y2, and c1.txt into z3 using SSIS, and I have to transfer more than 250 files like this at a time. Manually binding 250 files into the package is a very cumbersome and time-consuming process, so can anyone give a suggestion to solve this issue?
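The only fallback I have thought of is to drive the loads from T-SQL instead of binding each file in the package, using a small mapping table and a dynamic BULK INSERT; this is only a rough sketch, and the FileTableMap table, folder path, and delimiters below are just placeholders for my situation:

    -- Mapping of source file to destination table (names are placeholders)
    CREATE TABLE dbo.FileTableMap (FileName varchar(260), TableName sysname)
    INSERT INTO dbo.FileTableMap VALUES ('a1.txt', 'x1')
    INSERT INTO dbo.FileTableMap VALUES ('b1.txt', 'y2')
    INSERT INTO dbo.FileTableMap VALUES ('c1.txt', 'z3')
    -- ... one row per file, 250+ in total

    DECLARE @file varchar(260), @table sysname, @sql nvarchar(1000)
    DECLARE map_cur CURSOR FOR SELECT FileName, TableName FROM dbo.FileTableMap
    OPEN map_cur
    FETCH NEXT FROM map_cur INTO @file, @table
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Build one BULK INSERT per file; the folder and terminators are assumptions
        SET @sql = N'BULK INSERT dbo.' + QUOTENAME(@table) +
                   N' FROM ''C:\import\' + @file + N'''' +
                   N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
        EXEC (@sql)
        FETCH NEXT FROM map_cur INTO @file, @table
    END
    CLOSE map_cur
    DEALLOCATE map_cur

That way a 251st file would just be one more row in the mapping table rather than another change to the package.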
I am using DTS to transfer some tables from one server to another as part of a migration. We want to be down for as little time as possible, but we need the most up-to-date copy of the database tables in question.
I am currently testing the transfer process in our test environment by migrating the data from one database to another on the same SQL instance.
There are 7 tables to transfer and the total size of the database is 450 MB (with around 117 MB used). The two largest tables have around 17,000 records each.
One table (the header) has no text column and takes just a few seconds to transfer. The other table (the detail) has two columns, one of which is a text column. (Actually, it's not quite fair to call it the detail table; the relationship is really one-to-one, but for the sake of this discussion, let's leave it at that.)
The header takes seconds to transfer, but the detail takes up to 18 minutes.
Physically, our test server is quite robust: two processors, a three-disk RAID 5 array for the data files, and a separate RAID 1 partition for the logs. Performance counters don't indicate any real issues: during the transfer, disk utilization on the data partition occasionally spikes to a high level but comes right back down until the next spike (the spikes being separated by about one minute). No issues with memory, paging, or CPU.
I have removed the clustered index on the affected table as well as the PK. No help.
Are text columns just slow? Is there something that I am missing?
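Two things I'm planning to try next, in case they are relevant: checking whether the text values are small enough to be stored in-row, and copying the detail table in smaller batches so each transaction stays small. A rough sketch of both (the table, column, and database names are stand-ins for mine, and the 256-byte threshold is just a guess):

    -- Allow small text values to be stored on the data page instead of separate
    -- text pages; the 256-byte limit here is an assumption, not a recommendation
    EXEC sp_tableoption 'dbo.OrderDetail', 'text in row', '256'

    -- Copy the detail table in batches of 1000 rows instead of one big transaction
    SET ROWCOUNT 1000
    WHILE 1 = 1
    BEGIN
        INSERT INTO NewDB.dbo.OrderDetail (OrderID, Notes)
        SELECT s.OrderID, s.Notes
        FROM OldDB.dbo.OrderDetail AS s
        WHERE NOT EXISTS (SELECT 1 FROM NewDB.dbo.OrderDetail AS d WHERE d.OrderID = s.OrderID)
        IF @@ROWCOUNT = 0 BREAK
    END
    SET ROWCOUNT 0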
I'm having problems designing a package that attempts a fast-load data transfer but falls back to a regular-speed load with error redirection if an error occurs.
The way I designed this was to add one data flow task to my package called "DFT FASTLOAD". The data flow copies a table SRC to another table DEST in the same SQL Server database. In the error handler for the data flow task I copied the original data flow task and renamed it "DFT REGULARLOAD with Error redirection". In this data flow task I did not use fast load and additionally redirected errors to a text file.
In the data flow task "DFT FASTLOAD" I am copying from a varchar source field (containing non-date strings) to a datetime destination field to force errors. However, the data flow task "DFT REGULARLOAD with Error redirection" never seems to start transferring data from source to destination. It turns yellow (after the error occurs in "DFT FASTLOAD"), but no data is transferred; it seems to hang.
Do I need to increase MaximumErrorCount or something? The data flow task "DFT FASTLOAD" does not turn red when the error occurs, it just remains yellow, so I assume I'm on the right track since it seems the error is caught.
I have added screenshots ... hopefully these screenshots will clarify my problem.
Is it possible to configure transactional replication between two different domains, including non-trusted domains?
If it is possible, what do I need to take care of before configuring replication, and how do I configure transactional replication between two different domains?
Greetings, I have just arrived back in the country (NZ) and back into ASP.NET. I am having trouble with the following: "An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share." It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since that is only temporary, I wanted a permanent shortcut on my desktop linking to my intranet page. Does anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer. Cheers ~ J
I have been playing with a Standard Version of VS 2005 and SSE 2005 and I just cannot get these two to interact together well. I am sure it is a noob problem but I have seen this error addressed on this forum and I am just not getting it. Here is exactly what I am doing.
I want to create a database within SSE using Management Studio. Then I want to connect to it with VS 2005. Both SSE and VS 2005 are local. I just can't seem to get this to work.
I always seem to get this error: "An attempt to attach an auto-named database for file C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Second.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share."
One thing... I create the website with Location: HTTP. I don't even know anymore why I do this... I think because I like to be able to pull up a browser and get to my app. Other than that I am doing nothing special. However, I get the above error when I hit F5 to run the app inside VS 2005.
Is there someplace where a complete noob can find a thorough and simple treatment of this? Again, I have read through some of the previous posts on this, I don't understand too many of them, and it seems as if there are at least 13 possible causes, of which I don't know which one I have. Help!
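One thing I'm tempted to try, based on what I've read so far, is to stop relying on the auto-attach behaviour altogether: attach the .mdf once under a fixed name and point the connection string at that database. A sketch of what I mean (the database name and the connection string are just my guesses):

    -- Attach the file once, server-side, under a fixed database name
    CREATE DATABASE SecondData
        ON (FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Second.mdf')
        FOR ATTACH

    -- The ASP.NET connection string would then reference the database by name
    -- instead of using AttachDbFilename / User Instance, e.g.:
    -- "Data Source=.\SQLEXPRESS;Initial Catalog=SecondData;Integrated Security=True"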
I have been asked to create 100 stored procedures in a database. Is there a good way to create them in the database one by one by running the script files and saving the execution output files in a folder?
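The rough idea I have so far is to keep the list of script files in a table and shell out to sqlcmd for each one, so every script's output lands in its own file; just a sketch, and the server, database, and folder names are made up:

    -- List of procedure script files to run (file names are placeholders)
    CREATE TABLE #scripts (fname varchar(260))
    INSERT INTO #scripts VALUES ('proc_001.sql')
    INSERT INTO #scripts VALUES ('proc_002.sql')
    -- ... one row per script, 100 in total

    DECLARE @f varchar(260), @cmd varchar(1000)
    DECLARE c CURSOR FOR SELECT fname FROM #scripts
    OPEN c
    FETCH NEXT FROM c INTO @f
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Run each script and save its output to C:\output\<file>.out
        SET @cmd = 'sqlcmd -S MYSERVER -d MyDatabase -E -i "C:\scripts\' + @f + '" -o "C:\output\' + @f + '.out"'
        EXEC master.dbo.xp_cmdshell @cmd
        FETCH NEXT FROM c INTO @f
    END
    CLOSE c
    DEALLOCATE c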
I can easily find user-created statistics in a database: SELECT * FROM DB.sys.stats WHERE user_created = 1. But how do I determine which tables those statistics are on? With over 6000 tables, I don't feel like looking through all of them.
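Something along these lines is what I was hoping would work, resolving each statistic's object_id to a schema and table name:

    SELECT sch.name AS schema_name,
           o.name   AS table_name,
           s.name   AS stats_name
    FROM sys.stats AS s
    JOIN sys.objects AS o ON o.object_id = s.object_id
    JOIN sys.schemas AS sch ON sch.schema_id = o.schema_id
    WHERE s.user_created = 1
    ORDER BY sch.name, o.name, s.name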
Is it possible/advisable, when transferring very large amounts of data from server to server, to transfer the data into a new table first, and then alter the new table afterwards, adding indexes, defaults, etc. based on the original table?
If it is, what data flow item would be used to transfer/alter the indexes and defaults?
I'm very new to SSIS, so the more detail you can give the better.
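In case it helps to see what I mean, this is roughly the order of operations I have in mind; I imagine the data copy itself would be a Data Flow task and the steps below would go in an Execute SQL task that runs afterwards (the table, constraint, and column names are invented):

    -- 1) Load the data into a bare table first (the Data Flow task part)

    -- 2) After the load completes, add the defaults, keys, and indexes,
    --    copied from the definition of the original table
    ALTER TABLE dbo.NewOrders
        ADD CONSTRAINT DF_NewOrders_Status DEFAULT ('Open') FOR Status

    ALTER TABLE dbo.NewOrders
        ADD CONSTRAINT PK_NewOrders PRIMARY KEY CLUSTERED (OrderID)

    CREATE NONCLUSTERED INDEX IX_NewOrders_CustomerID
        ON dbo.NewOrders (CustomerID)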
I have an Access database (Access 95, version 7) dumping a delimited text file onto my server. I am then using DTS in SQL 2000 to import the file into a table.
My issue is that each time the DTS package runs, it imports the whole text file again, which is causing duplicate records.
So I created a transformation script as follows:
Function Main()
    If DTSSource("counter") <= DTSDestination("counter") Then
        Main = DTSTransformStat_SkipRow
    Else
        Main = DTSTransformStat_OK
    End If
End Function
The theory behind the If statement is that if the script sees the counter field is less than or equal to what is already there, it will skip the record and move on. For some reason this is not working.
Does anyone have a workaround or another solution to this problem?
I have portions of data coming in as text files containing new records and updates to existing records. The solution I have come up with so far is to import each portion into an intermediate table and then run a stored procedure to migrate the data into the real table. Any ideas how to do this in a more efficient way? Thanks in advance.
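For context, the migration step in my stored procedure is basically an update of the matching rows followed by an insert of the new ones, something like this (the table and key names are simplified):

    -- Apply changes from the intermediate (staging) table to the real table
    BEGIN TRANSACTION

    -- Update rows that already exist
    UPDATE t
    SET    t.Name   = s.Name,
           t.Amount = s.Amount
    FROM   dbo.RealTable AS t
    JOIN   dbo.StagingTable AS s ON s.RecordID = t.RecordID

    -- Insert rows that are new
    INSERT INTO dbo.RealTable (RecordID, Name, Amount)
    SELECT s.RecordID, s.Name, s.Amount
    FROM   dbo.StagingTable AS s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.RealTable AS t WHERE t.RecordID = s.RecordID)

    COMMIT TRANSACTION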
I am familiar with the MySQL LOAD DATA command for loading an external ASCII file into a database table, but I am having trouble finding an equivalent T-SQL command that doesn't require creating an executable... any help would be appreciated.
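From what I can tell, BULK INSERT is the closest T-SQL equivalent; this is the form I have been trying, though the path, terminators, and header option are specific to my file and may need adjusting:

    BULK INSERT dbo.MyTable
    FROM 'C:\data\myfile.txt'
    WITH (
        FIELDTERMINATOR = ',',   -- column delimiter in the ASCII file
        ROWTERMINATOR   = '\n',  -- end-of-line marker
        FIRSTROW        = 2      -- skip a header row, if the file has one
    )

I understand a format file can also be supplied with the FORMATFILE option if the layout is more complicated, but I haven't needed that yet.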
Anybody please help. I am trying to import a text file into a table using Enterprise Manager and the All Tasks wizard, but the process keeps adding strange characters, like squares, at the end of each line, and it also turns each empty line in the text file into a table record containing that square-type character. I used the following code to delete all rows with that character (as a workaround), but no joy. I am losing hope.
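In case it is relevant, my guess is that the square is a stray carriage-return or line-feed character left over from the file's line endings; the cleanup I have been attempting looks roughly like this (the table and column names, and the assumption about which character it is, are mine):

    -- Strip assumed stray CR (CHAR(13)) and LF (CHAR(10)) characters from the column
    UPDATE dbo.ImportedData
    SET    LineText = REPLACE(REPLACE(LineText, CHAR(13), ''), CHAR(10), '')

    -- Remove the rows that came from blank lines and now contain nothing
    DELETE FROM dbo.ImportedData
    WHERE  LTRIM(RTRIM(LineText)) = ''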
How do I convert a SQL table into a text file? I have a table and I want to extract the values, with the field names above them, to a text file. The query should also allow me to define the starting position of each field in the text file.
Is there an example anywhere of how to output selected fields from a SQL table to a text file with fixed-length fields, i.e. padding the data out to the required length?
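What I have pieced together so far is a query that pads every column to a fixed width (so the starting positions are under my control) and puts the field names on the first line, which could then be written out with bcp; the widths, names, and path below are only examples:

    -- Pad each field to a fixed width; the UNION ALL puts the field names on the first line
    SELECT LEFT('CustomerID' + SPACE(10), 10) + LEFT('CustomerName' + SPACE(30), 30)
    UNION ALL
    SELECT LEFT(CAST(CustomerID AS varchar(10)) + SPACE(10), 10)
         + LEFT(ISNULL(CustomerName, '') + SPACE(30), 30)
    FROM dbo.Customers

    -- From a command prompt, the same query can be written straight to a file, e.g.:
    -- bcp "SELECT ... FROM MyDB.dbo.Customers" queryout C:\out\customers.txt -c -T -S MYSERVER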
I am relatively new to SQL, and as a project I have been asked to create the SQL for a simple database to record train details. I want to implement a check constraint which will prevent data from being inserted into a table if the weight of the train is more than the maximum towing weight of the locomotive. For instance, I need to add the unladen weight and maximum capacity of each wagon (located in the wagon type table) and compare the total against the locomotive's maximum pulling weight (in the locomotive class table). I have the following SQL, but it will not work:
check((select SUM(fwt.unladen_weight + fwt.maximum_payload)
       from hauls as h, freight_wagon as fw, freight_wagon_type as fwt, train as t
       where h.freight_wagon_serial_number = fw.freight_wagon_serial_number
         and fw.freight_wagon_type = fwt.freight_wagon_type
         and h.train_number = t.train_number)
      <
      (select lc.maximum_towing_weight
       from locomotive_class as lc, locomotive as l, train as t
       where lc.locomotive_class = l.locomotive_class
         and l.locomotive_serial_number = t.locomotive_serial_number))
The hauls table is where the constraint has been placed and is the intermediary table between train and freight wagon.
I may not have explained this very well, but in short, I need to compare the sum of two values in one table against a value located in another table... At present I keep getting a message telling me the subquery cannot return more than one row.
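From what I have read, CHECK constraints cannot contain subqueries directly, and the usual workaround is to wrap the comparison in a scalar function and call that from the constraint; this is the shape I am experimenting with, with the function body being a simplified, unverified version of my query and the int datatypes assumed:

    -- Returns 1 if the wagons hauled by @train_number exceed the locomotive's
    -- maximum towing weight, 0 otherwise
    CREATE FUNCTION dbo.fn_train_overweight (@train_number int)
    RETURNS int
    AS
    BEGIN
        DECLARE @train_weight int, @max_towing int

        SELECT @train_weight = SUM(fwt.unladen_weight + fwt.maximum_payload)
        FROM hauls h
        JOIN freight_wagon fw ON fw.freight_wagon_serial_number = h.freight_wagon_serial_number
        JOIN freight_wagon_type fwt ON fwt.freight_wagon_type = fw.freight_wagon_type
        WHERE h.train_number = @train_number

        SELECT @max_towing = lc.maximum_towing_weight
        FROM train t
        JOIN locomotive l ON l.locomotive_serial_number = t.locomotive_serial_number
        JOIN locomotive_class lc ON lc.locomotive_class = l.locomotive_class
        WHERE t.train_number = @train_number

        IF @train_weight > @max_towing
            RETURN 1
        RETURN 0
    END
    GO

    -- The constraint on hauls then just calls the function for the row's train
    ALTER TABLE hauls
        ADD CONSTRAINT chk_train_weight CHECK (dbo.fn_train_overweight(train_number) = 0)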
I am writing a program in VC++ using SQL-DMO calls. My problem is that when I transfer (import) a comma-separated text file into SQL Server through the SQL-DMO ImportData method of the BulkCopy object, it is not able to convert the date field in the text file to the corresponding datetime value in SQL Server, whereas the other data types work perfectly.
This is the record I need to convert:
90,MichaelB,Wintriss,Inspection,Paper,11,Job101,1, {ts '2000-12-10 15:54:56.000'},D:\public\233 and 247\233.mcs,
and this is the date field: {ts '2000-12-10 15:54:56.000'}
If I export a table from SQL Server in binary mode and then import the file back, it works; but when I do it as text, it gives the above error.
Please help me with this; I would be very thankful to you.
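One workaround I am considering is to bulk copy that field into a varchar staging column and convert it afterwards in T-SQL, stripping the ODBC {ts '...'} wrapper; a rough sketch (the staging table and column names are mine):

    -- Staging table receives the timestamp as plain text
    CREATE TABLE dbo.StagingJobs (
        JobID     int,
        EventTime varchar(40)   -- holds strings like {ts '2000-12-10 15:54:56.000'}
    )

    -- After the bulk copy, strip the ODBC escape and convert to datetime
    SELECT JobID,
           CAST(REPLACE(REPLACE(EventTime, '{ts ''', ''), '''}', '') AS datetime) AS EventTime
    FROM   dbo.StagingJobs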
I need to export data from a table to a text file, where the data in the table is deleted after it is written to the file. It is simple using DTS, but I want to do the export in "chunks" of data, committing the delete after, say, every 1000 rows.
My thought was that a stored procedure would be easy enough for this (I've done these in Oracle many times), but I don't know the quickest way to write a row of data from a stored procedure to a text file. Isn't using a command-line shell too slow? What are my options?
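The shape I keep coming back to is a loop that stages the next 1000 rows, deletes exactly those rows, commits, and only then writes the staged chunk to its own file with bcp; I have not timed it yet, and the table, key, paths, and server name below are invented:

    -- Permanent work table for the current chunk, so bcp (a separate session) can see it
    CREATE TABLE dbo.ExportChunk (RowID int, Payload varchar(4000))

    DECLARE @batch int, @cmd varchar(500)
    SET @batch = 1

    WHILE EXISTS (SELECT 1 FROM dbo.Outbox)
    BEGIN
        BEGIN TRANSACTION

        -- Stage the next 1000 rows
        INSERT INTO dbo.ExportChunk (RowID, Payload)
        SELECT TOP 1000 RowID, Payload
        FROM dbo.Outbox
        ORDER BY RowID

        -- Delete exactly the rows that were staged
        DELETE o
        FROM dbo.Outbox AS o
        JOIN dbo.ExportChunk AS c ON c.RowID = o.RowID

        COMMIT TRANSACTION

        -- Write the staged chunk to its own file, then clear it for the next pass
        SET @cmd = 'bcp "SELECT RowID, Payload FROM MyDB.dbo.ExportChunk" queryout '
                 + 'C:\export\chunk_' + CAST(@batch AS varchar(10)) + '.txt -c -T -S MYSERVER'
        EXEC master.dbo.xp_cmdshell @cmd

        TRUNCATE TABLE dbo.ExportChunk
        SET @batch = @batch + 1
    END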
Hey guys, I have a dilemma and hope someone can help.
I don't know of any utilities or commands in SQL that do this but I hope someone does.
What I need to do is something like a bcp import of a text file. I can do that with DTS as well. But what I want is for the table to be created on import. So let's say I am importing a tab-delimited file called ax.txt, with the column names in the first row. On import, it would create the table ax with the column names from the file and then import the data into that table.
I hope I explained it clearly. Please let me know if there is anything I can use to do this without writing lots of code.
I have an idea of how to do it the long way, but hope there is a utility that already does it.
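The closest thing to a built-in utility I have come across is the Jet text driver via OPENROWSET, which can create the destination table as a side effect of SELECT ... INTO; I have not confirmed that it handles tab-delimited files without a schema.ini (Format=TabDelimited) next to the file, so treat this as an unverified sketch:

    -- Requires ad hoc distributed queries to be allowed on the server.
    -- Read ax.txt from C:\import (HDR=Yes takes column names from the first row)
    -- and create the table ax from whatever columns the driver reports.
    SELECT *
    INTO   dbo.ax
    FROM   OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                      'Text;Database=C:\import\;HDR=Yes',
                      'SELECT * FROM ax.txt')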
I am very, very new to SQL Server 2005. I want to create an SSIS package to transfer my text file into a table. How do I do this, where will I find the SSIS packages (like in SQL Server 2000, where we found DTS packages in Enterprise Manager), and how do I schedule these packages?
Dear MSSQL experts, I have a strange problem with SQL Server 2000. I tried to export a table of about 40000 lines into a text file using the Enterprise Manager export assistant. I was astonished to get an exported text file of about 400 MB instead of 16 MB, which is the normal size of that data. By examining this file with a text editor I found that, alongside the data of my table, the file included MANY zeros, which caused the big file size. Does someone have an idea what could cause the export of trillions of zeros into my text file, and how to export only the significant data of my table? Best regards, Daniel