BULK INSERT - FIELDTERMINATOR


Hi, I need to bulk insert a .csv file into a database table. The problem I have run into is that one of my fields, an nvarchar(50), contains data with commas in it, and comma is also the FIELDTERMINATOR in my BULK INSERT. This gives me an error because the next field is an integer field. How do I solve this problem? A few example rows follow:

In .csv file:

============

"J2825JA","FEB22,MAR1,8/05 - RE","22FEB05",45,20000

"J2825JB","FEB22,MAR1,8/05 - RE","22FEB06",765,435653

Query

=========

SQL = "BULK INSERT [" & sTableName & "]" & _
          "       FROM '" & sNewFile & "'" & _
          "       WITH ( FIELDTERMINATOR = ',',    " & _
    "              FIRSTROW = 1,            " & _
          "              ROWTERMINATOR = '
' )  "

BTW, this is written in ASP, but I think this is a DB problem, so I posted here. Thanks in advance.
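
Since the embedded commas appear only inside the quoted fields while the last two fields are unquoted integers, a single FIELDTERMINATOR cannot describe this file, but a bcp format file can, because each field gets its own terminator there. A minimal sketch, assuming every row is laid out exactly like the sample above (the table name, paths and column names are hypothetical):

sample.fmt:

8.0
6
1   SQLCHAR   0   0   "\""        0   leading_quote   ""
2   SQLCHAR   0   0   "\",\""     1   code            ""
3   SQLCHAR   0   0   "\",\""     2   description     ""
4   SQLCHAR   0   0   "\","       3   date_text       ""
5   SQLCHAR   0   0   ","         4   qty             ""
6   SQLCHAR   0   0   "\r\n"      5   amount          ""

BULK INSERT [YourTable]
FROM 'C:\data\sample.csv'
WITH (FORMATFILE = 'C:\data\sample.fmt')

The dummy first field (server column 0) swallows the opening quote of each row, and the per-field terminators strip the remaining quotes, so the comma inside the second field no longer collides with the integer columns that follow.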


View Complete Forum Thread with Replies

Related Messages:
The Name 'BULK INSERT Patientidx1 FROM 'e:/IDX Export3.csv' WITH (FIELDTERMINATOR = '","') ' Is Not A Valid Identifier.
Hi All
  I want to import the data from a file (ext .CSV) into a SQL Server database through a stored procedure. I applied the logic below and
I am getting the error:
The name 'BULK INSERT patientidx1 FROM 'e:/IDX export3.csv' WITH (FIELDTERMINATOR = '","') ' is not a valid identifier.

Here is the statement

create procedure ps_patient_Import (@PathFileName varchar(100)) as
declare @SQL varchar(2000)
SET @SQL =  "BULK INSERT patientidx1 FROM '"+@PathFileName+"' WITH (FIELDTERMINATOR = '"",""') "
exec @SQL

Waiting for valuable replies
Thank u
Baba
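
For what it is worth, this particular message usually comes from EXEC @SQL written without parentheses, which makes SQL Server treat the whole string as the name of a stored procedure instead of a batch to run. A minimal sketch of the corrected procedure, using single quotes so QUOTED_IDENTIFIER does not interfere (the table name is taken from the post):

create procedure ps_patient_Import (@PathFileName varchar(100)) as
declare @SQL varchar(2000)
-- doubled single quotes embed ' and '","' inside the literal
set @SQL = 'BULK INSERT patientidx1 FROM ''' + @PathFileName + ''' WITH (FIELDTERMINATOR = ''","'')'
-- EXEC (@SQL) runs the string as dynamic SQL; EXEC @SQL looks for a procedure named like the string
exec (@SQL)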

View Replies !   View Related
How Do You Use An Identity Column When Doing A Bulk Insert Using The Bulk Insert Task Editor
 

Hello,
 
I'm just learning SSIS and I've hit my first bump.  I am doing a bulk import from a tab-delimited text file to an empty SQL table that has an identity column defined.  How do I tell the Bulk Insert task to skip that column when inserting from the text file?  If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
 
Thanks.
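
One common workaround outside the task itself, sketched here with hypothetical table, column and path names, is to bulk insert into a view that exposes every column except the identity column; the server then generates the identity values. (A format file that skips the identity column achieves the same thing, and the view should also be usable as the destination of the Bulk Insert Task, since the task wraps the same BULK INSERT statement.)

CREATE VIEW dbo.MyTable_Load
AS SELECT Col1, Col2 FROM dbo.MyTable   -- every column except the identity column
GO

BULK INSERT dbo.MyTable_Load
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')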

View Replies !   View Related
Bulk Insert Using Script And Not Bulk Insert Task
 

Does anyone know how to do a bulk insert using just the Script Task? I've been searching everywhere but can't seem to find a sample.
 
 

View Replies !   View Related
Bulk Insert - Bulk Load Data Conversion Error
I'm having some issues with bulk insert.
 
This is the table:

CREATE TABLE [dbo].[tmp_GA_status](

[GA_recno] [int] NOT NULL,

[GA_desc] [varchar](40) NULL

)

 
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"

 
and this is the sql:

bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'

with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')

 
 
so yeah, pretty simple. But whatever I do I get this;

Msg 4864, Level 16, State 1, Line 1

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
 
 
 
So what am I doing wrong ?

View Replies !   View Related
I Don't Suppose BULK UPDATE Exists?... Like BULK INSERT?
I have to update a field within a table of 60 records or so. Each record has a different field value; it's type varchar. I was given an Excel file with the field values and was thinking of a bulk update, like bulk insert, but I don't recall that being possible.

Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?

Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
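
There is no BULK UPDATE statement, so the staging-table route mentioned above is the usual one; a minimal sketch, with table, column and path names all hypothetical (the Excel sheet saved as CSV):

CREATE TABLE #Staging (KeyCol varchar(20), NewValue varchar(100))

BULK INSERT #Staging
FROM 'C:\data\updates.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

UPDATE t
SET    t.TargetField = s.NewValue
FROM   dbo.TargetTable AS t
JOIN   #Staging AS s ON s.KeyCol = t.KeyCol

DROP TABLE #Staging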

View Replies !   View Related
Pros: How To Bulk Delete And Bulk Insert?
I have a table containing 8 million records.
I need to replace 2 million of these records with
a scaled down query that goes something like:
SELECT 1, ShareholderID, Assets1
FROM MyTable (Yields appx. 200,000 records)
SELECT 2, ShareholderID, Assets2
FROM MyTable (Yields appx. 200,000 records)
.
.
.
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9
FROM MyTable (Yields appx. 200,000 records)

Updates and cursors just seem to be too slow.

So far I have done the following, but was wondering if anyone could think of a better way.
SELECT 6 million records that don't need to be deleted into a #TempTable
Use statements above to select into same #TempTable
DROP and recreate Original Table
SELECT 6 + 2 million records INTO original table.

This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
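
One set-based alternative, sketched below with hypothetical column names and keep/rewrite conditions, is to build the replacement table with SELECT ... INTO (minimally logged under the simple or bulk-logged recovery model) and then swap the names, which avoids both cursors and the round trip through a file:

SELECT ShareholderID, Assets1, Assets2, Assets3
INTO   dbo.MyTable_New
FROM   dbo.MyTable
WHERE  NeedsRewrite = 0          -- the ~6 million rows that stay as they are

INSERT INTO dbo.MyTable_New (ShareholderID, Assets1, Assets2, Assets3)
SELECT ShareholderID, Assets1 + Assets2 + Assets3, 0, 0    -- the rewritten ~2 million rows
FROM   dbo.MyTable
WHERE  NeedsRewrite = 1

EXEC sp_rename 'dbo.MyTable', 'MyTable_Old'
EXEC sp_rename 'dbo.MyTable_New', 'MyTable'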


Any comments are appreciated,

-Marc

View Replies !   View Related
Error: 0xC002F304 At Bulk Insert Task, Bulk Insert Task: An Error Occurred With The Following Error Message: &&"Cannot Fetch A Row
 
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:

 
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly.Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".

Task failed: Bulk Insert Task
 
In SSMS I am able to issue the following command and the data loads into a TableName table with no error messages:
BULK INSERT TableName
    FROM 'C:DataDbTableName.bcp'
   WITH (DATAFILETYPE='widenative');

 
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file in bcp widenative format. The properties of the Bulk Insert Task are the following:
DataFileType: DTSBulkInsert_DataFileType_WideNative
RowTerminator: {CR}{LF}
 
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help.
Paul

 

View Replies !   View Related
Questions About Bulk Copy Insert Using 'Memory Based Bulk Copy Operations'
Hi~,

Before implementing a memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know some considerations:

- performance : compared with T-SQL's "BULK INSERT ..." and the bcp utility

- SQL Server's resource usage : how a memory-based bulk copy affects server resources while it runs

- server side action (behavior) : when the server is busy, does delayed update mean the IRowsetFastLoad::Commit(true) method can still insert right away?

- row count : is there a limit on the number of rows that can be inserted by IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit?

- any other guidelines

View Replies !   View Related
FIELDTERMINATOR
I want to use BULK INSERT but my FIELDTERMINATOR is ASCII 1. The default is '\t' (tab). How do I specify a different ASCII character as a field terminator?
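
One way to pass a non-printable character such as ASCII 1, sketched here with a hypothetical table and path, is to build the statement dynamically and splice the character in with CHAR(1):

DECLARE @sql varchar(1000)
SET @sql = 'BULK INSERT dbo.MyTable FROM ''C:\data\input.txt'' '
         + 'WITH (FIELDTERMINATOR = ''' + CHAR(1) + ''', ROWTERMINATOR = ''\n'')'
EXEC (@sql)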

View Replies !   View Related
Can I Insert/Update Large Text Field To Database Without Bulk Insert?
I have a web form with a text field that needs to take in as much as the user decides to type and insert it into an nvarchar(max) field in the database behind.  I've tried using the new .WRITE() method in my update statement, but it cuts off the text after a while.  Is there a way to do this insert/update in SQL 2005 without resorting to Bulk Insert? It bloats the transaction log, and turning the logging off requires a call to sp_dboptions (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
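
A sketch of one alternative that avoids re-sending the whole value: the UPDATE ... .WRITE clause on an nvarchar(max) column in SQL 2005 appends a chunk in place when the offset argument is NULL (table, column and key names here are hypothetical, and the column must already be non-NULL for .WRITE to apply):

DECLARE @chunk nvarchar(max), @DocId int
SET @chunk = N'...the next piece of the user''s text...'
SET @DocId = 1

UPDATE dbo.Documents
SET    Body.WRITE(@chunk, NULL, 0)   -- NULL offset appends @chunk to the end of Body
WHERE  DocumentId = @DocId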

View Replies !   View Related
How To Insert Data From A File Into Table Having Two Columns-BULK INSERT
 

Hi,
i have a file which consists data as below,
 
3
123||
456||
789||
 
I am reading the file using BULK INSERT and inserting these phone numbers into a table having one column, as below.
 

BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')
 
but I want to insert the data into a table having two columns. If I try to insert the data into a table having two columns, it does not insert.
 
can anyone help me how to do this?
 
Thanks,
-Badri
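
One workaround, sketched below with hypothetical names, is to keep the single-column BULK INSERT shown above but point it at a staging table, then copy the rows into the two-column table and supply whatever the second column should contain:

CREATE TABLE #PhoneStage (Phone varchar(20))

BULK INSERT #PhoneStage
FROM 'FILE_PATH'
WITH (KEEPNULLS, FIRSTROW = 2, ROWTERMINATOR = '||')

INSERT INTO TWO_COLUMN_TABLE (PhoneNumber, SecondColumn)
SELECT Phone, NULL        -- replace NULL with whatever the second column should hold
FROM   #PhoneStage

DROP TABLE #PhoneStage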
 

View Replies !   View Related
Import Text Files - Sql 7 - "FieldTerminator"
Hi ppl,
I'm trying to import a fixed-length text file into a SQL 7.0 table using the SQL command "BULK INSERT".

The text file is fixed length and each record ends with a carriage return and line feed. Fields aren't delimited.

Which definition of the "FIELDTERMINATOR" should I use?

I've used "" and it doesn't work.

I need some Help here!!!
Thanks.
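
For a fixed-length file there is no FIELDTERMINATOR that fits; the usual route is a bcp format file that gives every field a fixed length and puts the CR/LF on the last field. A minimal sketch for a hypothetical layout of two fields of 10 and 8 characters (a SQL 7.0-style format file, so no trailing collation column):

fixed.fmt:

7.0
2
1   SQLCHAR   0   10   ""        1   field_one
2   SQLCHAR   0   8    "\r\n"    2   field_two

BULK INSERT dbo.FixedWidthTable
FROM 'C:\data\fixed.txt'
WITH (FORMATFILE = 'C:\data\fixed.fmt')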

View Replies !   View Related
Insert Trigger For Bulk Insert
In case of a bulk insert, does the "FOR INSERT" trigger fire for each record or only once?
Thanks,
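
For background: BULK INSERT does not fire insert triggers at all unless FIRE_TRIGGERS is specified, and when it is, an AFTER INSERT trigger fires once per batch (with all the loaded rows visible in the inserted table), not once per row. A sketch with hypothetical names:

BULK INSERT dbo.MyTable
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRE_TRIGGERS)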

View Replies !   View Related
Compare BULK INSERT Vs INSERT
Hello,
I am wondering whether the transaction log is written differently for BULK INSERT vs INSERT. Performance-wise, which operation is generally faster given the same amount of data inserted?
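
As a point of comparison: under the bulk-logged or simple recovery model, BULK INSERT with TABLOCK into a heap (or an empty table) can be minimally logged, while an ordinary INSERT is fully logged row by row, which is usually why the bulk path wins for large loads. A sketch with hypothetical names:

BULK INSERT dbo.Staging
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)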

Sincerely,
-Lawrence

 

View Replies !   View Related
Questions About Memory Based Bulk Copy Operation(InsertRow Count,array Insert Directly,set Memory Based Bulk Copy Option)
Hi~, I have 3 questions about memory based bulk copy.

1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)?
For example, how many rows can be inserted in the sample below (i.e., the maximum value of nCount)?
for(i=0 ; i<nCount ; i++)
{
    pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}

2. In the above code sample, is there a method for inserting a prepared array all at once (the BulkData array), instead of using the for loop?

3. In OLE DB memory-based bulk copy, what is the equivalent of the T-SQL bulk copy options below?
BULK INSERT database_name.schema_name.table_name FROM 'data_file' WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);

-------------------------------------------------------
My solution is like this. Is it correct?

// CoCreateInstance(...);
// Data source
// Create session

 m_TableID.uName.pwszName = m_wszTableName;
 m_TableID.eKind = DBKIND_NAME;

 DBPROP rgProps[1];
 DBPROPSET PropSet[1];

 rgProps[0].dwOptions  = DBPROPOPTIONS_REQUIRED;
 rgProps[0].colid   = DB_NULLID;
 rgProps[0].vValue.vt  = VT_BSTR;
 rgProps[0].dwPropertyID  = SSPROP_FASTLOADOPTIONS;
 rgProps[0].vValue.bstrVal = L"ROWS_PER_BATCH = 10000,TABLOCK";

 PropSet[0].rgProperties  = rgProps;
 PropSet[0].cProperties  = 1;
 PropSet[0].guidPropertySet = DBPROPSET_SQLSERVERROWSET;

 if(m_pIOpenRowset)
 {
  if(FAILED(m_pIOpenRowset->OpenRowset(NULL,&m_TableID,NULL,IID_IRowsetFastLoad,1,PropSet,(LPUNKNOWN*)&m_pIRowsetFastLoad)))
  {
   return FALSE;
  }
 }
 else
 {
  return FALSE;
 }

View Replies !   View Related
Do Not Keep NULLS Using SSIS Bulk Insert Task - Insert Empty Strings Instead Of NULLS
I have two SSIS packages that import from the same flat file into the same SQL 2005 table.  I have one flat file connection (to a comma delimited file) and one OLE DB connection (to a SQL 2005 Database).  Both packages use these same two Connection Managers.  The SQL table allows NULL values for all fields.  The flat file has "empty values" (i.e., ,"", ) for certain columns.

The first package uses the Data Flow Task with the "Keep nulls" property of the OLE DB Destination Editor unchecked.  The columns in the source and destination are identically named thus the mapping is automatically assigned and is mapped based on ordinal position (which is equivalent to the mapping using Bulk Insert).  When this task is executed no null values are inserted into the SQL table for the "empty values" from the flat file.  Empty string values are inserted instead of NULL.

The second package uses the Bulk Insert Task with the "KeepNulls" property for the task (shown in the Properties pane when the task in selected in the Control Flow window) set to "False".  When the task is executed NULL values are inserted into the SQL table for the "empty values" from the flat file.

So using the Data Flow Task " " (i.e., blank) is inserted.  Using the Bulk Insert Task NULL is inserted (i.e., nothing is inserted, the field is skipped, the value for the record is omitted).

I want to have the exact same behavior on my data in the Bulk Insert Task as I do with the Data Flow Task.
 
Using the Bulk Insert Task, what must I do to have the Empty String values inserted into the SQL table where there is an "empty value" in the flat file?  Why & how does this occur automatically in the Data Flow Task?
 
From a SQL Profiler trace comparison of the two methods I do not see where the syntax of the insert command nor the statements for the preceding captured steps has dictated this change in the behavior of the inserted "" value for the recordset.  Please help me understand what is going on here and how to accomplish this using the Bulk Insert Task.
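
One detail that may explain the difference: when KEEPNULLS/KeepNulls is off, BULK INSERT gives an empty field the column's DEFAULT if one exists and NULL otherwise. So one way to get '' out of the Bulk Insert Task, sketched with hypothetical column and constraint names, is to give the affected columns an empty-string default and leave KeepNulls set to False:

ALTER TABLE dbo.ImportTarget
ADD CONSTRAINT DF_ImportTarget_SomeColumn DEFAULT '' FOR SomeColumn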

View Replies !   View Related
Something Like A Bulk Insert...
Hi, I've an SP that inserts records in one table and then calls another insert SP on a second table. The first table is like a master table and the second is like a child table. After inserting the right record in the master table, I have to insert some records in the child table. These records differ from each other only by two of about ten fields, so what I'd want is not to call the second SP X times, but only one time. Is it possible?

Example:
Table1: Id (identity), Desc;
Table2: Id (identity), Id_table1, Id_TableX, Num, Field1, Field2, ... Field10.

In Table2 only Id_TableX and Num change every time... the others are all the same (for one record in Table1). How can I do this? Probably with a bulk insert and a bulk update?? But can I make a bulk xxx without a file?
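
A sketch of one way to do it without a file, using the table layout from the post (the variable values and the list of variations are hypothetical): a single INSERT ... SELECT in which the constant fields are repeated and only Id_TableX and Num come from the variation list.

DECLARE @Id_table1 int, @Field1 varchar(50), @Field2 varchar(50)
SELECT @Id_table1 = 1, @Field1 = 'same for every row', @Field2 = 'same for every row'

INSERT INTO Table2 (Id_table1, Id_TableX, Num, Field1, Field2)
SELECT @Id_table1, v.Id_TableX, v.Num, @Field1, @Field2
FROM  (SELECT 10 AS Id_TableX, 1 AS Num
       UNION ALL SELECT 11, 2
       UNION ALL SELECT 12, 3) AS v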

View Replies !   View Related
BULK INSERT
Hi friends, I am using the BULK INSERT command with my table name but I am facing an error. So:
is it possible to use BULK INSERT with a temporary table variable?
Please help me.
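
For reference, BULK INSERT cannot target a table variable, but it can target a local temporary table created in the same session; a sketch with hypothetical names:

CREATE TABLE #Import (Col1 varchar(50), Col2 varchar(50))

BULK INSERT #Import
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')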

View Replies !   View Related
Bulk Insert
Hi friends, I am trying a bulk insert using SQL Server 2000 with this code:

bulk insert xyz
from 'D:\authors.txt'
WITH (FIELDTERMINATOR = ',')

but it gives me an error saying:

Could not bulk insert because file 'D:\authors.txt' could not be opened. Operating system error code 21 (error not found).

I checked the file security; it has been given full control on the file. Can anyone give me an idea about operating system error code 21 (error not found)? Thanks.

View Replies !   View Related
Bulk SQL Insert
I am currently working on a simple page to insert 1.6 million UK postcode records into a SQL Server table. The table has three columns for the postcode, longitude coordinate and latitude coordinate. The data is sourced from a pipe (|) delimited txt file and inserted into the database using a FOR loop. The problem I have is that the page will hang after inserting only 10,000 records; the page displays either an invalid ViewState error or a page-cannot-be-found error.
Now I assume the ViewState error stems from the fact that there is a form on the page which simply contains a button to execute the script and a few labels to show the progress. But without the form and associated ViewState the insert still fails to complete... any ideas? Would I be better off running this on a thread, or should I just do it in stages and be patient? I have now modified the page to read the database on load and pick up from where it crashed.
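
An alternative to looping from the page at all, sketched here with hypothetical table and path names: let the server load the whole pipe-delimited file in a single BULK INSERT, which sidesteps the page lifetime and ViewState entirely.

BULK INSERT dbo.Postcodes
FROM 'C:\data\postcodes.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK)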

View Replies !   View Related
Bulk Insert
Sorry for the piece-by-piece nature of this post; I moved it from a dormant group to this one and it was 3 separate posts in the other group. Anyway...

I'm trying to bulk insert a text file of 10 columns into a table with 12. How can I specify which columns to insert to? I think format files are what I'm supposed to use, but I can't figure them out. I've also tried using a view, as was suggested on one of the many websites I've searched, but I clearly did that incorrectly as well.

--------------------------------
Update:
I'm working with the view, and I've got a view that contains the exact columns from the table I want. I ran my bulk insert command,

BULK INSERT Test..IV10401 FROM 'c:\bulkInsertFile2.txt'

and it returned the error:

Server: Msg 2601, Level 14, State 3, Line 1
Cannot insert duplicate key row in object 'IV10401' with unique index 'AK2IV10401'.
Note: Bulk Insert through a view may result in base table default values being ignored for NULL columns in the data file.
The statement has been terminated.

The AK2IV10401 key is comprised of 3 columns and I'm sure that each of my rows in the insert file is unique according to those three. What should I be checking for?

-----------------------
Update 2:
I can only successfully insert 1 row. It seems to be treating each row as an individual primary key when it should be treating them as composite keys. I cannot alter the table, since it was created by Great Plains Dynamics. Is there some sort of switch that I'm missing in my bulk insert statement, or can I suppress the errors?
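
For what it's worth, a non-XML format file maps each data-file field to a table column by the server column order, and table columns that are not referenced at all simply receive NULL or their default, which is how a 10-column file can feed a 12-column table. An abridged sketch with three hypothetical fields (a real one would list all ten):

columns.fmt:

8.0
3
1   SQLCHAR   0   0   "\t"      1   first_target_column    ""
2   SQLCHAR   0   0   "\t"      2   second_target_column   ""
3   SQLCHAR   0   0   "\r\n"    4   fourth_target_column   ""

BULK INSERT Test..IV10401
FROM 'c:\bulkInsertFile2.txt'
WITH (FORMATFILE = 'c:\columns.fmt')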

View Replies !   View Related
Bulk Insert
Hi,
I have a text file with this information:
-BEGIN----------------- tekst.txt-------------
10, "firstname", "lastname"
11, "Mette", "Larsen"
--| |--
6 000 000, "Michael", "Houmaark"
-END-------------------- tekst.txt-------------
I use this SQL query:
-BEGIN------------------SQL--------------
bulk insert tlf.dbo.bruger_data from 'C:\TEKST.txt'
with
(FIRSTROW = 1,
FIELDTERMINATOR = '";"',
ROWTERMINATOR = '"')
-END--------------------SQL--------------
But when the data is in the table it still has the " around the firstname and lastname.
What do I do?
Best Regards
Michael H

View Replies !   View Related
Almost There With Bulk Insert
I have BULK INSERT T-SQL statements that work for all of my basic data tables except for one.

Here is the problem statement (with obvious replacement of irrelevant path info):

BULK INSERT igbconts_tmp FROM 'C:\my_code_path\IGBCONTS.txt'
WITH (KEEPNULLS,
FORMATFILE = 'C:\my_data_path\contacts.fmt');

And here is the output from this statement:

Msg 8152, Level 16, State 14, Line 3
String or binary data would be truncated.
The statement has been terminated.
(0 row(s) affected)

This tells me precisely nothing about where the real problem lies. I am reluctant to post either the table definition or the format file since they are large (the table, and thus the data file, has 104 fields). However, the first few lines in the format file are:

8.0
105
1 SQLCHAR 0 0 "\"" 0 dummy ""
2 SQLCHAR 0 0 "\",\"" 1 contact_id ""
3 SQLCHAR 0 0 "\",\"" 2 full_name ""
4 SQLCHAR 0 0 "\",\"" 3 last_name ""

And here are the last couple lines:

104 SQLCHAR 0 0 "\",\"" 103 user_defined_field15 ""
105 SQLCHAR 0 0 "\"\n" 104 user_defined_field16 ""

The table was created using the string length information given to us by the data provider, and those fields that are not strings consist of a few datetime values and a moderate number of floating point numbers. The message suggests to me that one of the fields is too small for what was actually found in the corresponding column in the data file for at least one record. But in addition to there being over 100 columns, there are several thousand records in the data file!

How do I determine precisely where the problem lies?

Thanks,
Ted

View Replies !   View Related
Bulk Insert
Hi,

I am trying to do a data transfer using BULK INSERT from a .dat file, and the data is only one row.
The bulk insert is giving me the error:
"Bulk insert data conversion error (truncation) for row 1, column 11 (extension1)"
The line given below is the data as it appears in the .dat file.

'20030715','Logiciel','Taching, Inc.','108 Pierson Ave',,'Edison','NJ','08837','USA','732-603-7877',,,,,,,,,,,,,,,,,,,,'N','N','N','N','N','N',, ,2003/jul/15 11:09:33.718000,,,,


And that column 11 (the name of that column is extension1) has no data, and the datatype in SQL Server for that column is varchar(1).

I am not able to figure out why this error occurs. Can anyone help?
:mad:
Regards

View Replies !   View Related
Bulk Insert
I am trying to bulk insert into a database from a program in VC++ on a SQL 2000 database. My code was successful if I use the credentials of the SA user,
but I need to execute the same process with a different user which has DBO privileges on the database in which I have the table. Can anyone tell me what privileges need to be granted for this dbo user to make bulk insert work?
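
For reference, in SQL Server 2000 running BULK INSERT requires membership in the bulkadmin fixed server role (or sysadmin) on top of INSERT permission on the target table; a sketch, with the login name hypothetical:

EXEC sp_addsrvrolemember 'YourDomain\YourLogin', 'bulkadmin'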
thanks a lot for your help.
anna

View Replies !   View Related
BULK INSERT Maybe ???
I have a directory with files:

X.dbf
X.mdx
Y.dbf
Y.mdx
...

These files contain updates for my DB (I don't know their structure). How can I insert them into temporary tables on the SQL Server?

Note: I don't want to use the Import/Export Tool, because I will need this insert as scripts...

View Replies !   View Related
Bulk Insert
I'm trying run a bulk insert statement to insert data into an existing table:

Here is an example of the text file data:

"BEGIN_APP_YR","2001"
"BISP_EXPD_THRU","200512"
"BISP_ITER","PRELIM"
"BISP_LAST_PUB_DT","02/14/200612:41PM"
"BISP_YR","2007"
"BISP_YRS","3"
"END_APP_YR","2006"

This is the bulk insert statement I'm using:

BULK INSERT AFR.dbo.[BISM_CONFIG]
FROM 'c:\sqldefault\bism_config.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)


I'm getting a syntax error near ')'

Any help is appreciated.
Thanks,
-D-

View Replies !   View Related
Bulk Insert
I have a problem with a bulk insert import. I have a txt file with 3400 columns and a table with two fields:

data - varchar(5000)
id - numeric (identity yes - 1)

when I import the file the system return this error message:

Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.

my bulk insert sintaxe is:

BULK INSERT eflex.dbo.Import_Data
FROM 'C:HPHP_ReceiveFBSJ01D5.txt'
WITH (FIELDTERMINATOR = '')

Can anybody help me?

thanks,

View Replies !   View Related
Bulk Insert
Hi,

I want to use bulk insert with a text file to import data from it into the DB, but how can I ignore rows in the text file whose primary keys already exist in the DB table?

regards,

View Replies !   View Related
Bulk Insert
I'm trying to import data from a text file into my database. Each line of this text file must be inserted into 2 separate tables depending on characters in the line of text. Any ideas about how to do this without using a great deal of CPU would be appreciated.


thanks..... :confused:

View Replies !   View Related
Bulk Insert
I often need to insert multiple records into a certain table at once, such as:
DESCRIPTION AMOUNT
SALARY 8000
ALLOWANCE 100
CASH GIFT 400
FOOD ALLOWANCE 460

SCENARIO:
I want to insert at one time the example written above into the TBLcompensation table, which has the following fields:
ID (identity,1), DESCRIPTION, AMOUNT

QUESTION:
Is there another way to INSERT multiple records at one time?

MY SOLUTION:
I collected all the records and concatenated them into one string, with a special character separating fields and rows. Then I do the looping in the STORED PROCEDURE. I believe this is not an efficient way.
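
One set-based alternative that avoids both the concatenated string and the loop, sketched with the table and values from the post (the identity ID fills itself in):

INSERT INTO TBLcompensation (DESCRIPTION, AMOUNT)
SELECT 'SALARY', 8000
UNION ALL SELECT 'ALLOWANCE', 100
UNION ALL SELECT 'CASH GIFT', 400
UNION ALL SELECT 'FOOD ALLOWANCE', 460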

View Replies !   View Related
Bulk Insert Every 3rd Row
I am trying to bulk insert every third row of a file into a table. Is there a way that I can do it?

View Replies !   View Related
BULK INSERT
Hi,

I want to use the BULK INSERT statement instead of BCP to insert records into my table. These inserts are done daily. I want to call the statement from VB, and the data file will be on a file server.

I am getting the following problems.

Cannot find file - even though the network drive is mapped, does the drive have to be mapped under a particular account?
The user has to be a member of sysadmin.

Any ideas ???
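
On the 'cannot find file' part: BULK INSERT opens the file from the SQL Server machine, not from the client running the VB code, so drive letters mapped on the client are invisible to it; a UNC path that the server (and the account SQL Server runs under) can reach is usually the safer choice. A sketch with hypothetical names:

BULK INSERT dbo.DailyLoad
FROM '\\FileServer\Share\daily.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')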

View Replies !   View Related
Bcp /bulk Insert!!!
1> Does anyone know how to use 'bulk insert' within a STORED PROCEDURE?

2> The bcp utility is working fine from the command line,
but somehow it does not work for me in the MS Query Analyzer (using the system procedure xp_cmdshell to run the same command as above). xp_cmdshell works fine with other executables, but I don't know why the process seems to run endlessly when the bcp utility is called with it.

I appreciate for your time and reply.

Regards

View Replies !   View Related
BCP-vs-Bulk Insert-vs-DTS
We've been using Bulk Insert to load our tables. But, recently we encountered this error message "There is insufficient system memory to run this query. [SQLSTATE 42000] (Error 701). The step failed."

Then, our DBA suggested that we use BCP. It seems to work fine until the file size exceeds 30MB. This is the message I get "Starting copy... 0 rows copied. Network packet size (bytes): 4096 Clock Time (ms.): total 14984. Process Exit Code 0. The step succeeded." Is this a known problem?

Then, we decided to use DTS. DTS seems to be able to handle any file size but it's a slower process than the other 2. Any suggestions?

View Replies !   View Related
BULK INSERT
I am trying use the BULK INSERT command without any success.

I have a flat file called names.txt which contains :

Anthony,Kieran,29
Reggie,Patel,30
Benny,Kieran,25
Sinead,McParland,28
Paula,Daly,29
Derek,Daly,30

I created a blank table called NAMES
fname,varchar,30
lName,varchar,30
age,numeric,9

In SQL Query Analyzer I run this command :

BULK INSERT names
FROM 'X:\INETPUB2000\names.txt'
WITH (FIELDTERMINATOR = ',',
ROWTERMINATOR = '|',
DATAFILETYPE='CHAR'
)

I get the "Command Completed Successfully" reply but NO records are created in the Table??? Can anyone please help me.

Thanks in advance,

Anthony

View Replies !   View Related
Bulk Insert
We have an application which performs ODBC bulk inserts into a number of tables, and the process is repeated many times. We have discovered a curious pattern: every third sequence of insertions takes 4 times as long, and one table specifically takes 15 times as long.
For example, the first and second time the insert process takes 10 seconds and this table A takes 2 seconds (about 5000 rows); then the third time, the total is 30 seconds and table A takes 22 seconds. The 4th and 5th time go back to normal (around 10 seconds); the 6th time, again it takes more than 30 seconds. This table is in a separate filegroup.
Is there some area I can look at to pinpoint the cause? Urgent help is needed!!

View Replies !   View Related
Bulk Insert
Can anyone help me with how to do a bulk insert from a text file into SQL Server, some facility similar to
Oracle's SQL*Loader? Thank you in advance.

View Replies !   View Related
Bulk Insert
I am trying to do a bulk insert from a data file into a linked access database. When I run the query I get the error message:
'Server: Msg 4801, Level 16, State 81, Line 1
Bulk_main: The opentable system function on BULK INSERT table failed.
Not sure what the problem is because BOL just says to check Microsoft.com for updated error message information. However, when I went to the site there was no updated information. Has anyone else seen this error? If so, have you figured out the problem? Any help would be greatly appreciated. Thanks.


Jim

View Replies !   View Related
Bulk Insert
howdy:

I keep getting the following error msg:
<<Bulk insert data conversion error (truncation) for row 17, column 6 (col6).>>

I created a test file of 100 rows in EXCEL and saved as tab-delimited and then I run the following from the query analyzer:

bulk insert ssdi..dmf FROM 'C:\dev\pb6\ccs\ssdi\test.txt'
WITH
(DATAFILETYPE='char',
FIELDTERMINATOR='\t',
ROWTERMINATOR='\n',
KEEPNULLS,
MAXERRORS=10,
TABLOCK)


here's my table:
create table dmf
( col1 char(9) not null,
col2 char(12) null ,
col3 char(10) null ,
col4 char(8) null ,
col5 char(8) null ,
col6 char(12) null ,
col7 char(12) null ,
constraint pk_dmf primary key (ssn))

note:
I don't always have values in the last 2 columns
I installed SP2.

TIA
deanna

View Replies !   View Related
Please, Help For Bulk Insert
Hello everybody,

I've got a table
CREATE TABLE dbo.items
(
[id] integer NOT NULL,
[name] char(40) NULL,
type char(10) NULL,
PRIMARY KEY (id)
)
go

And I do

BULK INSERT dbo.ITEMS FROM 'C:\AXEL\UNLOAD\193.dat' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

Here are a few extracted lines from 193.dat:
1,'comStatGen','statstable'
2,'histoChart','zoomchart'
3,'comTrafWeek','statstable'

I can't find how to suppress the quotes, so it doesn't fit type char(10); it only fits type char(12) with the quotes included.

Does somebody know how to proceed? I must suppress these quotes!!!
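
If the quotes cannot be kept out of the load with a single FIELDTERMINATOR, one simple follow-up, sketched with the table and columns from the post, is to strip them after the fact:

UPDATE dbo.items
SET    [name] = REPLACE([name], '''', ''),
       [type] = REPLACE([type], '''', '')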
Thanks
Axel

View Replies !   View Related
Bulk Insert Vs BCP
Hi,

Which is a faster method -- bulk insert or BCP

And I assume DTS has no problem handling either one? The input files are between 400 meg and 1 gig.

Thanks,
Judith

View Replies !   View Related
Does DTS Do Bulk Insert?
Hi,

I want to move data from a text file to a SQL table. After DTS creates the table, does it use Bulk Insert to copy the data from the file to the table, or BCP?

Thanks,
Judith

View Replies !   View Related
Bulk Insert ?
How does one perform a bulk insert?
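
A minimal sketch, with the table name, path and delimiters all hypothetical:

BULK INSERT dbo.MyTable
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')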

View Replies !   View Related
Bulk Insert ?
How can I do a bulk insert without the transaction being logged?
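
Bulk loads are never completely unlogged, but they can be minimally logged. In SQL Server 7.0 that means enabling the 'select into/bulkcopy' database option; from SQL Server 2000 on, the BULK_LOGGED recovery model is the equivalent, combined with TABLOCK on the load. A sketch (database name hypothetical):

EXEC sp_dboption 'MyDatabase', 'select into/bulkcopy', 'true'
-- or, on SQL Server 2000 and later:
ALTER DATABASE MyDatabase SET RECOVERY BULK_LOGGED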

View Replies !   View Related
Bulk INsert
Importing a text file to a SQL Server 7.0 table using DTS works fine. But

BULK INSERT ABCD
FROM 'S:\abcd.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')
Fails And Gives Following Error
----------------------
Server: Msg 4866, Level 17, State 66, Line 1
Could not bulk insert because column too long in data file.
Make sure FieldTerminator and RowTerminator were specified correctly.
-----------------------
What is the reason for this????????

View Replies !   View Related
DTS And Bulk Insert
I have a directory with approximately 1,000 .txt files which I need to import into a MS SQL 7 table. Any ideas on how to write the VBScript to loop and import every text file? I'm thinking that using a global variable will work.

View Replies !   View Related
SQL Bulk Insert Or BCP
Hi,

Does anyone know if it is possible when using SQL Bulk Insert or BCP will allow for a variable number of columns in the input csv file? Or is it a requirement for these two commands to have a fixed number of cols in the input file?

I'm using Bulk Insert with fieldterminator = '","' and rowterminator = '' for the delimited csv file, but my input file does not have a fixed number of cols, and is not padded out with the appropriate number of empty ",".

Is there a way around this?

Roger

View Replies !   View Related
BULK INSERT
Hi,
how can I write a stored procedure that uses the BULK INSERT command to import a CSV file into a database table? I have to append the records to the existing records, so how can I write this? Please help.

View Replies !   View Related
BULK INSERT
Hi, when I try to do a simple bulk insert
I get this error:

Operating system error code 1326 (Logon failure: unknown user name or bad password.).


Yet I can access the file fine when I enter the path into the Run box on the same SQL Server... ?

View Replies !   View Related
