Rows Per Batch Vs. Maximum Insert Commit Size


This has to do with an OLE DB destination with the fast load data access method.

1. I have an intuition about what rows per batch means, but I've read that -1 is the default value -- which means simply that it hasn't been set.  In that case, how many rows are buffered into each batch?

2. What are the units for maximum insert commit size (bytes, kilobytes, megabytes, gigabytes, ...)???

3. I'm working on a package for a large bulk load that has about a dozen concurrent data flow tasks, each of which uses the same OLE DB destination (well, a copy).  Of course I've disabled the destinations' table locks so these tasks don't deadlock.  Given that each of these data flows is writing several million pretty small rows, how should I set those options?  (The rows consist of three uniqueidentifier columns plus an nvarchar(868) column that's usually less than 20 characters.)

I can't afford the time to experiment and benchmark various settings -- I just want a theoretical understanding of how these options interact, especially in this concurrent data flow scenario.
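For question 3, here is a back-of-envelope sketch of the buffer arithmetic, not official SSIS behavior: it assumes the SSIS 2005 defaults of DefaultBufferSize = 10 MB and DefaultBufferMaxRows = 10,000, and assumes the pipeline reserves the nvarchar(868) column at its full declared width (868 * 2 bytes), regardless of the ~20 characters actually stored.

```python
# Rough estimate of how many rows fit in one SSIS pipeline buffer for the
# row shape described above. Assumptions (not taken from the post): the
# SSIS 2005 defaults DefaultBufferSize = 10 MB and DefaultBufferMaxRows =
# 10,000; per-row buffer overhead is ignored.

DEFAULT_BUFFER_SIZE = 10 * 1024 * 1024   # bytes
DEFAULT_BUFFER_MAX_ROWS = 10_000

def estimated_rows_per_buffer(row_width_bytes):
    """Rows per buffer = whichever limit is hit first: size or row count."""
    return min(DEFAULT_BUFFER_MAX_ROWS, DEFAULT_BUFFER_SIZE // row_width_bytes)

# 3 uniqueidentifier columns (16 bytes each) + nvarchar(868) at its
# declared width of 868 * 2 bytes.
row_width = 3 * 16 + 868 * 2
print(row_width)                          # 1784 bytes per row
print(estimated_rows_per_buffer(row_width))
```

Since "Maximum insert commit size" is applied per buffer (see the fast-load thread below in this page's related messages), the buffer size effectively caps how many rows any one commit can cover under these defaults.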

Thanks!


Related Messages:
Maximum Insert Commit Size
 

Hi all,

If I set the "Maximum insert commit size" to 10 (0 is the default) in an OLE DB destination, what does the 10 mean? 10 records, or 10 KB/MB of data?

Thanks

OLE DB Destination - Fast Load With Maximum Insert Commit Size
I'm seeing some strange behavior from the OLE DB Destination when using the "fast load" access mode and setting the "Maximum insert commit size".

When I do not set the "Rows per batch" or the "Maximum insert commit size", the package I'm working with inserts 123,070 rows using a single "insert bulk" statement.  The data seems to flow through the pipeline until it gets to the OLE DB Destination and then I see a short pause.  I'm assuming the pause is from the "insert bulk" statement handling all of the rows at once.

When I set the "Rows per batch" option but leave the "Maximum insert commit size" alone, I generally see the same behavior -- a single "insert bulk" statement that handles all 123,070 rows.  In this case, however, the "insert bulk" statement has a "ROWS_PER_BATCH" option appended that matches the "Rows per batch" setting.  This makes sense.  I'm assuming the "insert bulk" then "batches" the rows into multiple insert statements (although I'm unsure how to confirm this).  This version of the "insert bulk" statement appears to run in about the same time as the case above.

When I set the "Maximum insert commit size" option and leave the "Rows per batch" statement alone, I see multiple "insert bulk" statements being executed, each handling the lower of either the value I specify for the "Maximum insert commit size" or the number of rows in a single buffer flowing through the pipeline.  In my testing, the number of rows in a buffer was 9,681.  So, if I set the "Maximum insert commit size" to 5,000, I see two "insert bulk" statements for each buffer that flows into the OLE DB Destination (one handling 5,000 rows and one handling 4,681 rows).  If I set the "Maximum insert commit size" to 10,000, I see a single "insert bulk" statement for each buffer that flows into the OLE DB Destination (handling 9,681 rows).
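The commit behavior described in that paragraph can be sketched as follows: with fast load, the OLE DB Destination receives one pipeline buffer at a time, and "Maximum insert commit size" (MICS) caps how many of that buffer's rows go into each "insert bulk" batch. The numbers below are the ones observed in the post; the function is an illustration of the observed splitting, not the actual SSIS implementation.

```python
def insert_bulk_batches(buffer_rows, max_commit_size):
    """Split one pipeline buffer's rows into "insert bulk" batch sizes,
    matching the behavior observed in the post."""
    if max_commit_size <= 0:            # 0 = commit the whole buffer at once
        return [buffer_rows]
    batches = []
    remaining = buffer_rows
    while remaining > 0:
        batches.append(min(max_commit_size, remaining))
        remaining -= batches[-1]
    return batches

print(insert_bulk_batches(9681, 5000))   # two "insert bulk" statements
print(insert_bulk_batches(9681, 10000))  # one "insert bulk" statement
```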

Now the problem.  When I set the "Maximum insert commit size" as described in the last case above, I see LONG pauses between buffers being handled by the OLE DB Destination.  For example, I might see one buffer of data flow through (and be handled by one or more "insert bulk" statements based on the "Maximum insert commit size" setting), then see a 2-3 minute pause before the next buffer of data is handled (with its one or more "insert bulk" statements being executed).  Then I might see a 4-5 minute pause before the next buffer of data is handled.  The pause between the buffers being passed through the OLE DB Destination (and handled via the "insert bulk" statements) is sometimes shorter, sometimes longer.

Using Profiler, I don't see any other activity going on within the database or within SQL Server itself that would explain the pauses between the buffers being handled by the OLE DB Destination and the resulting "insert bulk" statements...

Can anyone explain what is going on here?  Is setting the "Maximum insert commit size" a bad idea?  What are the differences between it and the "Rows per batch" setting and what are the recommended uses of these two options to try to improve the performance of the insert (particularly when handling millions of rows)?

TIA for any thoughts or information...

Dave Fackler

Batch Insert 10000 Rows At A Time And Commit
I want to insert 10,000 rows at a time and commit in SQL Server. Is there a way to do it if the source tables have no ID fields? What would be the most efficient method?
Thanks,
Ajay

Recommendation For Max Insert Commit Size
The DBA is not around and I would like to see if someone had a good recommendation on what the Maximum insert commit size (MICS) should be for an OLE DB Destination where the default of ZERO is not being used.
 
I want to use Fast Load and I want to use Redirect Row to catch the errors.  I just performed a test where the OLE DB Destination was NOT set to Fast Load - it took FOREVER and I cannot have this kind of performance.
 
I know that this may be totally dependent on what is being inserted, but is there any problem with just setting this value to, say, 800,000?
 
The destination SQL database's recovery mode is set to SIMPLE as it is not a transactional database.
 
Suggestions??  Thx

Replication Row Commit Size
Is there a way to get replication to commit records in batches instead of all at once? I am in a 24/7 shop, and some of my updates end up being thousands of rows, which locks the subscriber table for a few minutes sometimes. If I could get it to commit, say, every 1000 rows it might give me some relief in this area.

Or am I thinking about this wrong? If this is possible, would it help at all?

How To Commit At Each N Rows
Hi, I'm trying to commit 100,000 rows in a single batch. Using an OLE DB adapter with the fast load option into SQL Server, even though I set the rows per batch to 100,000 and the max insert commit size to 100,000, SSIS does not seem to commit 100,000 rows at a time; it commits at seemingly random intervals of roughly 2,000 to 5,000 rows.

Is there a way to tell SSIS to commit rows at a fixed number of rows?

Thank you,

Ccote

Maximum Table Size?
Hello,

Does anyone know what the maximum table size is?
I'm asking because I have a feeling that DBCC NEWALLOC and DBCC CHECKTABLE report errors more often on larger tables.
Just recently I received error 2529 on a table with over 17 million rows that is about 3.9 GB in size.

Thanks,
Stef

Maximum Size Of OPENXML IN SP
 

How much data can we pass as XML text into a stored procedure using OPENXML?

 

Maximum Possible Database Size
I posted this in another area and didn't get an answer, so maybe I posted it in the wrong place.  Forgive me if you've seen this twice.
 

I'm trying to figure out what the ultimate size limitation for a SQL 2005 Enterprise server is.  This document is helpful but I'm a bit confused:
 
http://msdn2.microsoft.com/en-us/library/ms143432.aspx
 
In the document, it says that the maximum database size is 524,258 terabytes; however, it also says that the maximum data file size--which I assume is the .MDF file--is 16 terabytes.  My question is, how can you create a 524,258 TB database if the maximum file size is 16 TB?
 
Dumb question, I'm sure...please enlighten me!
 
Norm
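The arithmetic behind the two limits can be reconciled: a database is not limited to a single file, it can span many data files (one .mdf plus .ndf secondary files). If I recall the capacity specifications correctly, SQL Server allows up to 32,767 files per database, which lines up with the document's numbers:

```python
import math

MAX_DB_SIZE_TB = 524_258       # maximum database size quoted in the post
MAX_DATA_FILE_TB = 16          # maximum size of a single data file

# Number of 16 TB data files needed to reach the database size limit.
files_needed = math.ceil(MAX_DB_SIZE_TB / MAX_DATA_FILE_TB)
print(files_needed)            # 32767
```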

Want To Commit Every 5000 Rows???? NEW To SQL7
I have attached code that does a delete and insert from my staging database to my production database. It works fine; however, it commits after every row. I would like to get it to commit every 5000 rows. Your help is appreciated.

Thanks,

Eli


ALTER PROCEDURE test_delete AS

DECLARE @evocxx_id char(10)

DECLARE existing_evocxx CURSOR FOR
SELECT a.id
FROM [stage_cds]..cds_evocxx a, [cds]..cds_evocxx b
WHERE a.id = b.id

OPEN existing_evocxx
FETCH NEXT FROM existing_evocxx INTO @evocxx_id
WHILE @@FETCH_STATUS = 0
BEGIN
    DELETE FROM [cds]..cds_evocxx
    WHERE id = @evocxx_id

    FETCH NEXT FROM existing_evocxx INTO @evocxx_id
END
CLOSE existing_evocxx
DEALLOCATE existing_evocxx

Maximum Datafile/log File Size
Is there any limit to the maximum size of a data file or transaction log you can have with SQL Server 2000 on Windows 2000? Also, is there a maximum size that should be adhered to for performance and admin reasons?

How To Change The Maximum Allowable Size Of A Row In SQL 6.5
Hi,

Can anyone tell me how to solve the problem below, which occurs when I run the query
"Select * from table_name order by column_name":

Msg 416, Level 16, State 1
Create of work table failed because the row size would be 2298.
This exceeds the maximum allowable size of a row in a table, 2014.

Replication Maximum Buffer Size
Hi,

I'd like to replicate a SQL Server database to an SDF file. For simplicity I want to use the SQL Server 2005 Management Console. The console reports that the maximum buffer size is too small. In the comment (C# code) I can see it is set to 512. How can I increase the value in the replication assistant?

Miroslaw

Cannot Set Maximum Database Size By DBPROP_SSCE_MAX_DATABASE_SIZE;
I'm trying to limit the database size to 2 MB with the following code, but it doesn't work. Could somebody help me with it?
Thanks a lot!
 
Part of my code is:
 
VariantInit(&dbprop[0].vValue);
VariantInit(&dbprop[1].vValue);
VariantInit(&dbprop[2].vValue);
VariantInit(&dbprop[3].vValue);

// Create an instance of the OLE DB Provider
hr = CoCreateInstance(CLSID_SQLSERVERCE_3_0,
                      0,
                      CLSCTX_INPROC_SERVER,
                      IID_IDBInitialize,
                      (void**)&pIDBInitialize);
if (FAILED(hr))
{
    goto Exit;
}

// Initialize a property with name of database
dbprop[0].dwPropertyID = DBPROP_INIT_DATASOURCE;
dbprop[0].dwOptions = DBPROPOPTIONS_REQUIRED;
dbprop[0].vValue.vt = VT_BSTR;
dbprop[0].vValue.bstrVal = SysAllocString(DATABASE_LOG);
if (NULL == dbprop[0].vValue.bstrVal)
{
    hr = E_OUTOFMEMORY;
    goto Exit;
}

// Initialize property with open mode for database
dbprop[1].dwPropertyID = DBPROP_INIT_MODE;
dbprop[1].dwOptions = DBPROPOPTIONS_REQUIRED;
dbprop[1].vValue.vt = VT_I4;
dbprop[1].vValue.lVal = DB_MODE_READ | DB_MODE_WRITE;

// Set max database size
dbprop[2].dwPropertyID = DBPROP_SSCE_MAX_DATABASE_SIZE;
dbprop[2].dwOptions = DBPROPOPTIONS_REQUIRED;
dbprop[2].vValue.vt = VT_I4;
dbprop[2].vValue.lVal = 2; // 2MB

// Set max size of temp. database file to 2MB
dbprop[3].dwPropertyID = DBPROP_SSCE_TEMPFILE_MAX_SIZE;
dbprop[3].dwOptions = DBPROPOPTIONS_REQUIRED;
dbprop[3].vValue.vt = VT_I4;
dbprop[3].vValue.lVal = 2; // 2MB

// Initialize the property set
dbpropset[0].guidPropertySet = DBPROPSET_DBINIT;
dbpropset[0].rgProperties = dbprop;
dbpropset[0].cProperties = sizeof(dbprop)/sizeof(dbprop[0]);

// Get IDBDataSourceAdmin interface
hr = pIDBInitialize->QueryInterface(IID_IDBDataSourceAdmin, (void**)&pIDBDataSourceAdmin);
if (FAILED(hr))
{
    goto Exit;
}

// Create and initialize data store
hr = pIDBDataSourceAdmin->CreateDataSource(1, dbpropset, NULL, IID_IUnknown, &pIUnknownSession);
if (FAILED(hr))
{
    goto Exit;
}

What Is Maximum Database Size In The SSCE?
Hi,

 

I've found two different answers to this question:

one on the http://support.microsoft.com/Default.aspx?kbid=920700 site, where the Performance Improvements section gives a value of 128 MB for database size;

the other in the product datasheet, which says this version supports databases up to 4 GB.

Could you tell me which is the correct answer?

 

Regards,

Mariouche

Maximum SQL 2005 Database Size
 

Hello! I'm trying to figure out what the ultimate size limitation for a SQL 2005 Enterprise server is.  This document is helpful but I'm a bit confused:
 
http://msdn2.microsoft.com/en-us/library/ms143432.aspx
 
In the document, it says that the maximum database size is 524,258 terabytes; however, it also says that the maximum data file size--which I assume is the .MDF file--is 16 terabytes.  My question is, how can you create a 524,258 TB database if the maximum file size is 16 TB?
 
Dumb question, I'm sure...please enlighten me!
 
Norm

Does The IDENTITY Field Type In SQL Have A Maximum Size To It?
Does the IDENTITY field type in SQL have a maximum size to it?
 
You know like int only goes so high up,
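As far as I know, IDENTITY has no ceiling of its own: it tops out at the range of the column's data type. A quick sketch of the SQL Server integer type maximums (standard two's-complement ranges; note IDENTITY can also use decimal/numeric with scale 0 for even larger ranges):

```python
# Maximum values for SQL Server integer types usable with IDENTITY.
# These are the ordinary signed/unsigned two's-complement limits.
type_max = {
    "tinyint":  2**8 - 1,     # tinyint is unsigned: 0..255
    "smallint": 2**15 - 1,    # 32,767
    "int":      2**31 - 1,    # 2,147,483,647
    "bigint":   2**63 - 1,    # 9,223,372,036,854,775,807
}
print(type_max["int"])
print(type_max["bigint"])
```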

6.5 SQL:Updated Or Inserted Row Is Bigger Than Maximum Size
Microsoft OLE DB Provider for ODBC Drivers error '80040e14'

[Microsoft][ODBC SQL Server Driver][SQL Server]Updated or inserted row is bigger than maximum size (1962 bytes) allowed for this table.

How can I solve this problem? Please help!

Row Is Bigger Than Maximum Size (1962 Bytes)
Microsoft OLE DB Provider for ODBC Drivers error ' 80040e14'

[Microsoft][ODBC SQL Server Driver][SQL Server]Updated or inserted row is bigger than maximum size (1962 bytes) allowed for this table.

Database: Microsoft SQL Server 6.5

How can I solve this problem?
Thanks, Shay

Maximum Size Of The SSIS String Variable?
In SSIS, what is the maximum size of the String variable (one that you would define in the Variables dialog box)?
 
 

VS 2005 REPORT MODEL MAXIMUM ENTITY SIZE
Hi,

I'm opening a 3 MB Report Model on a machine with 256 MB of memory and it cannot load the Report Model. I tried it on a machine with 1.5 GB of memory and was able to open it.

But we need to open a 14 MB Report Model. When we tried to open it with 1.5 GB of memory, we were not able to do so. Does the machine's memory determine how large a Report Model can be loaded?

How To Know The Table Having Maximum Rows
Dear Experts,
I have one database with around 1,400 tables.
Is there any way to find the count(*) of each table all at once? I need to identify the tables holding the most data.


my expected result is like this
table num_rows
table1 20000
table2 10000


like this
thank you very much

Vinod
Even you learn 1%, Learn it with 100% confidence.

Warning: The Table 'PropertyInstancesAudits' Has Been Created But Its Maximum Row Size (8190) Exceeds
Hi All
 
I am running a script which has a table creation. The table gets created, but with the below warning.
 

Warning: The table 'PropertyInstancesAudits' has been created but its maximum row size (8190) exceeds the maximum number of bytes per row (8060). INSERT or UPDATE of a row in this table will fail if the resulting row length exceeds 8060 bytes.
 
Structure is as under:




Code Snippet

CREATE TABLE [dbo].[PropertyInstancesAudits] (
 [PIA_ClassID] [uniqueidentifier] NOT NULL ,
 [PIA_ClassPropertyID] [uniqueidentifier] NOT NULL ,
 [PIA_InstanceID] [uniqueidentifier] NOT NULL ,
 [PIA_Value] [sql_variant] NOT NULL ,
 [PIA_StartModID] [bigint] NOT NULL ,
 [PIA_EndModID] [bigint] NOT NULL ,
 [PIA_SuserSid] [varbinary] (85) NULL
) ON [PRIMARY]
GO
 
 


How should I get rid of this?
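A back-of-envelope check of where the 8,190-byte figure comes from (the per-column sizes below are standard SQL Server storage sizes; exact per-row overhead such as the row header, null bitmap, and variable-length offsets is not itemized here and accounts for the remaining bytes). The dominant cost is sql_variant, whose maximum in-row size is about 8,016 bytes; shrinking or retyping PIA_Value is the usual way to make the warning go away.

```python
# Declared maximum data bytes per row for the table above.
column_bytes = {
    "PIA_ClassID (uniqueidentifier)":         16,
    "PIA_ClassPropertyID (uniqueidentifier)": 16,
    "PIA_InstanceID (uniqueidentifier)":      16,
    "PIA_Value (sql_variant)":              8016,  # max in-row sql_variant size
    "PIA_StartModID (bigint)":                 8,
    "PIA_EndModID (bigint)":                   8,
    "PIA_SuserSid (varbinary(85))":           85,
}
declared = sum(column_bytes.values())
print(declared)                # declared data alone, before row overhead
print(declared > 8060)         # already past the 8,060-byte per-row limit
```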

Maximum # Of Rows In SQL Server Express 2005
Hi,
 
I have a DB that has been running for a while, but it reached 32,767 rows and stopped. The DB is only 20 MB. Where can I adjust this setting to allow a bigger number of rows?
 
thxs.
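One observation worth checking before hunting for a server setting: SQL Server 2005 Express caps database size (4 GB) but, as far as I know, has no row-count cap, and a hard stop at exactly 32,767 rows is the maximum of a signed 16-bit integer. That pattern suggests a smallint key column or a 16-bit counter in the client application overflowing, rather than a database limit. This is a guess from the number alone, not a diagnosis.

```python
# 32,767 is exactly the largest value a signed 16-bit integer (or a
# SQL Server smallint) can hold, which is why hitting it precisely is
# suspicious.
SIGNED_16_BIT_MAX = 2**15 - 1
print(SIGNED_16_BIT_MAX)   # 32767
```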

Error: The Number Of Failing Rows Exceeded The Maximum Specified,Help !
Hello,

I am running a DTS (SQL 2000) and transferring data from a SQL Server database to an Ingres database. I am transferring 153k rows. Execution fails with the error message: "The number of failing rows exceeded the maximum specified".

The number of rows shouldn't be a problem, because in another DTS I copy a lot more rows. The width of the row shouldn't be a problem either; I'm transferring only 3 int4 fields and 1 char(5) field.

Anyone have an idea what the problem could be?

Thanks,
David Greenberg

What Is The Maximum Number Of Rows Retrieve From A Table To Reports
Hi friends,

I got an error while retrieving more than 100,000 rows (records) from a table. Can anyone tell me what the maximum number of rows is that can be retrieved from a database into a report, and how can I overcome this issue?

 

Performance Issues And Tuning (Rows Per Batch, MICS, Etc.)
I built an SSIS package to pull a large amount of data (over 4 million records) from our legacy system into a SQL server for analysis.  As it stands the package takes something like 30+ hours to run.  I would really like to trim down the time it is taking.  The process runs monthly.
 
We're connecting to a Caché database via an ODBC connection and dumping to an OLE DB connection to the SQL server.  Is there something I can change in the rows per batch or maximum insert commit size to improve performance?  It looks like the ODBC connection returns chunks of about 3,000 rows at a time.
 
Is anyone familiar with Caché who might know where I can get the best-performing ODBC driver?
 
What about the table that I'm dumping the data to?  I assume no (or very few) indexes is the way to go for speed of dropping the data into the table.
 
I don't feel that 30 hours is an acceptable amount of time for this process to run.  Am I off base here?
 
Any tips would be appreciated.
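Some quick arithmetic helps frame the tuning problem: 4 million rows in roughly 30 hours is only about 37 rows per second, which points at a per-row or per-small-chunk bottleneck (for example the ~3,000-row ODBC fetches, or committing every small batch) rather than raw data volume.

```python
# Throughput implied by the figures in the post above.
rows = 4_000_000
hours = 30
rows_per_second = rows / (hours * 3600)
print(round(rows_per_second, 1))   # about 37 rows/sec
```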

Trigger : Insert Maximum Primary ID
Hi,

via VBScript, I am inserting data into one table as below:
Code:

conn.Source = "INSERT INTO img (imageDesc,imageName,imageDate,imageUser,imageIP) VALUES ('"& ni &"','"& fn &"','"& Now() &"',"& Session("MM_UserID") &",'"& Request.ServerVariables("REMOTE_HOST") &"')"

In another table, I want to insert the primary key (imageID) of this newly inserted row into a new table called "t_image_Site" along with another value in another column.

Any advice/tutorials... this can be done with a trigger if I'm not mistaken?

JJ

What Is The Maximum Columns In INSERT Statement ??
Hello friends,

I am sorry to be posting this message in this forum, but I thought someone might have an idea about my problem, which I illustrate below.

I want to know what is the maximum number of columns that an INSERT statement can handle?

I have come across a situation where i need to insert 15 values in a table.

INSERT INTO myTable (col1, col2, col3,....., col15) VALUES (v1, v2, v3,...., v15).

Actual statement:

INSERT INTO BillSell (bnumber, stock, qty, price, tprice, seen, bill_id, brokerage, service_tax, brok_st, stt, brok_st_stt, total_cost, total_rate_pu, check)
VALUES ('ICICI 06-12-2004 reshma', 'RELIND', 10, 541.7375183271872, 5417.375183271873, 'no', 2402, 40.2375, 4.92507, 45.162569999999995, 6.771718979089841, 51.934288979089835, 5365.0, 536.5, 'check')


When I call this in Java program, it asks me to check the syntax of the statement as follows :

com.mysql.jdbc.exceptions.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'check) VALUES ('ICICI 06-12-2004 reshma', 'RELIND', 10, 541.7375183271872, 5417.' at line 1

Please note that all the data types of the table columns and the values passed in the statement match.

Now the problem: when I remove the last column from the above INSERT statement (i.e. the "check" column and its corresponding value 'check'), the code works fine. I observe that there are then 14 columns in the statement.

So I need a solution for this; I cannot move further in the program.

Please help.
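Note that the error message breaks right at the column named check, and CHECK is a reserved word in MySQL, so the 15-vs-14 column count is probably a red herring. A hedged sketch of one workaround, building the statement with every identifier backtick-quoted so reserved words cannot confuse the parser (the quoting helper is illustrative, not part of any driver API; renaming the column is the other common fix):

```python
def quote_ident(name):
    # Escape any embedded backticks, then wrap the identifier in backticks.
    return "`" + name.replace("`", "``") + "`"

# Table and column names taken from the post above.
columns = ["bnumber", "stock", "qty", "price", "tprice", "seen", "bill_id",
           "brokerage", "service_tax", "brok_st", "stt", "brok_st_stt",
           "total_cost", "total_rate_pu", "check"]

# Build a parameterized INSERT; %s placeholders suit MySQL drivers.
sql = "INSERT INTO {} ({}) VALUES ({})".format(
    quote_ident("BillSell"),
    ", ".join(quote_ident(c) for c in columns),
    ", ".join(["%s"] * len(columns)),
)
print(sql)
```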

Trust in Technology mate .....'Not in Human Beings'

Batch Insert
I'm using ASP.NET to run a SELECT, and I want to insert all the results into a table that is stored locally.
I can put a stored procedure locally but can't put one on the other DB.

How would I achieve this batch insert? Is it possible?

thanks
Mark

Batch Insert (OLAP)
Hi, can anyone help me with a skeleton (sample) script for running BULK INSERT in batches? I need to do it in batches as the input data is huge.

The logic: I have to insert into the fact table through bcp, executing in batches of 50,000 records. If any batch fails, I need to identify it and rerun from that point onwards. This is an OLAP load.

Cannot Allocate More Connection.connect Pool Is At Maximum Increase Max Pool Size
Hi, I'm having this error in my application: "cannot allocate more connection. connect pool is at maximum increase max pool size". The problem is that when I do testing the error does not appear; it only appears when the application is being used by many people.
How can I resolve this?
Thanks
 

How To Speed Up Batch Insert MSSQL
Hi,
I am in the middle of writing a console application that acquires data from dynamic odbc connections and inserts the results into a MSSQL database. I'm currently using a datareader to generate multiple calls to a stored procedure and batch execute them by means of a simple counter; this works fine.
However, I'm a little concerned that it seems to take so long to execute the sql stored procs and was wondering if anyone may know of any methods to help speed it up either on the app side or sql, or both.
I had a quick word with our dba who spoke briefly about some kind of process where by the application fires the request across to the database and carries on leaving the db to queue the request or something to that effect. He ran away before I could get any sort of sense out of him.
 Any help greatly appreciated
 Thanks

Batch Insert Data Using Caching?
Hi all. I would like to collect data from many users in this way:

Click link button in website A --> redirect to website B to call ADO.NET to insert one row of data from the querystring --> auto redirect to website C

Using a traditional INSERT statement for each click works, but performance is poor when a lot of users are clicking the link in website A. So can I use caching to store the data during the cache duration and perform a batch insert after the cache duration expires?

Thanks

JDBC Batch Insert + GetGeneratedKeys
I'm using MS JDBC driver to connect to SQLServer 2005 and trying to perform batch insert. Here is the code i'm using:




Code Snippet

try {
    // Assume we have a valid con
    con.setAutoCommit(false);

    // this is the simplified SQL for the purposes of this example
    StringBuffer sql = new StringBuffer();
    sql.append("INSERT INTO temp_batchOpt (");
    sql.append("temp)");
    sql.append(" VALUES (?)");

    PreparedStatement insertStatement =
            con.prepareStatement(sql.toString(),
            SQLServerStatement.RETURN_GENERATED_KEYS);

    for (int i = 0; i < 10; i++) {
        insertStatement.setInt(1, 3);
        insertStatement.addBatch();
    }

    int[] r = insertStatement.executeBatch();

    // the correct number of insertions are made
    if (r != null)
        for (int x : r)
            System.out.println("inserted " + x);

    SQLServerResultSet keyRS = (SQLServerResultSet) insertStatement.getGeneratedKeys();
    while (keyRS.next()) {
        int id = keyRS.getInt(1);
        System.out.println("**filling id =" + id);
    }
    con.commit();
} catch (BatchUpdateException bue) {
    bue.printStackTrace();
} catch (SQLException sqle) {
    sqle.printStackTrace();
} finally {
    // close the connection...
}

I'm getting the following exception:




Code Snippet

inserted 1
inserted 1
inserted -2
inserted 1
inserted -2
inserted 1
inserted -2
inserted 1
inserted -2
inserted 1
com.microsoft.sqlserver.jdbc.SQLServerException: The statement must be run before the generated keys are available.
        at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(Unknown Source)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.getGeneratedKeys(Unknown Source)
        at com.x.U.m.Transformer.prefetchData(Transformer.java:285)
        at com.x.U.m.Transformer.init(Transformer.java:51)
        at com.x.U.m.XMLParse.main(XMLParse.java:215)

 

I can get getGeneratedKeys() to work fine with a single update (not using batch update). Is batch + getGeneratedKeys() not supported?

TIA

Maximum Size For &"ntext&" Type In Sql Server
I couldn't insert more than 4000 characters into an "ntext" column in SQL Server. Is there any way I can increase the size? Any suggestions?

The Database File Is Larger Than The Configured Maximum Database Size.
I'm getting this error while trying to insert records into a SQL Server Compact Edition database. I have pasted my connection string that was used when creating the database as well as for accessing that same database from my Windows application.

Thanks for any help any of you can give!

Data Source=OnTheGo.sdf;Encrypt Database=True;Password=<password>;Max Database Size=4091

Using SSIS 2005 To Strip Out Bad Rows In Excel And Then Insert Detailed Rows Into OLE DB Data Source
Environment:
 
Running this code on my PC via VS 2005
.Net version 2.0.50727 on the server (shown in IIS)
Code is in ASP.NET 2.0 and is a VB.NET Console application
SSIS 2005
 
Problem & Info:
 
I am bringing in an Excel file.  I need to first strip out any non-detail rows, such as the breaks you see with totals and whatnot.  I should in the end have only detail rows left before I start moving them into my SQL table.  I'm not sure how to strip this information out in SSIS, specifically which component to use and how to code it to do this, based on my Excel file here: http://www.webfound.net/excelfile.xls

Then, I assume I just use a Flat File Source component or something to take the columns in the Excel file and push each one into a corresponding column in my SQL Server table via an OLE DB destination.  I have used a Flat File Source in the past to do this with a comma-delimited txt file, but never tried it with an Excel file.
 
Desired Help:

 
How to perform
 
1)       stripping out all undesired rows
2)       importing each column into sql table

MS SQL Server DataDirect Problem FOr Batch Insert On SOLARIS
The batch statement issued in C code using SQLExecDirect is:
INSERT INTO DOCUMENT (RPT_VERSION_ID, RPT_AND_PAGE, RPT_ID, RPT_VERSION_CMP, STARTING_PAGE_NUM, STARTING_PAGE_RBA, PAGE_COUNT, SECTION_ID ) VALUES (?,?,?,?,?,?,?,?)


While getting trace from SQL Profiler we have found that in windows
Batch insert statement is converted fine as

INSERT INTO DOCUMENT (RPT_VERSION_ID, RPT_AND_PAGE, RPT_ID, RPT_VERSION_CMP, STARTING_PAGE_NUM, STARTING_PAGE_RBA, PAGE_COUNT, SECTION_ID ) VALUES (758477476,1252817530000000001,'AC001','7995988683 8883',1,58,8,'SALE') INSERT INTO DOCUMENT (RPT_VERSION_ID, RPT_AND_PAGE, RPT_ID, RPT_VERSION_CMP, STARTING_PAGE_NUM, STARTING_PAGE_RBA, PAGE_COUNT, SECTION_ID ) VALUES (758477476,1252817530000000009,'AC001','7995988683 8883',9,2360,6,'MANU')

but on SOLARIS SQL profiler shows the above statement as

INSERT INTO DOCUMENT (RPT_VERSION_ID, RPT_AND_PAGE, RPT_ID, RPT_VERSION_CMP, STARTING_PAGE_NUM, STARTING_PAGE_RBA, PAGE_COUNT, SECTION_ID ) VALUES (758474060,1252817530000000001,'AC001','7995988684 8579',1,58,8,'SALE'⤀   䤀䔀 䤀 䐀䌀e䴀䔀 ⠀t䔀䤀䤀䐀Ⰰ t䄀䐀t䄀䜀䔀Ⰰ t䤀䐀Ⰰ t䔀䤀䌀䴀tⰀ 䄀䤀䜀t䄀䜀䔀e 䴀Ⰰ 䄀䤀䜀t䄀䜀䔀䈀䄀Ⰰ t䄀䜀䔀䌀eⰀ 䔀䌀䤀䤀䐀 ⤀ 䄀䰀e䔀 ⠀758474060,1252 817530000000009,'AC001','79959886848579',9,2360,6, 'MANU'⤀ 



It seems to be a driver related problem (for Batch Inserts) and we are looking into it .




Anyone with a solution?

ODBC Batch Insert Error On Windows 64 Bit Machine.
I'm facing a SQL batch insert issue on the Windows 64-bit (IA-64) platform. I'm using 'Column-Wise Binding' of parameters. The example program hosted on the Microsoft site (http://msdn2.microsoft.com/en-us/library/ms709287.aspx) returns the error SQL_NEED_DATA for batch inserts on the Windows 64-bit platform. The same code works fine on the 32-bit platform. I'm using SQL Server 2000 as the database and have tried both the native SQL Server ODBC driver and the Merant 5.2 ODBC drivers. Any help on this will be greatly appreciated.

Problem With Getdate() In Transaction Takes The Insert Time Instead Of The Commit Time
Hi,
 
We need to select rows from the database that have been recently inserted/updated.  We have a main primary table (COMMIT_TEST) and a second update table (COMMIT_TEST_UPDATE). The update table contains the primary key and a LAST_UPDATE field which is a datetime (to tell us when an update occurred).  Triggers on the primary table are used to populate the update table.
 
If we insert or update the primary table in a transaction, we would expect that the datetime of the insert/update would be at the commit, however it seems that the insert/update statement is cached and getdate() is executed at the time of the cache instead of the commit.  This causes problems as we select rows based on LAST_UPDATE and a commit may occur later but the earlier insert timestamp is saved to the database and we miss that update.
 
We would like to know if there is anyway to tell the SQL Server to not execute the function getdate() until the commit, or any other way to get the commit to create the correct timestamp.
 
We are using default isolation level.  We have tried using getdate(), current_timestamp and even {fn Now()} with the same results.  SQL Queries that reproduce the problem are provided below:
 

/* Different functions to get the current timestamp - all have been tested to produce the same results */
/*
SELECT GETDATE()
GO
SELECT CURRENT_TIMESTAMP
GO
SELECT {fn Now()}
GO
*/
/* Use these statements to delete the tables to allow recreate of the tables */
/*
DROP TABLE COMMIT_TEST
DROP TABLE COMMIT_TEST_UPDATE
*/
/* Create a primary table and an UPDATE table to store the date/time when the primary table is modified */
CREATE TABLE dbo.COMMIT_TEST (PKEY int PRIMARY KEY, timestamp) /* ROW_VERSION rowversion */
GO
CREATE TABLE dbo.COMMIT_TEST_UPDATE (PKEY int PRIMARY KEY, LAST_UPDATE datetime, timestamp ) /* ROW_VERSION rowversion */
GO
/* Use these statements to delete the triggers to allow reinsert */
/*
drop trigger LOG_COMMIT_TEST_INSERT
drop trigger LOG_COMMIT_TEST_UPDATE
drop trigger LOG_COMMIT_TEST_DELETE
*/
/* Create insert, update and delete triggers */
create trigger LOG_COMMIT_TEST_INSERT on COMMIT_TEST for INSERT as
begin
        declare @time datetime
        select  @time = getdate()

        insert into COMMIT_TEST_UPDATE (PKEY, LAST_UPDATE)
        select PKEY, @time
        from inserted
end
GO
create trigger LOG_COMMIT_TEST_UPDATE on COMMIT_TEST for UPDATE as
begin
        declare @time datetime
        select  @time = getdate()

    update COMMIT_TEST_UPDATE
    set LAST_UPDATE = @time
    from COMMIT_TEST_UPDATE, deleted, inserted
    where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
GO
/* In our application deletes should never occur, so we don't log them - we just remove the row from the UPDATE table */
create trigger LOG_COMMIT_TEST_DELETE on COMMIT_TEST for DELETE as
begin
 if  ( select count(*) from deleted ) > 0
 begin
     delete COMMIT_TEST_UPDATE
  from COMMIT_TEST_UPDATE, deleted
     where  COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
 end
end
GO
/* Delete any previous inserted record to avoid errors when inserting */
DELETE COMMIT_TEST WHERE PKEY = 1
GO
/* What is the current date/time */
SELECT GETDATE()
GO
BEGIN TRANSACTION
GO
/* Insert a record into the primary table */
INSERT COMMIT_TEST (PKEY) VALUES (1)
GO
/* Simulate additional processing within this transaction */
WAITFOR DELAY '00:00:10'
GO
/* We expect at this point that the date is written to the database (or at least we need some way for this to happen) */
COMMIT TRANSACTION
GO
/* get the current date to show us what date/time should have been committed to the database */
SELECT GETDATE()
GO
/* Select results from the table - we see that the timestamp is 10 seconds older than the commit; in other words it was evaluated at */
/* the insert statement, even though the row could not be read with a SELECT while it was uncommitted */
SELECT * FROM COMMIT_TEST
GO
SELECT * FROM COMMIT_TEST_UPDATE
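For the record, one direction not raised in the thread itself: getdate() cannot be deferred to commit time, but the rowversion columns the script already reserves suggest a different watermark. A minimal sketch, assuming SQL Server 2005 or later (for MIN_ACTIVE_ROWVERSION()), a named ROW_VERSION rowversion column on COMMIT_TEST_UPDATE, and a hypothetical one-row WATERMARK table:

```sql
/* Assumes: SQL Server 2005+, COMMIT_TEST_UPDATE has a ROW_VERSION rowversion
   column, and a hypothetical one-row table WATERMARK(LAST_SEEN binary(8)). */
DECLARE @last binary(8), @safe binary(8)

SELECT @last = LAST_SEEN FROM WATERMARK

/* Lowest rowversion that could still belong to an open transaction;
   everything below it is guaranteed to be committed. */
SELECT @safe = MIN_ACTIVE_ROWVERSION()

/* Pick up only changes that are certain to be committed. */
SELECT PKEY, LAST_UPDATE
FROM   COMMIT_TEST_UPDATE
WHERE  ROW_VERSION >= @last
AND    ROW_VERSION <  @safe

/* Remember where to resume next time. */
UPDATE WATERMARK SET LAST_SEEN = @safe
```

Because the reader never looks past MIN_ACTIVE_ROWVERSION(), a row modified inside a long-running transaction is simply picked up on a later pass, after its commit, instead of being missed.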


Any help would be appreciated. We understand we could change the application/database to approximate what we need, but all the solutions we have identified suffer from possible performance issues, or could still lead to missed deals (assuming the commit time is longer than some artificial time window).
 
Regards,
 
Mark

How To Write Query To Insert 10,000 Rows In A Table Using Insert Statement One Time
I want to create a table and insert 10,000 rows of values at a time using a single insert statement.

ex: I want to create a table employee having two columns,
like employeeid, name

note: name can be static, i.e. the same name for all 10,000 rows

employeeid is unique

Can anyone write a query for me?
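Not from the thread, but one possible shape for such a query - a sketch assuming SQL Server 2008 or later (for the VALUES row constructor); cross-joining a ten-digit set four times yields exactly 10,000 unique ids in one statement:

```sql
CREATE TABLE employee (employeeid int PRIMARY KEY, name varchar(50))
GO
/* 10 x 10 x 10 x 10 = 10,000 rows from a single INSERT ... SELECT */
;WITH digits AS (
    SELECT d FROM (VALUES (0),(1),(2),(3),(4),(5),(6),(7),(8),(9)) AS t(d)
)
INSERT INTO employee (employeeid, name)
SELECT ones.d + tens.d * 10 + hund.d * 100 + thou.d * 1000 + 1,  -- 1..10000
       'John Doe'                                                -- static name
FROM digits ones
CROSS JOIN digits tens
CROSS JOIN digits hund
CROSS JOIN digits thou
```

On earlier versions the same idea works with a UNION ALL of ten SELECTs in place of the VALUES constructor.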

Insert Rows With SQLServerce V3.5 Is Very Slow Can Anyone Help Insert Performance Poor
 

Hi All
 
I decided to change over from Microsoft Access Database file to the New SQLServerCe Compact edition. Although the reading of data from the database is greatly improved, the inserting of the new rows is extremely slow.
 
I was getting between 60 to 70 rows per sec using OLEDB and an Access Database but now only getting 14 to 27 rows per sec using SQLServerCe.
 
I have tried the code changes below and nothing seems to increase the speed. Any help would be appreciated, as I would prefer to use SQLServerCe: the database file is much smaller and I'm used to SQL commands.
 
Details:
VB2008 Pro
.NET Frameworks 2.0
SQL Compact Edition V3.5
Encryption = Engine Default
Database Size = 128Mb (But needs to be changed to 999Mb)
 
Where Backup_On_Next_Run, OverWriteQuick, CompressAns are Booleans; all other columns are nvarchar, size 10 to 30, except for Full Folder Address, which is size 260
 
TRY1
 
Direct Insert Using Data Adapter.
 
Me.BackupDatabaseTableAdapter1.Insert(Group_Name1, Full_Folder_Address1, File1, File_Size_KB1, Schedule_To_Run1, Backup_Time1, Last_Run1, Result1, Last_Modfied1, Last_Modfied1, Backup_On_Next_Run1, Total_Backup_Times1, Server_File_Number1, Server_Number1, File_Break_Down1, No_Of_Servers1, Full_File_Address1, OverWriteQuick, CompressAns)
 
14 to 20 rows per second (Was 60 to 70 when using OLEDB Access)
 
 
TRY 2
 
Using Record Sets
 
Private Sub InsertRecordsIntoSQLServerce(ByVal Group_Name1 As String, ByVal Full_Folder_Address1 As String, ByVal File1 As String, ByVal File_Size_KB1 As String, ByVal Schedule_To_Run1 As String, ByVal Backup_Time1 As String, ByVal Last_Run1 As String, ByVal Result1 As String, ByVal Last_Modfied1 As String, ByVal Latest_Modfied1 As String, ByVal Backup_On_Next_Run1 As Boolean, ByVal Total_Backup_Times1 As String, ByVal Server_File_Number1 As String, ByVal Server_Number1 As String, ByVal File_Break_Down1 As String, ByVal No_Of_Servers1 As String, ByVal Full_File_Address1 As String, ByVal OverWriteQuick As Boolean, ByVal CompressAns As Boolean)
 
        Dim conn As SqlCeConnection = Nothing
        Dim CommandText1 As String = "INSERT INTO BackupDatabase (Group_Name, Full_Full_Folder_Adress, File1,File_Size_KB, Schedule_To_Run, Backup_Time, Last_Run, Result, Last_Modfied, Latest_Modfied, Backup_On_Next_Run, Total_Backup_Times, Server_File_Number, Server_Number, File_Break_Down, No_Of_Servers, Full_File_Address, OverWrite, Compressed) VALUES ('" & Group_Name1 & "', '" & Full_Folder_Address1 & "', '" & File1 & "', '" & File_Size_KB1 & "', '" & Schedule_To_Run1 & "', '" & Backup_Time1 & "', '" & Last_Run1 & "', '" & Result1 & "', '" & Last_Modfied1 & "', '" & Latest_Modfied1 & "', '" & CStr(Backup_On_Next_Run1) & "', '" & Total_Backup_Times1 & "', '" & Server_File_Number1 & "', '" & Server_Number1 & "', '" & File_Break_Down1 & "', '" & No_Of_Servers1 & "', '" & Full_File_Address1 & "', '" & CStr(OverWriteQuick) & "', '" & CStr(CompressAns) & "')"
        Try
            conn = New SqlCeConnection(strConn)
            conn.Open()
 
            Dim cmd As SqlCeCommand = conn.CreateCommand()
 
            cmd.CommandText = "SELECT * FROM BackupDatabase"
            cmd.ExecuteNonQuery()
            Dim rs As SqlCeResultSet = cmd.ExecuteResultSet(ResultSetOptions.Updatable Or ResultSetOptions.Scrollable)
 
            Dim rec As SqlCeUpdatableRecord = rs.CreateRecord()
 
            rec.SetString(1, Group_Name1)
            rec.SetString(2, Full_Folder_Address1)
            rec.SetString(3, File1)
            rec.SetSqlString(4, File_Size_KB1)
            rec.SetSqlString(5, Schedule_To_Run1)
            rec.SetSqlString(6, Backup_Time1)
            rec.SetSqlString(7, Last_Run1)
            rec.SetSqlString(8, Result1)
            rec.SetSqlString(9, Last_Modfied1)
            rec.SetSqlString(10, Latest_Modfied1)
            rec.SetSqlBoolean(11, Backup_On_Next_Run1)
            rec.SetSqlString(12, Total_Backup_Times1)
            rec.SetSqlString(13, Server_File_Number1)
            rec.SetSqlString(14, Server_Number1)
            rec.SetSqlString(15, File_Break_Down1)
            rec.SetSqlString(16, No_Of_Servers1)
            rec.SetSqlString(17, Full_File_Address1)
            rec.SetSqlBoolean(18, OverWriteQuick)
            rec.SetSqlBoolean(19, CompressAns)
            rs.Insert(rec)
        Catch e As Exception
            MessageBox.Show(e.Message)
        Finally
            conn.Close()
        End Try
    End Sub
 
' 20 to 24 rows per sec
 
TRY 3
 
Using SQL Commands Direct
 
Private Sub InsertRecordsIntoSQLServerce(ByVal Group_Name1 As String, ByVal Full_Folder_Address1 As String, ByVal File1 As String, ByVal File_Size_KB1 As String, ByVal Schedule_To_Run1 As String, ByVal Backup_Time1 As String, ByVal Last_Run1 As String, ByVal Result1 As String, ByVal Last_Modfied1 As String, ByVal Latest_Modfied1 As String, ByVal Backup_On_Next_Run1 As Boolean, ByVal Total_Backup_Times1 As String, ByVal Server_File_Number1 As String, ByVal Server_Number1 As String, ByVal File_Break_Down1 As String, ByVal No_Of_Servers1 As String, ByVal Full_File_Address1 As String, ByVal OverWriteQuick As Boolean, ByVal CompressAns As Boolean)
 
        Dim conn As SqlCeConnection = Nothing
        Dim CommandText1 As String = "INSERT INTO BackupDatabase (Group_Name, Full_Full_Folder_Adress, File1,File_Size_KB, Schedule_To_Run, Backup_Time, Last_Run, Result, Last_Modfied, Latest_Modfied, Backup_On_Next_Run, Total_Backup_Times, Server_File_Number, Server_Number, File_Break_Down, No_Of_Servers, Full_File_Address, OverWrite, Compressed) VALUES ('" & Group_Name1 & "', '" & Full_Folder_Address1 & "', '" & File1 & "', '" & File_Size_KB1 & "', '" & Schedule_To_Run1 & "', '" & Backup_Time1 & "', '" & Last_Run1 & "', '" & Result1 & "', '" & Last_Modfied1 & "', '" & Latest_Modfied1 & "', '" & CStr(Backup_On_Next_Run1) & "', '" & Total_Backup_Times1 & "', '" & Server_File_Number1 & "', '" & Server_Number1 & "', '" & File_Break_Down1 & "', '" & No_Of_Servers1 & "', '" & Full_File_Address1 & "', '" & CStr(OverWriteQuick) & "', '" & CStr(CompressAns) & "')"
 
        Try
            conn = New SqlCeConnection(strConn)
            conn.Open()
 
            Dim cmd As SqlCeCommand = conn.CreateCommand()
            cmd.CommandText = CommandText1
            'cmd.CommandText = "INSERT INTO BackupDatabase (…"
            cmd.ExecuteNonQuery()
           
        Catch e As Exception
            MessageBox.Show(e.Message)
        Finally
            conn.Close()
        End Try
    End Sub
 
' 25 to 30, but mostly holds around 27 rows per sec
 
Is this the best I can get, or is there a better way? Any help would be greatly appreciated.
 
Kind Regards
 
John Galvin
 

Force An INSERT INTO To Insert Rows Orderly
Hi All,

From what I know, when you execute an
INSERT INTO table1 (field0)
SELECT field0 FROM table 2 ORDER BY field0
the rows are inserted in whatever order the server engine thinks is best at that moment. Is there any way I can have field0 in table1 inserted in the order I need?

My problem is that I cannot touch table1; it is used by a legacy application which I am not allowed to modify. I was thinking about creating a clustered index, adding an id field, etc., but I cannot know how this will affect the front-end application.

table1 and table2 have only a few hundred records.

table1 =
CREATE TABLE [FinalPlasma] ([name] [varchar] (2000) NULL )

table2 =
CREATE TABLE [Test_FinalPlasma] ([nameID] [int] NOT NULL ,
[name] [varchar] (2000) NULL
)
The update that is not giving the "good" order =

DELETE FROM FinalPlasma
INSERT INTO FinalPlasma SELECT name from dbo.Test_FinalPlasma

Unfortunately I can not order the [name] field after the update, because it looks something like

Donald McQ. Shaver
Mark Sheilds
R.J. Shirley
W.E. Sills
Kenneth A. Smee
A. Britton Smith
LCol Edward W. Smith
Harry V. Smith
M. E. Southern
Timothy A. Sparling
Spectrum Investment Management Limited
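A sketch of one direction, not from the thread itself: a heap has no defined read order, so "inserted order" cannot be relied on without an ordering column. The one documented hook is that IDENTITY values are assigned following the ORDER BY of the inserting SELECT. If FinalPlasma itself cannot change, a staging copy (the names here are hypothetical) can carry the order:

```sql
/* Hypothetical staging table; FinalPlasma itself stays untouched */
CREATE TABLE FinalPlasma_Ordered (
    rowid int IDENTITY(1,1) PRIMARY KEY,
    name  varchar(2000) NULL
)
GO
/* IDENTITY assignment follows the SELECT's ORDER BY */
INSERT INTO FinalPlasma_Ordered (name)
SELECT name FROM dbo.Test_FinalPlasma ORDER BY nameID
GO
/* Reads that need the original order go through rowid */
SELECT name FROM FinalPlasma_Ordered ORDER BY rowid
```

Whether the legacy application can be pointed at a view over the staging table is a separate question, but this at least preserves the order somewhere queryable.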


Thank you,

Use Cursor To Update Rows In Batches Of A Given Size
I'd like to use a cursor to update a large number of rows in a table, updating a batch of 1,000 records at a time. I tried updating in one transaction, but the log fills up. I'd like to update a batch, clear the log, then update another batch, repeating until done. I don't really want to increase the log size for this one-time job. Can anyone give me an example to follow?
Thanks
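A cursor is not actually required for this. A sketch of a batched loop, assuming SQL Server 2000-era syntax (SET ROWCOUNT) and hypothetical table/column names - each pass updates at most 1,000 rows and the log is cleared between batches:

```sql
SET ROWCOUNT 1000                      -- cap every statement at 1,000 rows
WHILE 1 = 1
BEGIN
    UPDATE MyBigTable                  -- hypothetical table
    SET    SomeCol   = 'new value',
           Processed = 1               -- flag so rows are not picked up again
    WHERE  Processed = 0

    IF @@ROWCOUNT = 0 BREAK            -- nothing left to update

    BACKUP LOG MyDb WITH TRUNCATE_ONLY -- clear the log between batches
                                       -- (or CHECKPOINT under simple recovery)
END
SET ROWCOUNT 0                         -- restore normal behaviour
```

On SQL Server 2005 and later, UPDATE TOP (1000) is the preferred form, since SET ROWCOUNT for data modification is deprecated there.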
Thanks

COMMIT TRAN Does Not Commit
Microsoft SQL Server Management Studio Express

select @@trancount
begin tran
select @@trancount
use ProdNetPerfMon
select @@trancount
update Nodes set Caption = 'xxxx' where Vendor = 'yyyy'
select @@trancount
commit tran
select @@trancount

Executes as expected including trancount without errors.
Nodes.Caption is updated but reverts after a few minutes.

sa privileges

What am I missing?

...Contains More Than The Maximum Number Of Prefixes. The Maximum Is 3.
SQL Server 2000 8.00.760 (SP3)

I've been working on a test system and the following UDF worked fine. It runs in the "current" database, and references another database on the same server called 127-SuperQuote.

CREATE FUNCTION fnGetFormattedAddress(@WorkID int)
RETURNS varchar(130)
AS
BEGIN
    DECLARE @Address1 As varchar(50),
            @ReturnAddress As varchar(130)

    SELECT @Address1 = [127-SuperQuote].dbo.tblCompany.Address1
    FROM [Work]
    INNER JOIN [127-SuperQuote].dbo.tblCompany
        ON [Work].ClientID = [127-SuperQuote].dbo.tblCompany.CompanyID
    WHERE [Work].WorkID = @WorkID

    IF @Address1 IS NOT NULL
        SET @ReturnAddress = @ReturnAddress + @Address1 + CHAR(13) + CHAR(10)

    RETURN @ReturnAddress
END

So now the system has gone live, and it turns out that the live "SuperQuote" database is on a different server.

I've linked the server and changed the function as below, but I get an error both in QA and when checking syntax in the UDF builder:

The number name 'Zen.SuperQuote.dbo.tblCompany' contains more than the maximum number of prefixes. The maximum is 3.

CREATE FUNCTION fnGetFormattedAddress(@WorkID int)
RETURNS varchar(130)
AS
BEGIN
    DECLARE @Address1 As varchar(50),
            @ReturnAddress As varchar(130)

    SELECT @Address1 = Zen.SuperQuote.dbo.tblCompany.Address1
    FROM [Work]
    INNER JOIN Zen.SuperQuote.dbo.tblCompany
        ON [Work].ClientID = Zen.SuperQuote.dbo.tblCompany.CompanyID
    WHERE [Work].WorkID = @WorkID

    IF @Address1 IS NOT NULL
        SET @ReturnAddress = @ReturnAddress + @Address1 + CHAR(13) + CHAR(10)

    RETURN @ReturnAddress
END

How can I get round this? By the way, I've rather simplified the function to ease readability. Also, I haven't posted any DDL because I don't think that's the problem!

Thanks
Edward
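The usual workaround for this (a sketch, not from this thread): a SQL Server 2000 UDF cannot use a four-part linked-server name, but it can reference a local view that hides the extra prefix. Assuming the linked server Zen is already configured, and with a hypothetical view name:

```sql
/* Local view wrapping the remote table; the UDF then sees a short name */
CREATE VIEW dbo.vwSuperQuoteCompany
AS
SELECT CompanyID, Address1
FROM   Zen.SuperQuote.dbo.tblCompany
GO

/* Inside fnGetFormattedAddress, join against the view instead:

   SELECT @Address1 = v.Address1
   FROM   [Work]
   INNER JOIN dbo.vwSuperQuoteCompany v
       ON [Work].ClientID = v.CompanyID
   WHERE  [Work].WorkID = @WorkID
*/
```

The view keeps the four-part name out of the function body, so the prefix limit no longer applies; the trade-off is that every call now crosses the linked server, which may be slow inside a function called per row.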


Copyright © 2005-08 www.BigResource.com, All rights reserved