How To Do Bulk Update / Insert In A Table With Conditions
Oct 30, 2015
I have three tables: Student, Daily_Attendance_Master and Daily_Attendence_Details.
I want to run SQL that inserts or updates student attendance (absent or present) in Daily_Attendence_Details based on Daily_Attendance_Master_Id and Student_Id, from one roll number to another.
If both are present in Daily_Attendence_Details, I want to update the attendance from one roll number to another in Daily_Attendence_Details on the basis of Daily_Attendence_Details_Id,
and if either one is not present, I want to insert the student attendance from one roll number to another into Daily_Attendence_Details.
I give below the structure of the three tables Student, Daily_Attendance_Master and Daily_Attendence_Details.
I am currently working with C# and SQL Server 2012. My requirement is to bulk fetch the records and insert/update them in the other table with some business logic. How do I do this?
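A minimal sketch of the usual insert-or-update approach: a single MERGE statement (available in SQL Server 2012). Since the table structures are not reproduced above, the Roll_Number and Attendance_Status columns below are assumptions; adjust to the real schema.
Code:
-- Hypothetical names throughout; Attendance_Status stands in for the absent/present flag.
DECLARE @master_id int, @from_roll int, @to_roll int, @status char(1);
SELECT @master_id = 1, @from_roll = 1, @to_roll = 50, @status = 'P';

MERGE Daily_Attendence_Details AS tgt
USING (SELECT Student_Id FROM Student
       WHERE Roll_Number BETWEEN @from_roll AND @to_roll) AS src
ON  tgt.Student_Id = src.Student_Id
AND tgt.Daily_Attendance_Master_Id = @master_id
WHEN MATCHED THEN
    UPDATE SET tgt.Attendance_Status = @status
WHEN NOT MATCHED THEN
    INSERT (Daily_Attendance_Master_Id, Student_Id, Attendance_Status)
    VALUES (@master_id, src.Student_Id, @status);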
I have to update a field within a table of 60 records or so. Each record has a different field value. It's type varchar. I was given an Excel file with the field values and was thinking of a bulk update along the lines of BULK INSERT, but I don't recall that being possible.
Is the only way to create a table, bulk insert into it, then merge the two tables together with UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
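For what it's worth, the staging-table route is about as simple as it gets for 60 rows. A sketch with hypothetical table, column and file names (save the Excel sheet as CSV first):
Code:
CREATE TABLE #staging (KeyCol int, NewValue varchar(100));

BULK INSERT #staging
FROM 'C:\data\values.csv'   -- assumed path for the exported Excel data
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- One set-based UPDATE applies all 60 values.
UPDATE t
SET    t.SomeField = s.NewValue
FROM   dbo.TargetTable AS t
INNER JOIN #staging AS s ON s.KeyCol = t.KeyCol;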
I have a web form with a text field that needs to take in as much as the user decides to type and insert it into an nvarchar(max) field in the database behind. I've tried using the new .write() method in my update statement, but it cuts off the text after a while. Is there a way to insert/update this in SQL 2005 without resorting to Bulk Insert? It bloats the transaction log, and turning the logging off requires a call to sp_dboptions (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
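For reference, a minimal .WRITE sketch (table and column names hypothetical): with a NULL offset the chunk is appended rather than rewriting the whole value. The truncation described is more often caused by the variable or parameter being declared nvarchar(4000)/varchar(8000) instead of nvarchar(max) than by .WRITE itself.
Code:
DECLARE @chunk nvarchar(max);   -- nvarchar(max), not nvarchar(4000), or the text gets cut off
SET @chunk = N'...next block of user text...';

UPDATE dbo.Documents
SET    Body.WRITE(@chunk, NULL, NULL)   -- NULL offset appends to the existing value
WHERE  DocumentId = 42;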
I am trying to do a simple update of one SQL table from another where a certain condition in both tables is met. I am missing something basic and have tried numerous posted examples without success. Here is my code (which does not update):
INSERT INTO Users (userName, UserSalt, UserHash1, UserHash2, UT_memberID)
SELECT memberFirstName + '.' + memberLastName + '56' AS userName,
       '{AxxxxxDE-6xx6-4xxD-Bxx9-3xxxx79xxxxE}',
       '{4xxxxxx6-8xx5-6xxD-Cxx6-4xxxFxxx1xx9}',
       '{0xxx8xxE-Cxx4-6xx8-ExxB-Dxxxx4xxx2xC}',
       members.memberID
FROM members
INNER JOIN groupLeaders ON members.memberID = groupLeaders.memberID

SELECT @@Identity AS UserID
How can I modify the portion that is inserting the '56' at the end of each username to do the following:
1) check to see if the username already exists in the database (using a LIKE query)
2) if not, create the username as-is, without the number
3) if it already exists, get a count of records matching the search criteria, then make the new username = base + (count + 1).ToString();
Any thoughts... I am struggling to put these two pieces together.
Thanks,
Zoop
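A sketch of steps 1-3 for a single member, with hypothetical names and placeholder values; the count of existing LIKE matches drives the suffix.
Code:
DECLARE @base varchar(100), @existing int;
SET @base = 'john.smith';   -- memberFirstName + '.' + memberLastName

SELECT @existing = COUNT(*)
FROM   Users
WHERE  userName LIKE @base + '%';   -- step 1: does it already exist?

INSERT INTO Users (userName, UserSalt, UserHash1, UserHash2, UT_memberID)
VALUES (CASE WHEN @existing = 0
             THEN @base                                        -- step 2: use as-is
             ELSE @base + CAST(@existing + 1 AS varchar(10))   -- step 3: append count + 1
        END,
        '{salt}', '{hash1}', '{hash2}', 123);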
[EDIT - original post below this]
I have modified my method to make this a bit easier. I added a memberID field to my [Users] table so that I can update my [Members] table in a separate statement after the insert takes place.
I have the following query, and it completes successfully in Query Analyzer (though I haven't actually executed the SP, just testing the syntax...). Anyway, here is what I have:
Code:
INSERT INTO Users (userName, UserSalt, UserHash1, UserHash2, UT_memberID)
SELECT memberFirstName + '.' + memberLastName + '56' AS userName,
       '{AxxxxxDE-6xx6-4xxD-Bxx9-3xxxx79xxxxE}',
       '{4xxxxxx6-8xx5-6xxD-Cxx6-4xxxFxxx1xx9}',
       '{0xxx8xxE-Cxx4-6xx8-ExxB-Dxxxx4xxx2xC}',
       members.memberID
FROM members
INNER JOIN groupLeaders ON members.memberID = groupLeaders.memberID

SELECT @@Identity AS UserID
I am hoping this will create a user for all members whose 'memberID' can be found in the groupLeaders table... is this correct?
Also, notice the 56 being appended to the end of each username. I would like this to be a random number generated within a given range... can this be done? Any advice?
Thanks,
Zoop
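One caveat worth knowing: RAND() is evaluated once per statement, so every row would get the same number. A common per-row trick, sketched here for a range of 10-99:
Code:
-- NEWID() is evaluated per row; CHECKSUM turns it into an int we can bound with modulo.
SELECT memberFirstName + '.' + memberLastName
       + CAST(10 + ABS(CHECKSUM(NEWID())) % 90 AS varchar(10)) AS userName
FROM   members;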
[Original post below - provide more background]
I have three tables involved with this insert/update:
I want to insert into the [Users] table the memberFirstName.memberLastName + randomNum into the 'UserName' column from the [Members] table. Also, I want to make all passwords the same; in this case I know the Salt, Hash1 and Hash2 I will be using and would like to pass these in for the 'UserHash1' and 'UserHash2' fields.
Now, I only want to make this insert where the memberID is in the GroupLeaders table. And finally, I need to update my Members table with a UserID where the memberID matches the one used from the groupLeaders table.
Does anyone have any ideas on how I can accomplish this, even if it requires adding a temporary field to one of my tables? Here is what I have so far, but I am receiving errors and can't quite figure this one out. (BTW - I also don't know how to generate the random number and was using the literal 23 as a placeholder.) Thanks...
Code:
INSERT INTO Users (userName, UserSalt, UserHash1, UserHash2)
SELECT a.memberFirstName + '.' + a.memberLastName + '23' AS userName,
       '{AA99FCDE-6E06-437D-B9E9-3E3D27955C3E}',
       '{7xxxxxx2-4xx6-9xx1-7xx9-4x3xx4Axxx59}',
       '{0xx8xxE-Cxx4-6xxx-xxxx-Fxx3xxxx3xxF}',
       b.memberID AS newMemID
FROM members a, groupLeaders b
WHERE a.memberID = b.memberID

SELECT @@Identity AS UserID

Update Members Set UserID = Ident_Current('Users') where memberID = newMemID
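A hedged rewrite of the above: the SELECT list has one column too many for the four-column insert, and Ident_Current only ever returns a single value, not one per row. An OUTPUT clause (SQL Server 2005 and later) captures each generated UserID, assuming UserID is the identity column on Users and UT_memberID exists as in the edited post; the follow-up UPDATE then becomes set-based.
Code:
DECLARE @new TABLE (UserID int, memberID int);

INSERT INTO Users (userName, UserSalt, UserHash1, UserHash2, UT_memberID)
OUTPUT inserted.UserID, inserted.UT_memberID INTO @new (UserID, memberID)
SELECT a.memberFirstName + '.' + a.memberLastName + '23',   -- '23' = random-number placeholder
       '{salt}', '{hash1}', '{hash2}', a.memberID
FROM   members a
INNER JOIN groupLeaders b ON a.memberID = b.memberID;

-- One set-based update instead of Ident_Current per row.
UPDATE m
SET    m.UserID = n.UserID
FROM   Members m
INNER JOIN @new n ON n.memberID = m.memberID;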
On a daily basis I need to perform a bulk update. The table totals about 50,000 records, with approximately 5,000 changing (deletes, edits, and new records) per day. I'd like to push just the updates somehow, but VB is too slow and I haven't found a way to handle it in DTS. Not much experience with DTS.
I'm transferring between two SQL 2000 servers with a VB app sitting in the middle.
I have a fundamental problem with how CDC works for bulk updates. When a CDC-enabled table is updated for a single row, the CDC system tables record it as an update (operations 3 & 4), which is perfect and what it should be. No complaints! But when I do a bulk update in the same CDC-enabled tables, for the same columns, the CDC system tables record it as a delete and then an insert (operations 1 & 2). This is not correct, and this is my problem. We used triggers before CDC and did not face this problem; everything was fine with triggers other than performance. The way CDC handles the bulk update is a big problem for me, because based on the output of the CDC system tables we are doing some migration work to a legacy system.
It would be impossible for me to go and change my migration logic scripts, because we have hundreds of procedures in them. Is this a known problem with CDC? Is there any solution in CDC so that when a bulk update happens on a table, the CDC system tables record it as updates? I don't think CDC 'net changes' helps in this situation, because the net change would show as a single inserted row. If this can't be done with CDC then I have to completely abandon CDC and go back to triggers.
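For diagnosing it, a sketch that tallies how CDC recorded a batch of changes (the capture instance name dbo_MyTable is hypothetical); __$operation 1/2 are delete/insert pairs, while 3/4 are the before/after images of a true update.
Code:
DECLARE @from binary(10), @to binary(10);
SET @from = sys.fn_cdc_get_min_lsn('dbo_MyTable');
SET @to   = sys.fn_cdc_get_max_lsn();

SELECT __$operation, COUNT(*) AS rows_recorded
FROM   cdc.fn_cdc_get_all_changes_dbo_MyTable(@from, @to, N'all')
GROUP BY __$operation;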
I am getting the below error message while performing a bulk insert/update operation.
Could not allocate space for object 'dbo.pros_mas_det'.'PK__pros_mas__3213E83F22401542' in database 'admin_mbjobslive' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.
My current SQL Server version:
Microsoft SQL Server 2008 R2 (RTM) - 10.50.1600.1 (X64) Apr 2 2010 15:48:46 Copyright (c) Microsoft Corporation Express Edition with Advanced Services (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor)
My current database size has crossed the 10 GB size limit.
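That matches the edition cap: SQL Server 2008 R2 Express limits each database to 10 GB of data, so the PRIMARY filegroup cannot grow any further regardless of disk space. A quick check of where the files stand (run in the affected database):
Code:
SELECT name, type_desc,
       size * 8 / 1024 AS size_mb,          -- size is in 8 KB pages
       CASE max_size
            WHEN -1 THEN 'unlimited'
            ELSE CAST(max_size * 8 / 1024 AS varchar(20))
       END AS max_size_mb
FROM   sys.database_files;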
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert Task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want the identity column in the table too.
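One workaround that avoids format files entirely, sketched with hypothetical names: create a view that exposes every column except the identity, and point the bulk insert at the view; the identity values are then generated as rows arrive.
Code:
CREATE TABLE dbo.Import
(
    ImportId int IDENTITY(1,1) PRIMARY KEY,
    Col1 varchar(50),
    Col2 varchar(50)
);
GO
CREATE VIEW dbo.Import_Load
AS
SELECT Col1, Col2 FROM dbo.Import;   -- identity column deliberately omitted
GO
BULK INSERT dbo.Import_Load
FROM 'C:\data\import.txt'            -- assumed path to the tab-delimited file
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');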
Hi guys, consider this scenario. I have two tables. Table1, Users, with fields id, name, joindate, designation, status. Table2, People, with fields id, name, status. The Users table has data in it, say 100 records. I have to fill the People table from it, where id=id, name=name and status=status. Any way to do this? Regards, Naveen
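A sketch of the set-based copy; NOT EXISTS keeps re-runs from duplicating ids (if People.id is an identity column, wrap this in SET IDENTITY_INSERT).
Code:
INSERT INTO People (id, name, status)
SELECT u.id, u.name, u.status
FROM   Users u
WHERE  NOT EXISTS (SELECT 1 FROM People p WHERE p.id = u.id);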
I have a table that contains comma-delimited text, and I am trying to convert it into another table.
e.g. my target table looks like
Produce|Price|QuantityPerPrice
and my input table contains strings such as
"apples","7.5","10"
"pears","10","8"
"oranges","8","6"
Does anyone have any ideas on how to do this? I am after a solution that does them all at once: I am currently using charindex() to find each column, one at a time, but given the speed of BULK INSERT I would much rather do it as a table. The one solution I don't want to resort to is exporting the table with delimited strings to a data file, then BULK INSERTing it...
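Since each value carries its own quoting, the whole line can still be cut apart set-based: strip the outer quotes, then locate the two "," delimiters once per row with CROSS APPLY (SQL Server 2005+). A sketch, assuming a hypothetical Source table with the raw text in a column named line:
Code:
INSERT INTO Target (Produce, Price, QuantityPerPrice)
SELECT SUBSTRING(s, 1, p1 - 1),
       SUBSTRING(s, p1 + 3, p2 - p1 - 3),
       SUBSTRING(s, p2 + 3, LEN(s) - p2 - 2)
FROM  (SELECT SUBSTRING(line, 2, LEN(line) - 2) AS s   -- drop the outer quotes
       FROM Source) x
CROSS APPLY (SELECT CHARINDEX('","', s) AS p1) a       -- first delimiter
CROSS APPLY (SELECT CHARINDEX('","', s, p1 + 1) AS p2) b;  -- second delimiter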
This is in the context of an ETL process - loading large blocks of data.
I bulk insert a bunch of rows (could be millions, more likely tens of thousands) into a table, perform some queries, and then I need to append those rows to a second table and truncate the first table. From an efficiency standpoint, switching the load table into a partitioning scheme would be best, but I can't use partitioned tables for reasons not relevant here.
So, what's going to be the most efficient solution? I can easily do a simple INSERT INTO ... SELECT FROM to copy the rows, but that will be fully logged, and I'd really like a minimally logged solution. Looking at the docs for BULK INSERT/bulk copy, I can't see a way to copy data from one table to another, but I suspect I'm overlooking something. I could re-load the rows from the client using a second bulk copy, but that seems like a terrible waste (although the client is on the same box, and always will be, so it's not as bad as it might be).
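For what it's worth, there is no table-to-table bulk copy; but on SQL Server 2008 and later, an INSERT ... SELECT with a TABLOCK hint into a heap can be minimally logged under the SIMPLE or BULK_LOGGED recovery model, which is usually the answer here. A sketch with hypothetical names:
Code:
-- Requires SIMPLE or BULK_LOGGED recovery and a heap target (no indexes) for minimal logging.
INSERT INTO dbo.ArchiveTable WITH (TABLOCK)
SELECT *
FROM   dbo.LoadTable;

TRUNCATE TABLE dbo.LoadTable;   -- truncate only logs the page deallocations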
My current project is creating a social network for the university I work for. One of the features allows members of a group to send a message to all other group members. Currently, I run a foreach loop over each of the group members and run a separate INSERT statement to insert a message into my messages table. Once the group has several hundred members, everybody starts getting timeout errors.

What is the best way to do this? Here are two suggestions I've received. First, construct one SQL statement that contains multiple INSERT statements. It would be a large statement like:

INSERT into [messages] (from_user, to_user, subject, body) VALUES (@from_user, @to_user, @subject, @body);
INSERT into [messages] (from_user, to_user, subject, body) VALUES (@from_user2, @to_user2, @subject2, @body2);
INSERT into [messages] (from_user, to_user, subject, body) VALUES (@from_user3, @to_user3, @subject3, @body3);
etc...

Or, do the foreach loop in a stored procedure. I know the pros and cons of sprocs versus dynamic SQL is a sticky subject, and, personally, I'd prefer to keep my logic in the C# code-behind file. What is the best way to do this in an efficient manner? I'd be happy to share some code, if that would help. Thanks for your input!
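A third option, usually the fastest: if group membership already lives in a table, one INSERT ... SELECT writes all the rows in a single round trip. A sketch, with a hypothetical group_members table:
Code:
CREATE PROCEDURE dbo.SendGroupMessage
    @group_id  int,
    @from_user int,
    @subject   nvarchar(200),
    @body      nvarchar(max)
AS
-- One set-based INSERT fans the message out to every member of the group.
INSERT INTO [messages] (from_user, to_user, subject, body)
SELECT @from_user, gm.member_id, @subject, @body
FROM   group_members gm
WHERE  gm.group_id = @group_id;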
Here is an example of a bulk insert into a SQL Server table. From the application you pass an XML string to a stored procedure, and it inserts all the data into the table using that XML. Example SP:
CREATE PROCEDURE StoredProcName ( @strXML varchar(8000) )
AS
Declare @intPointer int
exec sp_xml_preparedocument @intPointer output, @strXML
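A fuller sketch of the same pattern, with a hypothetical target table and XML shape, including the OPENXML read and the cleanup call that releases the document handle:
Code:
CREATE PROCEDURE BulkInsertFromXml ( @strXML varchar(8000) )
AS
DECLARE @intPointer int
EXEC sp_xml_preparedocument @intPointer OUTPUT, @strXML

INSERT INTO SomeTable (Col1, Col2)
SELECT Col1, Col2
FROM   OPENXML(@intPointer, '/rows/row', 2)   -- 2 = element-centric mapping
WITH   (Col1 varchar(50), Col2 int)

EXEC sp_xml_removedocument @intPointer        -- free the parsed document

Called as, for example: EXEC BulkInsertFromXml '<rows><row><Col1>abc</Col1><Col2>1</Col2></row></rows>'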
I used BULK INSERT to insert a txt file into a table. It works fine (see code below). Now I have a txt file with column names in the first row and about 200 columns, and no table created beforehand. How do I write code to create a destination table based on the first row of the txt file, so that BULK INSERT will work for that file?
BULK INSERT #MBRACCT
FROM 'c:\order.TXT'
WITH ( FIELDTERMINATOR = '|', FIRSTROW = 2, ROWTERMINATOR = '\n' )
I have a file I'm trying to do some non-set-based processing with. In order to make sure I keep the order of the results, I want to BULK INSERT into a temp table with an identity column. The spec says that you should be able to use either KEEPIDENTITY or KEEPNULLS, but I can't get it to work. For once, I have full code - just add any file of your choice that doesn't have commas/tabs. :) Any suggestions, folks?

create table ##Holding_Tank ( full_record varchar(500)) -- this works
create table ##Holding_Tank (id int identity(1,1) primary key, full_record varchar(500)) -- that doesn't work

BULK INSERT ##Holding_Tank
FROM "d:\telnet_scripts\psaxresult.txt"
WITH (TABLOCK, KEEPIDENTITY, KEEPNULLS, MAXERRORS = 0)

select * from ##Holding_tank
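KEEPIDENTITY only helps when the file itself contains identity values. When it doesn't, the usual fix is a format file that maps the single file field to the second table column, so the identity column is skipped and populated automatically. A hypothetical non-XML format file (holding_tank.fmt) and the matching load:
Code:
9.0
1
1   SQLCHAR   0   500   "\r\n"   2   full_record   ""

BULK INSERT ##Holding_Tank
FROM 'd:\telnet_scripts\psaxresult.txt'
WITH (TABLOCK, FORMATFILE = 'd:\telnet_scripts\holding_tank.fmt', MAXERRORS = 0)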
I am trying my first bulk update into an existing SQL table from a CSV text file. The text file's naming is exactly the same as the SQL table, with the same attributes.
The statement:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:\Baan\jedox_daily\jdcom4401.txt'
WITH
[code]....
The error message is: Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 3 (BP_Country).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I have checked and re-checked the BP_Country field (the first field after the key) and I am not seeing any mismatches.
I have a bulk insert situation that would be nice to be able to pull off. I have a flat file with 46 columns that are to go into a table. I want the table to have a 47th column, to be updated later on by a stored proc saying whether the import into the system was successful or not. I have the row terminator set as '"', thinking that would tell SQL to begin on the next row, leaving the importstatus column null, but I still receive an error.
First of all, is this idea possible within this insert statement? Secondly, if so, what would be the syntax to tell the insert statement to skip that particular column? It is the last column listed in the table, so it just needs to start on the next row after it inserts the last bit of data in the flat file.
If this is not possible, is it possible to bulk insert into a temp table?
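Yes - bulk inserting into a local temp table works as long as the load and the follow-on insert run in the same session, and it sidesteps the 47th-column problem entirely: stage the 46 file columns, then copy across and let importstatus default to NULL. A sketch with hypothetical names:
Code:
CREATE TABLE #staging (Col1 varchar(50), Col2 varchar(50) /* ... through Col46 */);

BULK INSERT #staging
FROM 'C:\data\feed.txt'   -- assumed path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

INSERT INTO dbo.Target (Col1, Col2 /* ... Col46 */)   -- importstatus is left NULL
SELECT Col1, Col2 /* ... Col46 */
FROM   #staging;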
I saved the result into a CSV file and then truncated the table. Now I am trying to bulk insert the data back into the table, so I used:
bulk insert rdb.dbo.scd_event_tab
from 'C:\users\sluintel.ctr\desktop\eventtab.csv'
with
(
  codepage = 'RAW',
  datafiletype = 'native',
  fieldterminator = ' ',
  keepidentity,
  keepnulls
);
go
However, I get this error:
Msg 4867, Level 16, State 1, Line 1
Bulk load data conversion error (overflow) for row 1, column 1 (JOB_ID).
Msg 4866, Level 16, State 5, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 3. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I am running a set of SQL statements on a SQL Server to insert flat file data into a SQL table. The flat file has already been FTP'ed to the SQL Server. I seem to be getting an error which possibly points to a permissions issue.
The statement:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:\jedox_daily\jdcom4401.txt'
WITH ( FIRSTROW = 2, MAXERRORS = 0, FIELDTERMINATOR = '|', ROWTERMINATOR = '\n' )
GO
The error is: Msg 4861, Level 16, State 1, Line 1. Cannot bulk load because the file "c:\jedox_daily\jdcom4401.txt" could not be opened. Operating system error code 3 (failed to retrieve text for this error. Reason: 1815)
If it is a permissions issue, how do I overcome this?
Hi all,
We have an application through which we are bulk inserting rows into a view. The definition of the view is such that it selects columns from a table on a remote server. I have added the servers using sp_addlinkedserver on both database servers.
When I call the Commit API of OLEDB I get the following error:
Error state: 1, Severity: 19, Server: TST-PROC22, Line#: 1, msg: SqlDumpExceptionHandler: Process 66 generated fatal exception c0000005 EXCEPTION_ACCESS_VIOLATION. SQL Server is terminating this process.
I would like to know if we can bulk insert rows into a view that accesses a table on the remote server using the "bulk insert" or bcp command. I tried a small test through SQL Query Analyzer to use "bulk insert" on such a view. The test that I performed was the following:
On database server 1:
create table iqbal (var1 int, var2 int)
On database server 2 (remote server):
create view iqbal as select var1, var2 from [DBServer1].[SomeDB].[dbo].[iqbal]
set xact_abort on
bulk insert iqbal from '\\Machine\Iqbal\iqbaldata.txt'
The bulk insert operation failed with the following error message:
[Microsoft][ODBC SQL Server Driver][DBNETLIB] ConnectionCheckForData (CheckforData()).
Server: Msg 11, Level 16, State 1, Line 0
General network error. Check your network documentation.
Connection Broken
The file iqbaldata.txt contents were:
1 1
2 2
3 3
If the table that the view references is on the same server, then we are able to bulk insert successfully.
Is there a way by which I should be able to bulk insert rows into a view that selects from a table on a remote server? If not, could anyone suggest a workaround? I would actually like to know some workaround to get the code working using OLEDB. Due to unavoidable reasons I cannot output the records to a file and then use bcp to bulk insert the records into the remote table. I need to have some way of doing it using OLEDB.
Thanks in advance
Iqbal
Overall goal: Write a Bulk Insert statement using the UNC path of a filetable directory.
Issue: When using the UNC path of the filetable directory in a Bulk Insert statement, I receive "Operating system error code 50 (The request is not supported.)". Looking for confirmation as to whether this is truly not supported.
Environment: SQL Server 2012 Standard. Windows Server 2008 R2 Standard
Hi, I have a data file which consists of data as below:
4
PPU_FFA7485E0D||
T_GLR_DET_11||
While I am inserting into the table using BULK INSERT, this pipe (||) is also getting inserted into the table. Here is the query I am using to insert the data with BULK INSERT:
BULK INSERT TABLE_NAME FROM FILE_PATH
WITH (FIELDTERMINATOR = '||', KEEPNULLS, FIRSTROW = 2, ROWTERMINATOR = '\n')
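Judging by the sample data, each line is a single field that ends with ||, so the pipes belong to the row terminator rather than the data; folding them into ROWTERMINATOR keeps them out of the table. A sketch, with the placeholders as in the post:
Code:
BULK INSERT TABLE_NAME FROM FILE_PATH
WITH (ROWTERMINATOR = '||\n',   -- consume the trailing pipes with the line break
      KEEPNULLS, FIRSTROW = 2)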
table2 is initially populated (basically it will serve as the historical table for the view); temptable and table2 are similar, except that table2 has two extra columns, insertdt and updatedt.
The process:
1. Get data from an existing view and insert it into temptable.
2. Truncate/delete the contents of table1.
3. Insert data into table1 by comparing temptable vs table2 (values that exist in temptable but not in table2 will be inserted).
4. Insert data into table2 which is not yet present (comparing ID in table2 and temptable).
5. UPDATE table2 where a field/column VALUE does not match temptable (meaning an UNMATCHED VALUE).
* For #5: if a value from table2 (the historical table) has changed compared to temptable (the new result of the view), it must be updated, along with the updatedt field value; see the sketch below.
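A sketch of steps 3-5, assuming ID is the key and Value stands in for the compared column(s):
Code:
-- Step 3: rows in temptable that table2 (history) has never seen go into table1.
INSERT INTO table1 (ID, Value)
SELECT t.ID, t.Value
FROM   temptable t
WHERE  NOT EXISTS (SELECT 1 FROM table2 h WHERE h.ID = t.ID);

-- Step 4: the same new rows enter the historical table with both date stamps.
INSERT INTO table2 (ID, Value, insertdt, updatedt)
SELECT t.ID, t.Value, GETDATE(), GETDATE()
FROM   temptable t
WHERE  NOT EXISTS (SELECT 1 FROM table2 h WHERE h.ID = t.ID);

-- Step 5: unmatched values refresh the history and the updatedt stamp.
UPDATE h
SET    h.Value = t.Value,
       h.updatedt = GETDATE()
FROM   table2 h
INNER JOIN temptable t ON t.ID = h.ID
WHERE  h.Value <> t.Value;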