Help, The Value Could Not Be Converted Because Of Potential Loss Of Data

Jun 19, 2007

Hi,



I am getting the error below in my Flat File Source.



I've seen this error many times before, and have successfully resolved this problem in the past.



However, this time it's a little different. It's complaining about row 7 of myFile.csv, column 20. I have column 20 defined as a Numeric(18,6). It also maps to the Price field in the table, which is also a Numeric(18,6).



The problem is, on row 7 of myFile, column 20 is blank. That is, there's no data for row 7, column 20.



So, why should it care about this?? If it's blank, then how can you lose any data?? I have several other blank columns in this file, but they aren't throwing any errors. Just this one.
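For what it's worth, the Flat File Source hands a blank field over as an empty string, not as a NULL, and an empty string cannot be cast to Numeric(18,6); that is likely why this blank column errors while the other blank columns (presumably mapped to character types) do not. One workaround sketch, with hypothetical staging/target table names: land column 20 as text in a staging table and convert in T-SQL.

CREATE TABLE dbo.StagePrices (Price varchar(50) NULL)

INSERT INTO dbo.Prices (Price)
SELECT CAST(NULLIF(LTRIM(RTRIM(Price)), '') AS numeric(18,6))
FROM dbo.StagePrices

NULLIF turns the trimmed empty string into a true NULL before the cast reaches the numeric type.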



Thanks



Errors:



[Flat File Source - myFile [1]] Error: Data conversion failed. The data conversion for column "Column 20" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".



[Flat File Source - myFile [1]] Error: The "output column "Price" (333)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "Price" (333)" specifies failure on error. An error occurred on the specified object of the specified component.



[Flat File Source - myFile [1]] Error: An error occurred while processing file "d:\myDir\myFile.CSV" on data row 7.

View 21 Replies



The Value Could Not Be Converted Because Of A Potential Loss Of Data

Jun 13, 2008

I am using SQL Server 2005. 
I am trying to import data from CSV files into a SQL Server table using the Import wizard. The text qualifier is double quotes ("), the column delimiter is a comma (,), and the first row has column names. One of the field names is "id", which is a GUID whose datatype in SQL Server is uniqueidentifier. It looks like this in the file:
..."data","data","dbf7edf8-0ca8-4e53-91e3-5901cdc1819a","data"...
As you can see, there are no enclosing curly braces for the guid value. The DTS chokes on this and throws this error:
The value could not be converted because of a potential loss of data
If I add curly braces like this {dbf7edf8-0ca8-4e53-91e3-5901cdc1819a}, it imports with no problem.
Is there a way to import this type of data? There is no way I can edit these files, and I would prefer not to change the datatype of the id field.
Or is this a limitation of SQL Server?
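For what it's worth, the database engine itself casts a brace-less GUID string without complaint, so one option is to import "id" as varchar into a staging table and convert on the server; a minimal sketch (staging/target table names are hypothetical):

DECLARE @id varchar(36)
SET @id = 'dbf7edf8-0ca8-4e53-91e3-5901cdc1819a' -- no braces, exactly as in the file
SELECT CAST(@id AS uniqueidentifier) AS id -- succeeds on SQL Server 2005

INSERT INTO dbo.TargetTable (id)
SELECT CAST(id AS uniqueidentifier) FROM dbo.StagingTable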
Thanks, Bullpit

View 5 Replies View Related

'could Not Be Converted Because Of A Potential Loss Of Data'

May 17, 2007

I have a FoxPro dbf that includes From Milepost (f_mp) and To Milepost (t_mp) fields. These fields contain values between -1 and 9999.9999.

I don't have FoxPro installed, but when I attach the dbf to Access, I see the fields defined as datatype Double.

I have a SQL table that I'm trying to import the dbf data into. In that table the two fields are defined as datatype Real.

When I execute the task, it fails at the first milepost value with 4 digits to the left of the decimal point.

I read up on datatypes, then redefined the milepost fields as Floats, but nothing changed.
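For what it's worth, real (float(24)) only keeps about 7 significant digits, while a FoxPro/Access Double carries more, so 9999.9999 genuinely cannot survive the conversion; SQL Server shows the effect directly:

SELECT CAST(9999.9999 AS real) AS as_real -- rounds to 10000: the 8th digit is lost
SELECT CAST(9999.9999 AS float) AS as_float -- float(53) keeps 9999.9999

Redefining the columns as float is the right direction, but the import task's column metadata may still remember the old real mapping, so the transform (or wizard mapping) likely needs to be re-created after the type change.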



Any ideas or suggestions would be greatly appreciated.



ginnyk

View 3 Replies View Related

The Value Could Not Be Converted Because Of A Potential Loss Of Data

May 30, 2006

During import from CSV I am getting the following error: "The value could not be converted because of a potential loss of data". My CSV file contains a column "years", and I selected datetime in the import wizard. Can I escape this error and import all my data?
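If the file can't be cleaned first, one fallback sketch is to import "years" as plain varchar and convert on the server, skipping anything the engine can't read as a date (SQL Server 2005 has no TRY_CONVERT, so ISDATE stands in; table and column names are hypothetical):

SELECT CASE WHEN ISDATE(years) = 1 THEN CONVERT(datetime, years) ELSE NULL END AS years
FROM dbo.StagingTable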

Waqar

View 1 Replies View Related

The Value Could Not Be Converted Because Of A Potential Loss Of Data

Jan 29, 2008

I've searched the threads and didn't see anything which seemed to fit this specific issue....

I have a Data Flow task which reads from an OLE DB Source (SQL Server 2005), uses a Data Conversion transformation to convert some field values, and finally outputs the result to distinct tabs of an Excel workbook. The task is failing with the following error:



There was an error with input column "oBBCompanyName" (2162) on input "Excel Destination Input" (57). The column status returned was: "The value could not be converted because of a potential loss of data.".


Using the Advanced Editor for the Excel Destination component, I examined the datatype of oBBCompanyName (ID = 2162) in the Input Columns list of the "Excel Destination Input" (identified with ID = 57). The Data Type is defined as DT_WSTR with Length = 255. The ExternalMetaDataColumnID = 203.

Also in the Advanced Editor for the Excel Destination, I examined the datatype of BBCompanyName (ID = 203) in the External Columns list of the Excel Destination Input. The Data Type is defined as Unicode String [DT_WSTR] with Length = 255.

What could I be overlooking that might be the root cause of this issue? The same error is occurring for different Excel Destination tasks in the data flow.
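One thing worth ruling out, since both the input and external columns already agree on DT_WSTR(255): source values longer than 255 characters, which the Excel driver can only accept as DT_NTEXT. A quick check against the underlying SQL source (the table name here is a guess):

SELECT MAX(LEN(oBBCompanyName)) AS longest_value
FROM dbo.MySourceTable

If anything comes back over 255, the Data Conversion output and the Excel column both need to move to DT_NTEXT.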

Kind regards,
Orlanzo

View 4 Replies View Related

The Value Could Not Be Converted Because Of A Potential Loss Of Data

Dec 6, 2007



Hi

I am having a bit of difficulty trying to understand this one. I had imported two tables of around 200-300 rows each, ran this in 64-bit, scheduled it, and it ran okay. I have now introduced a 3rd table; if I change Run64BitRuntime to False it will run, but if it is left True it keeps complaining about the output column "descr" with a loss of data.

I did move it to 32-bit and then ran the package, and it comes up as below. If I remember correctly I can run this in 32-bit mode, which I'm sure will work (hmm, maybe!!), but what I can't understand is why it works for two tables. Is it something to do with the translation of the table, or do I need to alter the select statement?


Currently it is a select *



Executed as user: jvertdochertyr. ...on 9.00.3042.00 for 64-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 10:30:11 PM Error: 2007-12-06 22:30:21.02 Code: 0xC020901C Source: st_stock st-stock out [1] Description: There was an error with output column "descr" (56) on output "OLE DB Source Output" (11). The column status returned was: "The value could not be converted because of a potential loss of data.". End Error Error: 2007-12-06 22:30:21.02 Code: 0xC0209029 Source: st_stock st-stock out [1] Description: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "output column "descr" (56)" failed because error code 0xC0209072 occurred, and the error row disposition on "output column "descr" (56)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure. End Error Erro... The package execution fa... The step failed.
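Since the source is currently a select *, one workaround sketch is to pin the type of the problem column in the source query itself, so the provider's 64-bit type mapping never sees the raw column; the length and table name here are assumptions to adjust:

SELECT CAST(descr AS nvarchar(100)) AS descr
-- , list the remaining columns explicitly
FROM dbo.st_stock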

View 14 Replies View Related

The Value Could Not Be Converted Because Of A Potential Loss Of Data In SSIS Package

Mar 3, 2006

With my SSIS package, I want to import data from a flat file (TXT, delimited with ?) into a table in my database in SQL Server 2005. The problem is that I have a column of type datetime in my table, but as you know, the data in a txt file is string. First I created my package by importing data with the Import/Export wizard in Management Studio and selecting a flat file connection. There I selected my txt file and the column delimiter as ?, then used the suggested types for the columns. It selects an 8-byte signed integer type for the datetime column in my table. After these steps I create my package and execute it, but it does not put data in my table in the database. It gives the error "The value could not be converted because of a potential loss of data" or "cast conversion failed". I tried other types (date, timestamp, string) but none of them was successful. What should I do to put data in my table from the txt files? Please can you help me urgently!!!
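Given that the wizard suggests an 8-byte signed integer, the file almost certainly holds dates as digit strings like 20070626. One sketch: import that column as char(8)/string instead, and convert on the server, where yyyymmdd is style 112:

SELECT CONVERT(datetime, '20070626', 112) AS converted -- 2007-06-26 00:00:00.000

The same CONVERT works in an INSERT ... SELECT from a staging table into the final table.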

Thanks a lot

View 12 Replies View Related

Date Field Problem - Value Could Not Be Converted Because Of A Potential Loss Of Data

Jun 26, 2007

Hi,



I have a flat file that has a date column where the date fields look like 20070626, for example. No quotes.



The problem is that several of the date values are missing, and instead of the date value the field looks like this , ,



That is, there are several blank spaces where the date should be. The number of blank spaces between the commas doesn't appear to be a set number (and it could even be 8 blank spaces, I don't know, in which case I don't know if checking for the Len will produce the correct results, but that's another issue...)



So, similar to the numeric field blanks problem, I wrote a script to convert the field to null. This is the logic I used:



Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Trim first, so a field of 8 blank spaces is also treated as missing
    If Row.TradeDate_IsNull OrElse Len(Trim(Row.TradeDate)) <> 8 Then
        Row.TradeDate_IsNull = True
    End If
End Sub



The next step in my data flow after the script is a derived column where I convert TradeDate from 20070625 to 06/25/2007. So the exact error message I am receiving is this:



[OLE DB Destination [547]] Error: There was an error with input column "TradeDate - derived" (645) on input "OLE DB Destination Input" (560). The column status returned was: "The value could not be converted because of a potential loss of data.".



Do I need to add a conditional split after the script and BEFORE the derived column to redirect bad rows so they don't go to the derived column?



What am I doing wrong here?



Thanks

View 9 Replies View Related

Flat File Source-The Value Could Not Be Converted Because Of A Potential Loss Of Data

Aug 7, 2007

Hello,
I have a flat file source with ragged right format. It has three columns.

My package fails when the last column has null values.
It says "The value could not be converted because of a potential loss of data".

I tried a lot of things, changing the row delimiter and column delimiter, but to no avail.

Any ideas on this are well appreciated.

Thanks in advance.

View 3 Replies View Related

Invalid Character Value For Cast Specification. The Value Could Not Be Converted Because Of A Potential Loss Of Data..

Jan 30, 2008



Why would I get these errors:
" SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E21 Description: "Invalid character value for cast specification".

There was an error with input column "UniqueID" (3486) on input "OLE DB Command Input" (3438). The column status returned was: "The value could not be converted because of a potential loss of data.".

Error: 0xC0209029 : SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE DB Command Input" (3438)" failed because error code 0xC0209069 occurred, and the error row disposition on "input "OLE DB Command Input" (3438)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure."


I read related posts but could not figure out the error.
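When an OLE DB Command fails on one input column like this, it can help to hunt for the rows whose values will not cast. A diagnostic sketch, assuming UniqueID is a string being fed into a numeric parameter (the table name is hypothetical):

SELECT UniqueID
FROM dbo.SourceTable
WHERE UniqueID LIKE '%[^0-9]%' -- any non-digit character
OR LTRIM(RTRIM(UniqueID)) = '' -- or an empty/blank value

Any row this returns carries a value that will fail the cast.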

View 3 Replies View Related

Flat File -> Table: Error Using I/E Wiz, Date..Could Not Be Converted..Potential Loss Of Data

Jul 16, 2007



The following error is encountered when importing a delimited flat file with a date of format "dd.mm.yyyy"



Error: 0xC02020A1 at Data Flow Task, Source - DCDtest_xpt [1]: Data conversion failed. The data conversion for column "value date" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".



This was when I manually built the package.

I get the same error when using the Import/Export wizard



I am even using the "suggest types" button and have tried sampling the default number of rows (?200) and also 2000.



The type it suggests is DT_DATE.

But reading the BOL, this would appear to be the wrong type:

http://msdn2.microsoft.com/en-us/library/ms141036.aspx (obviously the right hand doesn't know what the left hand is doing)



seems to indicate that

DT_DBTIMESTAMP

is the correct value

(I cannot believe that they have different datatypes in SSIS than in SQL. I can't believe for a minute this is for all the hundreds of thousands of Oracle users who obviously switched to SSIS when they saw what a high quality product it is.)



I tried other DT_... values but no dice.
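One workaround that sidesteps the type guessing entirely: import "value date" as a plain string column and convert it on the server afterwards, since dd.mm.yyyy is exactly convert style 104:

SELECT CONVERT(datetime, '16.07.2007', 104) AS value_date -- 2007-07-16 00:00:00.000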



Can anyone help?





I always thought that Classic ASP was the worst product I've ever worked with from the Microsoft stable, but I was wrong.

I am fed up of having to post on this board (no wonder it is so 'popular')

Talking to peers, reading books, and googling nearly always enables me to figure out a problem with any application I have ever used, but SSIS breaks the mould in sheer crapness and the weirdness and unfathomability of its cryptic errors.

Rant over (for today)

View 9 Replies View Related

SSIS - Data Conversion Failed - The Value Could Not Be Converted Because Of A Potential Loss Of Data.

Aug 3, 2006

Hello

 

I have an odd problem that is driving me nutz. I have a very simple SSIS package that imports a 5-column flat file into a SQL Server 2005 table.

When I created this package with the wizard, it executes perfectly fine and processes all rows into the destination table.

But when I hit F5 to execute it manually it will fail before inserting a single row.

 

The error it generates is (Spalte 5 is a Datetime in the format DD.MM.YYYY) :

Error: 0xC02020A1 at Datenflusstask, Source - Daten_NC_1_txt [1]: Data conversion failed. The data conversion for column "Spalte 5" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

Error: 0xC0209029 at Datenflusstask, Source - Daten_NC_1_txt [1]: The "output column "Spalte 5" (25)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "Spalte 5" (25)" specifies failure on error. An error occurred on the specified object of the specified component.

Error: 0xC0202092 at Datenflusstask, Source - Daten_NC_1_txt [1]: An error occurred while processing file "C:\Work\Daten_NC_1.txt" on data row 177.
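One explanation that fits wizard-run-OK but F5-fail is that the two runs pick up different locale/date-order settings when parsing DD.MM.YYYY. A sketch of a workaround: land Spalte 5 as a string and pin the date order on the server instead of in the pipeline:

SET DATEFORMAT dmy
SELECT CAST('06.12.2007' AS datetime) -- parsed day-first regardless of the client locale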

 

 Edit: Modified the Title so it properly reflects the Problem & the Solution

View 3 Replies View Related

Potential Loss Of Data Error On Flat File Source

Nov 19, 2007



I'm getting a very strange potential loss of data error on my flat file source in the data flow. The flat file is fixed width and the column in question is defined as numeric [DT_NUMERIC]. The transform runs great if this column IS NOT A ZERO. As soon as a zero value is found, I get the error. It errors on the flat file source, so I haven't been able to use a data viewer to see what's going on.

Please Help!?

Thanks,
Scott Mescall

View 8 Replies View Related

Data Conversion Failed Due To Potential Loss Of Data

Aug 29, 2007



Hi,

I am getting this error when my SSIS package is running:

Data conversion failed due to potential loss of data

The input column is in string format and the output is a SQL Server bigint.

The error occurs when there is an empty string in the input. What should I do to overcome this?

It is an ID field; should I convert it to bigint or leave it as a char datatype? Is that a good solution, or is there another way to overcome it?

View 4 Replies View Related

Data Loss

Jul 20, 2005

Hi, I have a big problem with a database in MS SQL SERVER 2000. The rows in some of the tables have, for the second time, been mixed up among themselves for no apparent reason. The application that uses the db is totally TRANSACTIONAL and there are no queries without a WHERE clause. The database is on a computer with NAS architecture. This is the first time I have had this problem in 5 years of using MS SQL SERVER. Can someone help me? TIA

View 2 Replies View Related

Data Loss In Replication

Dec 6, 2006

hi,

I'm using merge replication. I have created one publisher with dynamic filtering, and I have created two subscribers from it (pull subscription). It was working fine, but after synchronizing 2-3 times, data from some tables went missing.

Can anyone help me....

thanks and regards

Dhanya





View 3 Replies View Related

Mysterious Loss Of Data From Mssql2k

Jul 23, 2005

Hi All,

I'm trying to track down a mysterious problem we're experiencing in which updates and inserts to tables in our mssql2k server appear to be 'disappearing.'

To explain our situation: We have a web page (written in ASP, if that's relevant) on which we accept enrollment information. When that page is submitted, the form data is passed to a stored procedure on our mssql2k server, which performs several operations, all of which are wrapped in a transaction. In particular, the stored procedure performs an update operation on a record in one table (I'll call it TableA) and an insert into another table (TableB).

If the procedure encounters a problem (ie after each update / insert operation in the procedure we test for IF @@Error<>0) it performs a rollback, performs a select similar to the one immediately below, and then RETURNs.

SELECT '1' as error, 'Unable to update TableA' as errormsg

If the procedure doesn't fail any of the @@Error tests, the transaction is committed, and a membership number is SELECTed to be returned.

SELECT '0' as error, @memnum as membershipnumber

The @memnum variable is populated within the transaction.

Back in the ASP page we test both for the proc returning an empty recordset, or for it passing an explicit value in the error field, and push the page to an error page if either of these conditions are met. If, on the other hand, none of these conditions are met, and the membershipnumber field in the recordset is populated with a valid membership number, we push to a confirmation page.

This confirmation page receives the membership number in a session variable, performs a SELECT against TableB (the table that received the insert during the proc) using that membership number in the WHERE clause, and the resultant recordset is used to populate the confirmation details on that page. That recordset is also then used to populate the details of a confirmation email, which is automatically sent by the confirmation page.

And now here's our problem: we've become aware of a handful of people who have gone through the enrollment process, have received the confirmation email containing the information they supplied as expected, but the data appears to be entirely missing from our tables. By that I mean that the record in TableA does not appear to have been updated (under normal circumstances that record should have had several flags set, and several other fields updated with information supplied by the person enrolling), and the record in TableB does not appear to have been inserted.

In essence, looking at our tables, it *feels* like the transaction in the stored procedure for that particular enrollment hit a problem and was rolled back. However, the evidence that we have in the form of the confirmation email argues strongly that the data must have existed in our tables (particularly in TableB), if only for an unknown period of time.

We're kind of at our wit's end to work out what is going wrong with these enrollments. From my understanding of transactions (and I could well be wrong) any changes to data (ie updates, inserts etc) contained within are essentially 'invisible' to any other operation (ie the SELECT that happens in the confirmation page) until the transaction is committed, implying that the effect of the update and insert should have been 'permanently' successful if no error code is received and if a valid membership number was returned.
I ask, because someone in our team has suggested that maybe the operations in the transaction 'lasted long enough' in the tables to have been visible for the SELECT on the confirmation page to have worked, but were then subsequently rolled back, explaining why the confirmation email is appropriately populated and why the data then appears to be missing. However, as I said, this doesn't match my understanding of how transactions behave.

Sorry for the length of this post, but I felt it was best to explain this as best as I could.

Does anyone have any advice they can give us on this situation? ie, are there any known problems with operations in transactions 'bleeding over' into tables, but then being rolled back at some later point? Does anyone have any thoughts or suggestions on how we can further diagnose this issue?

Truly, any help will be immensely appreciated...

Thanks in advance,
M Wells

View 2 Replies View Related

Data Loss When Reinitializing Subscription

Aug 22, 2007


We had set up merge replication on 2 db servers. For some reason, the subscription started failing a month back with the error "invalid object sysmergexxxx" on the subscriber. I did a reinitialize, and now all the changes on the subscriber which weren't synced got deleted. I have tried all 3 log recovery tools with no luck. Is there any hope of recovering the data? The last backup on the subscriber was a month ago.

View 1 Replies View Related

Data Loss In Transactional Replication

Apr 11, 2007

We have set up transactional replication across several databases using SQL Server 2000, spread across multiple sites in a fully connected network. There is one main table from which data is replicated from the publisher to the destination. Horizontal filtering is being used on this table to enable sending/routing of the records to the correct DB (site). It has been observed that documents/records are getting lost between some sites. Say 10 documents are being sent from the publishing database but only 5 are being received at the destination database, although the sent history for all 10 documents is available at the publishing database.



Can anyone guide us on how to analyse and resolve this problem? Can an unreliable network be the issue? If the network is not reliable and the connection is lost during replication, how does replication ensure that no data is lost?

View 1 Replies View Related

Character Set Problem Data Loss

Oct 4, 2007



Hi,

I am transferring data from SQL Server 2005 to SQL Anywhere 9. Dates and numbers are successfully transferred, but chars and varchars are not copied. Though there are values in the SQL Server 2005 table, they are shown as blank in SQL Anywhere. I cannot understand the data loss.

Is this a code page problem or a data size problem? For example, a varchar(20) data type exists in both SQL Server and Sybase SQL Anywhere. SQL Server is in code page 1252 and SQL Anywhere in 1253.


Thanks.

View 6 Replies View Related

Data Loss With Merge Replication?

Jun 27, 2006

We had merge replication setup between 2 tables, Table A and Table B using SQL 2000. This was working 100%. The users asked to disable updated/deletes to both these tables if data existed on 2 other tables. Table AA and Table BB. We implemented it as follows:

1) Created Insert/Update/Delete triggers for Table A & B. Basically, for Table A each trigger checks whether a record exists in Table AA; if it exists, raise an error and don't commit.

2) Removed all foreign constraints from Table AA and BB

3) Added Table AA and BB to the current replication.

Then all hell broke loose: we got conflicts all over the place saying that Table AA cannot be updated because records do not exist in Table X. To our surprise we found triggers generated by Erwin in 1998 that check for "foreign constraints", and removed them immediately.

We continued to get conflicts but could see from the error messages that they were generated by the triggers in point 1. We added the NOT FOR REPLICATION clause and everything has been running smoothly, or so we thought.....

After 2 months we got a call that data is missing. It's random data, and the only explanation I have is that replication caused it. My biggest reason for saying this is that, tracking the application audit trail, I've found that all the missing data was added during the period we had all the conflicts.

I need a solid explanation for this and can anyone confirm that this is possible?

View 4 Replies View Related

How To Rebuild SQL Server Machine Without Any Data Loss

Mar 26, 1999

We wanted to rebuild our main SQL Server machine. How can we back up everything about SQL Server (such as all databases/objects and settings for security and users) and then recover them without any data loss?
A related question is how to recover the server machine in case of system failure or a whole-machine crash.
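A minimal sketch of the usual approach: back up every database, including master and msdb, which hold the logins, server settings, and jobs (SQL Server 7.0 BACKUP syntax; paths and database name are hypothetical):

BACKUP DATABASE master TO DISK = 'D:\Backup\master.bak'
BACKUP DATABASE msdb TO DISK = 'D:\Backup\msdb.bak'
BACKUP DATABASE MyAppDb TO DISK = 'D:\Backup\MyAppDb.bak'

Restoring master on the rebuilt machine brings back the server-wide security settings, after which the user databases can be restored normally.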
Thanks!

View 1 Replies View Related

More On Mysterious Data Loss In Mssql2k - A Possible Solution?

Jul 23, 2005

Hi All,Further to my previous long-winded question about a situation in whichwe appear to be mysteriously losing data from our mssql2k server.We discovered an update statement, in the stored procedure we believeis at fault, after which no error check was being performed.Under certain conditions, this update is fired against the same recordin the same table as the immediately preceding update statement withinthe transaction. We are now suspecting that under some circumstances,these two updates get into a locking conflict that is eventuallyforcing the transaction to be rolled back.However, I'm still left with three questions.1) Where an update in a transaction gets locked, and an error isn'ttested immediately afterwards (ie no 'IF @@Error<>0' test is made),would the transaction proceed as normal?2) Most critically, would statements in the stored procedure thatappear after the COMMIT TRAN statement also be executed, even if anunresolved lock existed within the transaction?3) Assuming that (2) does happen, would a SELECT made on anotherconnection with a 'WITH(NOLOCK)' locking hint be able to see thechanges made in the locked transaction even if the server is set toREAD COMMITTED, and the SELECT takes place some time after the COMMITTRAN is issued? More to the point, given (2), how long would thelocked transaction survive before being rolled back after the COMMITTRAN has been issued? Is it possible that the COMMIT TRAN takes place,the transaction is flagged for potential rollback while a lockresolution is attempted, the stored procedure exists as thougheverything was fine, a subsequent SELECT (ie performed as one of thenext operations in the same application) using WITH(NOLOCK) 'sees' thechanges made by the transaction, reinforcing the impression that thetransaction succeeded, and then at some point thereafter the lock isdetermined to be unresolvable and the transaction is rolled back,making it seem as though the data disappeared, even though it had beenSELECTable via a different connection to the server?Thanks, by the way, to Simon and Erland for your advice on my previousquestions about this problem.Much warmth,M Wells

View 1 Replies View Related

Backing Up Online Log After Loss Of Primary Data File

May 24, 2004

My question is this:

Is there any way to back up the current online log after complete loss of the corresponding data file?

Example:

(2) Logical Disk Volumes

Disk 1 (D:) contains pubs_data.mdf
Disk 2 (E:) contains pubs_log.ldf

Disk 1 becomes corrupt and goes offline, leaving the database pubs in a suspect state. Is a backup of the current online log pubs_log.ldf possible? If a backup of the log is not possible, are there any other restoration methods that can be used to bring this database back online, rescuing as much information as possible from the current online log pubs_log.ldf?
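In this scenario the tail of the log can usually still be backed up even though the data file is gone, because NO_TRUNCATE reads straight from the intact .ldf; a sketch with a hypothetical backup path:

BACKUP LOG pubs TO DISK = 'E:\Backup\pubs_tail.trn' WITH NO_TRUNCATE

Restoring the last full backup plus any log backups WITH NORECOVERY, then this tail backup, should bring pubs back up to the point of failure.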

Thanks,
P

View 1 Replies View Related

XML Source To Data Conversion To OLE DB Destination - Loss Of CDATA

Apr 9, 2007

I'm trying to build a DTSX package that FTP's an XML file to the local file system and then imports it into an existing table.

My "Data Flow" for the package starts with an XML Source component and then goes to Data Conversion component and then to an OLE DB Destination component.

It all executes without error and seems to work fine, but when I look at the data in the table after I've run the package, it has inserted the appropriate number of rows from the XML file but all of the column values are NULL.

All of the data in the XML file is surrounded by <![CDATA[ ]]> and I discovered that if I remove the CDATA wrapper by hand then it inserts the data properly. The only problem is that I'm not in a position to have the data provider remove the CDATA tags and some of the data in the XML file needs the CDATA wrapper or else it will not validate.

Anyone know of anything I can do to get the CDATA to import properly?

View 15 Replies View Related

SQL Server 2008 :: Initialize Transactional Publication From Backup Loss Of Data

Mar 18, 2015

I have an automated process which synchronizes a transactional publication using the initialize-from-backup approach. It drops subscriptions and puts them back again once the restore on the subscriber is completed.

Dropping the subscriptions causes a lot of blocking and deadlocking. I've decided to remove those steps, but it causes loss of data on the subscriber.

Is it a must to drop and re-create the subscriptions during such a process? If not, how can I avoid the loss of data?

View 0 Replies View Related

BC30311: Value Of Type 'System.Data.SqlClient.SqlDataReader' Cannot Be Converted To 'String'.

Jan 4, 2005

Does anyone know what the problem is?


Sub Page_Init(sender As Object, e As EventArgs)
Dim txtName As New TextBox()
Dim txtPart As New TextBox()
Dim txtEach As New TextBox()
Dim txtTotal1 As New TextBox()
Dim txtSubtotal As New TextBox()
Dim txtTax As New TextBox()
Dim txtShipping As New TextBox()
Dim txtTotal2 As New TextBox()

Dim prod_id As Integer
Dim id As Integer
id = Request.Querystring("prod_id")
txtName.Text = GetName(id).................this is the problem line

phName.Controls.Add(txtName)
phPart.Controls.Add(txtPart)
phEach.Controls.Add(txtEach)
phTotal1.Controls.Add(txtTotal1)
phSubtotal.Controls.Add(txtSubtotal)
phTax.Controls.Add(txtTax)
phShipping.Controls.Add(txtShipping)
phTotal2.Controls.Add(txtTotal2)

End Sub


I'm trying to populate dynamically rendered textboxes; how should I do that too?

View 1 Replies View Related

Access Data With Clipper Code That Has Been Converted 32 Bit using Flagship Product

Aug 26, 2015

Has anyone converted dBase DBF files and Clipper index files (NTX) into SQL table(s) and continued to access the data with Clipper code that has been converted to 32-bit using the Flagship product?

View 3 Replies View Related

Sqlbulkcopy Error : The Given Value Of Type SqlDecimal From The Data Source Cannot Be Converted To Type Decimal Of The Specified

Apr 16, 2008

Hi,

The table in SQL has column Availability Decimal (8,8)

Code in C# using SqlBulkCopy is trying to insert values like 0.0000, 0.9999, 29.999 into the field Availability.
We tried the datatype float, but it converts values to scientific notation (eg: 8E-05), and the values displayed in reports are scientific expressions, which is not expected.
We need to store the values as is.


Error:
base {System.SystemException} = {"The given value of type SqlDecimal from the data source cannot be converted to type decimal of the specified target column."}

"System.InvalidOperationException: The given value of type SqlDecimal from the data source cannot be converted to type decimal of the specified target column. ---> System.InvalidOperationException: The given value of type SqlDecimal from the data source cannot be converted to type decimal of the specified target column. ---> System.ArgumentException: Parameter value '1.0000' is out of range.
--- End of inner exception stack trace ---
at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
--- End of inner exception stack trace ---
at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table)
at MS.Internal.MSCOM.AggregateRealTimeDataToSQL.SqlHelper.InsertDataIntoAppServerAvailPerMinute(String data, String appName, Int32 dateID, Int32 timeID) in C:\VSTS\MXPS Shared Services\RealTimeMonitoring\AggregateRealTimeDataToSQL\SQLHelper.cs:line 269"


Code in C#

SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.Default);
DataRow dr;
DataTable dt = new DataTable();
DataColumn dc;

try
{

dc = dt.Columns.Add("Availability", typeof(decimal));
...

dr["Availability"] = Convert.ToDecimal(s[2]); ------ I tried SqlDecimal
...

}
bulkCopy.DestinationTableName = "dbo.[Tbl_Fact_App_Server_AvailPerMinute]";
bulkCopy.WriteToServer(dt);
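Note that decimal(8,8) reserves all eight digits for the fractional part, so its legal range is strictly between -1 and 1. That matches the inner exception "Parameter value '1.0000' is out of range", and 29.999 could never fit either. A quick T-SQL illustration (the (8,4) alternative is only an assumption about what the column should hold):

SELECT CAST(0.9999 AS decimal(8,8)) -- fits: range is -0.99999999 to 0.99999999
-- SELECT CAST(1.0000 AS decimal(8,8)) -- fails with an arithmetic overflow
SELECT CAST(29.999 AS decimal(8,4)) -- (8,4) holds 0.0000, 1.0000 and 29.999

So widening the column to something like decimal(8,4) may be what's actually needed, rather than any change on the SqlBulkCopy side.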



thx



View 8 Replies View Related

Unable To Insert Converted Date Into Date Column (Data Type)

Aug 24, 2015

PHP Code:

INSERT INTO [GPO].dbo.tblMetric  (KPI_ID, METRIC_ID, GOAL, REPORTING_MONTH, ACTUALS) 
SELECT 
    
      1 AS KPI_OWNER_ID
    , 23 AS METRIC_ID 
    , .75 AS GOAL 
    , CAST(Z.REPORTING_MONTH as DATE) AS REPORTING_MONTH
    , SUM(CAST(FTP_COUNT AS DECIMAL))/SUM(CAST(FULL_COUNT AS DECIMAL)) AS ACTUALS

[Code] ....

The column I am trying to insert into is a date type. The original state of the field is a YYYYMM varchar. How do I get this into the table?
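Since the field is YYYYMM, one sketch is to append a day so it becomes an unseparated YYYYMMDD literal, which SQL Server casts to date unambiguously (column name taken from the query above):

SELECT CAST(REPORTING_MONTH + '01' AS DATE) AS REPORTING_MONTH -- '201507' -> 2015-07-01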

View 3 Replies View Related

Identify Potential Duplicate

Apr 22, 2015

There is one report to identify potential duplicates in a table, and it is performing poorly. I'm now tuning the existing SP and got stuck in modifying it; I want to rewrite the query in the best way. I pasted below an example of the query that is now in the report. The report will be run every week. Currently the table has 10 million records, and every week 5k to 10k will be added, so with those 5k to 10k rows we have to check all 10 million rows for duplicates. The logic is (surname = surname or forename = forename or DOB = DOB).

Create table #employee
(
ID int,
empid varchar(100),
surname varchar(100),
forename varchar(100),
DOB datetime,
empregistereddate datetime,
Createdate datetime

[code]...
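A sketch of one common restructuring: since the rule is an OR across three columns, check each column separately against the list of duplicated values and UNION the matching keys. Each leg can then use its own index on surname, forename, or DOB instead of forcing one large OR self-join (the temp table name follows the example above):

SELECT e.ID
FROM #employee AS e
JOIN (SELECT surname FROM #employee GROUP BY surname HAVING COUNT(*) > 1) AS d
ON d.surname = e.surname
UNION
SELECT e.ID
FROM #employee AS e
JOIN (SELECT forename FROM #employee GROUP BY forename HAVING COUNT(*) > 1) AS d
ON d.forename = e.forename
UNION
SELECT e.ID
FROM #employee AS e
JOIN (SELECT DOB FROM #employee GROUP BY DOB HAVING COUNT(*) > 1) AS d
ON d.DOB = e.DOB

Restricting the outer side to just the 5k-10k new rows (for example by empregistereddate) would keep the weekly run from rescanning all 10 million rows.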

View 7 Replies View Related

Potential SQL Server Express Bug.

Apr 19, 2007

Hi all,

I have run into what I think is a bug in SQL Server Express version 2005.90.3042.0

To reproduce it, create a simple table as described below:

CREATE TABLE [dbo].[test](
[priority] [tinyint] NOT NULL,
CONSTRAINT [PK_test] PRIMARY KEY CLUSTERED
(
[priority] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
OK, a simple table with one tinyint column. It doesn't actually matter if you create the primary key or not.

Then I populated the table with 10 entries, with the numbers 1 - 10 in the priority column.

So the query
SELECT *
FROM test

produces

priority
1
2
3
4
5
6
7
8
9
10

Now I created a view using the following SQL

CREATE VIEW [dbo].[showtestd]
AS
SELECT TOP (100) PERCENT dbo.test.*
FROM dbo.test
ORDER BY priority DESC

If I then run the following

SELECT *
FROM showtestd

I would expect to see the result ordered descending. However, the DESC in the view is ignored; it comes out sorted ascending, so I still see 1-10.

So I am hoping someone can tell me if this is a real bug, or if I misunderstand views.
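For what it's worth, this is documented behavior rather than a bug: a view is an unordered rowset, and SQL Server 2005 optimizes away TOP (100) PERCENT ... ORDER BY inside a view definition. The ORDER BY has to go on the query that reads from the view:

SELECT *
FROM showtestd
ORDER BY priority DESC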

View 4 Replies View Related

Potential Generate DDL Script

Nov 13, 2007

Hello,

This procedure, when given a table name as a parameter, will produce a Create Table statement, an Insert Into statement, and the top 10 records from the table in a format suitable for posting.



SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

CREATE Procedure [dbo].[pGenerateDDL](@Tablename varchar(128))
AS
Declare @Structure varchar(8000),
@colstr varchar(8000)

Set NOCOUNT on

Select @colstr = ''
If exists (select * from sysobjects where name = 'cols')
Drop Table cols

Create Table cols (
ColInfo varchar(500) null
)
Insert Into Cols (ColInfo)
Select '[' + RTRIM(C.name) + '] '
+ Case When isComputed = 0 then
LEFT(CASE
WHEN (T.name IN ('char', 'varchar', 'nchar','nvarchar')) THEN T.name + '(' + LTRIM(RTRIM(STR(C.length))) + ')'
When t.name in ('numeric','decimal') then t.name + '(' + Cast(c.prec as varchar) + ','+ cast(c.scale as varchar) + ')'
else t.name
END,30) else 'AS ' end
+

Case when isnullable = 1 and iscomputed = 0 then ' NULL'
When isnullable = 0 and iscomputed = 0 then ' NOT NULL'
end
--Cast((SELECT TOP 1 value
-- FROM ::fn_listextendedproperty(null, 'user', @user, @type, @tablename, 'column', default)
-- WHERE objname = C.name) as varchar)
+
Case When c.colid = (Select max(c.colid) maxid
FROM sysobjects o left JOIN syscolumns c ON (o.id = c.id)
left JOIN systypes t ON (c.xusertype = t.xusertype)
WHERE o.name = @tablename
) then ')' else ',' end
FROM sysobjects o inner JOIN syscolumns c ON (o.id = c.id)
inner JOIN systypes t ON (c.xusertype = t.xusertype)
WHERE o.name = @tablename

---
Declare colcur Cursor
READ_ONLY
FOR
Select Cast(Colinfo as varchar(500))
FROM cols

OPEN ColCur
FETCH colcur into @structure
IF (@@FETCH_STATUS <> 0)
BEGIN -- No matching objects
CLOSE colcur
DEALLOCATE colcur
RETURN
END

WHILE (@@FETCH_STATUS = 0)
BEGIN
Select @colstr = @colstr + '
' + cast(@structure as varchar(500))
FETCH colcur INTO @structure
END
CLOSE colcur
DEALLOCATE colcur

If exists (select * from sysobjects where name = 'cols')
Drop Table cols
PRINT '--TABLE STRUCTURE BELOW'
PRINT 'Create Table ' + @TableName + '(' + @colstr

DECLARE @SQLstring varchar(8000)
DECLARE @SELString varchar(8000)
DECLARE @firstTime bit

SELECT @SQLstring = ''
SELECT @firstTime = 1
Select @SELstring = ''

DECLARE getColumnsCursor CURSOR
READ_ONLY
FOR
SELECT c.COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS c
WHERE c.TABLE_SCHEMA = 'dbo' AND
c.TABLE_NAME = @Tablename


DECLARE @columnName nvarchar(128)
OPEN getColumnsCursor

FETCH NEXT FROM getColumnsCursor INTO @columnName
WHILE (@@FETCH_STATUS <> -1)
BEGIN
IF (@@FETCH_STATUS <> -2)
BEGIN
IF (@firstTime = 0)
SELECT @SQLstring = @SQLstring + ',
'

-- append our column to the UPDATE statement
SELECT @SQLstring = @SQLstring + '[' + @columnName + ']'
Select @SELString = @SELString + 'Cast('+ '[' + @columnName + '] as varchar) +'',''+'
SELECT @firstTime = 0
END
FETCH NEXT FROM getColumnsCursor INTO @columnName
END

CLOSE getColumnsCursor
DEALLOCATE getColumnsCursor

PRINT '--BELOW IS INSERT STATEMENT.'

Select @SQLString =
'Insert Into '+ @Tablename + ' ('+ @SQLString + ')'

Select @SELString = 'Select TOP 10 ''Select '' +' + Left(@SELstring,len(@SELstring)-4) + ' +'' UNION ALL'' FROM ' + @Tablename

Print (@SQLstring)
Print ''

Exec (@SELstring)

GO

SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
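To run it, the call is just the procedure with the table name (using the table from the sample output below):

EXEC dbo.pGenerateDDL @Tablename = 'ALI_ExpectedLR'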




In my case, I tested on a small table and it produced this in the messages window:


--TABLE STRUCTURE BELOW
Create Table ALI_ExpectedLR(
[ALI_Score] int NULL,
[ALIExpectedLR] numeric(10,5) NULL,
[ALIExpectedLR_Unit] numeric(10,5) NULL)
--BELOW IS INSERT STATEMENT.
Insert Into ALI_ExpectedLR ([ALI_Score],
[ALIExpectedLR],
[ALIExpectedLR_Unit])

Select 141,0.04086,0.07329 UNION ALL
Select 142,0.04013,0.07217 UNION ALL
Select 143,0.03940,0.07106 UNION ALL
Select 144,0.03868,0.06996 UNION ALL
Select 145,0.03797,0.06888 UNION ALL
Select 146,0.03727,0.06782 UNION ALL
Select 147,0.03658,0.06676 UNION ALL
Select 148,0.03589,0.06572 UNION ALL
Select 149,0.03522,0.06470 UNION ALL
Select 150,0.03455,0.06368 UNION ALL




I am not sure if it is "handy" or not, but was an interesting thing to work on. It is an extension of script added to madhivans post on Generating script (as an alternative). Without writing another whole cursor, I didn't see an easy way to eliminate the last "UNION ALL"

If I am missing some fundamental logic or best practice that can be improved on, please make whatever changes are best.

If it is useful, let me know that as well.

Note: removed a bit that identified calculated columns for this purpose, as for the intended use, it isn't necessary.





Poor planning on your part does not constitute an emergency on my part.

View 2 Replies View Related






