I am running SQL Server 2000 to parse and store records in the EDI X12
format. This consists of variable length delimited records which
I am passing to the "transforms" tab to process with VBScript.
The problem is that although each segment has a defined number of fields, N,
the standard states that if the final M fields are empty/blank they are
not to be sent. Thus, a segment defined to have 20 fields may
have 6 the first time I see it, 13 the next time, etc. To access
the columns in VBScript I use DTSSource("Col001"). This works as
long as the columns are there, but gives an error when they are
not. Is there a parameter telling me how many columns are
defined? Or is there something akin to IFEXISTS("Colxxx") or
exceptions?
How can I handle this situation? One suggestion has been to pass
the entire segment to the Transforms section and break it up there.
Finally, what resources can you point me to for reference? I'd
like to get good at using DTS since my client wants their project
written for it.
Parsing any delimited string (in the example above, ',' is used as the delimiter). This kind of query can be useful in many business scenarios where the input data is a long string containing delimited values.
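A minimal sketch of the numbers-table version of such a query, assuming an auxiliary table called Numbers with a single int column n holding the values 1 through 8000 (all names here are illustrative):

DECLARE @List varchar(8000)
SET @List = 'ford,chevy,pontiac'

-- Pad the string with delimiters, then cut out the value that follows each delimiter
SELECT SUBSTRING(',' + @List + ',', n + 1,
                 CHARINDEX(',', ',' + @List + ',', n + 1) - n - 1) AS Item
FROM Numbers
WHERE n <= LEN(',' + @List + ',') - 1
  AND SUBSTRING(',' + @List + ',', n, 1) = ','

This returns one row per delimited value, which can then be joined or inserted like any other result set.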
I have a tab-delimited file with 122 columns. Can anyone let me know if there is a better way of parsing/extracting a few columns (say about 15) from the file and loading them into a table using SSIS?
Does anyone know how to parse an input variable to a stored proc? For example, if I have an input variable that is: 'ford,chevy,pontiac' how do I parse through this variable and handle each value independently?
Since I do not know how many values may exist, I can't do it based on a fixed number of input variables.
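One way to handle an unknown number of values inside the procedure is to walk the string with CHARINDEX, peeling off one value per pass. A minimal sketch (the variable names and the PRINT placeholder are illustrative):

DECLARE @List varchar(8000), @Item varchar(100), @Pos int
SET @List = 'ford,chevy,pontiac'

WHILE LEN(@List) > 0
BEGIN
    SET @Pos = CHARINDEX(',', @List)
    IF @Pos = 0 SET @Pos = LEN(@List) + 1

    SET @Item = LEFT(@List, @Pos - 1)
    PRINT @Item   -- handle each value independently here (insert it, validate it, etc.)

    SET @List = SUBSTRING(@List, @Pos + 1, LEN(@List))
END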
Hi, I want to export data/records coming from the database and save them as a .txt file, but tab-delimited. The flow of my project is something like this.
Web Form->SQL Database->Web Report->Tab-Delimited file.
I will explain more. What we want to do is an online application form. We have a form and will save all the data to a SQL Server database. We also want to save all that information in a tab-delimited file. I would like to save it first in the database (no problem in this part), then later export it to a tab-delimited file.
If you can give me a little tutorial on this I would really appreciate it. Even 3 records will do, as long as I can see how to do it. Oops, by the way, I also want to name the .txt file as (userid+transactionid).
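One possible approach, sketched with made-up table, server, and path names, is to build a bcp queryout command string (so the file name can include the userid and transactionid) and run it through xp_cmdshell, assuming xp_cmdshell is available to you:

DECLARE @UserID varchar(20), @TransactionID varchar(20)
DECLARE @FileName varchar(260), @Cmd varchar(1000)

SET @UserID = '1234'
SET @TransactionID = '5678'
SET @FileName = 'C:\Export\' + @UserID + @TransactionID + '.txt'

-- -c writes character data; with -c the default field terminator is already a tab
SET @Cmd = 'bcp "SELECT * FROM MyDb.dbo.ApplicationForm" queryout "'
           + @FileName + '" -c -S MyServer -T'

EXEC master..xp_cmdshell @Cmd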
We can use 'sp_executesql' to execute any statements. I have searched, and people, it seems, need dynamic SQL only to process some table/column unknown in advance. My idea is that the dynamic SQL feature is ideal for passing blocks of code (aka delegates). Particularly, you may need to execute different procedures under some acquired locks. A procedure would acquire the locks, execute the code and release the locks. The problem is, however, that I cannot find the specification for variable-length parameter lists. It does not seem feasible for stored procedures. Nevertheless, 'sp_executesql' itself does accept a variable number of parameters. How? Can we look at the definition?
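For reference, sp_executesql takes the statement, then a single string declaring every parameter, and only then the actual values. A minimal sketch (table and parameter names are made up):

DECLARE @Sql nvarchar(500)
SET @Sql = N'SELECT * FROM Orders WHERE CustomerID = @CustId AND OrderDate >= @Since'

EXEC sp_executesql
     @Sql,
     N'@CustId int, @Since datetime',   -- one string declares all the parameters
     @CustId = 42,
     @Since  = '20060101'

So the procedure itself has a fixed signature; the "variable" part lives inside the two strings.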
I need to import an ASCII tab-delimited file that has roughly 5,000 records once a week into a SQL Server table. I have researched BCP and it seems like the way to go. Am I headed in the right direction? Thanks in advance, James
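BCP is a reasonable fit for this. A closely related option that runs entirely in T-SQL is BULK INSERT; a minimal sketch with a made-up table name and file path:

BULK INSERT dbo.WeeklyImport
FROM 'C:\Imports\weekly.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- tab-delimited fields
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 1       -- change to 2 if the file has a header row
)

Either approach handles 5,000 rows a week comfortably; BULK INSERT is convenient when the load should run as part of a scheduled T-SQL job.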
I am writing a package that will process delimited flat files that come in one of a few different versions. Within each flat file the number of delimited columns is the same, but each version of the file has a different number of columns. I have tried configuring the flat file data source to expect the version with the largest number of columns, but it then throws away rows that have fewer than this number of columns (warning: there is a partial row at the end of the file).
Is it possible to use a single flat file data source that will work with all of the different width files?
I'm new to SQL Server. I installed a copy of 6.5 on my server and set it up today. I received a db from a colleague and have been unable to find out what the column data types and lengths are. This may be very easy, but I need to know.
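If the goal is just to see each column's data type and length, sp_help should list them for a given table (it has been around since well before 6.5); replace the table name below with your own:

EXEC sp_help authors      -- columns, data types, lengths, nullability
EXEC sp_columns authors   -- similar information as a single result set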
Hi, a query is exceeding the length of a varchar and nvarchar variable, because I'm picking the data from each record in a table and appending it to the query. Please suggest a way to do this. Sample query:

SELECT P1.*, (P1.Q1 + P1.Q2 + P1.Q3 + P1.Q4) AS YearTotal
FROM (SELECT Year,
             SUM(CASE P.Quarter WHEN 1 THEN P.Amount ELSE 0 END) AS Q1,
             SUM(CASE P.Quarter WHEN 2 THEN P.Amount ELSE 0 END) AS Q2,
             SUM(CASE P.Quarter WHEN 3 THEN P.Amount ELSE 0 END) AS Q3,
             SUM(CASE P.Quarter WHEN 4 THEN P.Amount ELSE 0 END) AS Q4
      FROM Pivot1 AS P
      GROUP BY P.Year) AS P1
GO

Even the P.Quarter field name is being generated dynamically. My query is exceeding the varchar and nvarchar limit. Thanks in advance.
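If you are on SQL Server 2005 or later, declaring the variable as nvarchar(max) removes the 4,000/8,000-character cap. On SQL Server 2000, one common workaround is to build the statement in several variables and concatenate them inside EXEC(), which is not limited to a single variable's length. A rough sketch (the placeholder text stands in for the dynamically built middle of the statement):

DECLARE @Sql1 varchar(8000), @Sql2 varchar(8000)

SET @Sql1 = 'SELECT P1.*, (P1.Q1 + P1.Q2 + P1.Q3 + P1.Q4) AS YearTotal FROM (SELECT Year, '
SET @Sql2 = '<rest of the dynamically built statement> FROM Pivot1 AS P GROUP BY P.Year) AS P1'

EXEC (@Sql1 + @Sql2)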
I have a question. I have two different kinds of inputs: one has a length of 742 and the other has a length of 726. Is there a quick way to check the lengths using the Conditional Split transformation to send them down different paths?
Morning all,

Can I have some help with this one please? I am having to produce a fixed-length text file based on information from the DB.

Declare @EDIString varchar(MAX)
Declare @RecordType varchar(2)
Declare @RegistrationMark varchar(7)
Declare @Model_Chassis varchar(11)
Declare @LocationCode varchar(4)
Declare @MovementDate varchar(8)
Declare @IMSAccountCode varchar(5)
Declare @MovementType varchar(8)
Declare @NotUsed1 varchar(28)
Declare @NotUsed2 varchar(7)

Select @RecordType = RecordType,
       @RegistrationMark = RegistrationMark,
       @Model_Chassis = Model_And_Chassis,
       @LocationCode = LocationCode,
       @MovementDate = MovementDate,
       @IMSAccountCode = IMSAccountCode,
       @MovementType = MovementTypeCode
from Fiat_OutBound

Once I have selected the information from the DB I need to ensure that each field is the correct length. I therefore want to pass the variable and the required length into a function that returns the value padded to that length. So if @LocationCode = 'AB', which needs to be four characters long, I want to pass it into a function and get back 'AB  '. As I need to do this for 70+ variables, is there an easy way to obtain the declared length of each variable?

regards
Tom
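A minimal sketch of such a padding function (the function name is made up); it right-pads with spaces and truncates anything longer than the target width:

CREATE FUNCTION dbo.FixedWidth (@Value varchar(8000), @Width int)
RETURNS varchar(8000)
AS
BEGIN
    -- pad with spaces out to @Width, then cut back to exactly @Width characters
    RETURN LEFT(ISNULL(@Value, '') + SPACE(@Width), @Width)
END

With that in place, SELECT dbo.FixedWidth(@LocationCode, 4) returns 'AB  ' for a two-character code. As far as I know the declared length of a local variable is not directly exposed at runtime, so the target width still has to be passed in (or kept in a lookup table of field widths).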
I have a requirement to import a file of rows containing fixed length data. The problem is that each row can be one of 5 different formats (i.e. different columns) -- where the "type" of row is indicated by the first two characters of the row. Each row gets inserted into its own table.
Could I use a simple Conditional Split to route the rows? Or is the split only meant for routing similar rows? Anyway, problems are never this simple...
In addition, each "grouping" of rows is related. The "first" row is considered the "primary" row (and gets a row id via IDENTITY), whereas the remaining rows in the group are "secondary" rows and have foreign key references back to the primary row's id.
Given (using spaces to separate columns and CrLf to show "grouping"):
So, the first 3 lines are all related to a MSFT record which needs to be spread across multiple tables. The next three lines are all related to AAPL, And the next FOUR lines (yes, each record can have zero, one, or more secondary rows) are related to CSCO.
(If this is still not clear: all the "01" rows will be written to [Table1], with each row getting an IDENTITY value. All the "02" rows will be written to [Table2] with a FK pointing to the correct [Table1] row. All the "03" rows will be written to... and so on.)
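If the pure SSIS route proves awkward, one alternative (sketched here with made-up table names) is to land every raw line in a single-column staging table and do the routing in T-SQL, keying off the first two characters; the primary/secondary linking would still have to follow the file order:

-- '01' rows go to the primary table
INSERT INTO dbo.Table1 (RawLine)
SELECT RawLine
FROM dbo.StagingLines
WHERE LEFT(RawLine, 2) = '01'

-- '02' rows go to their own table (the FK back to Table1 is resolved afterwards)
INSERT INTO dbo.Table2 (RawLine)
SELECT RawLine
FROM dbo.StagingLines
WHERE LEFT(RawLine, 2) = '02'

-- ...and similarly for '03', '04', '05'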
I need some help with a stored procedure to insert multiple rows into a join table from a checkboxlist on a form. The database structure has 3 tables - Products, Files, and ProductFiles (join). From an asp.net formview users are able to upload files to the server. The formview has a products checkboxlist where the user selects all the products that the file they are uploading applies to. I parse the selected values of the checkboxlist into a comma-delimited list that is then passed with other parameters to the stored proc. If only one value is selected in the checkboxlist then the sproc executes correctly. Also, if I run SQL Profiler I can confirm that asp.net is passing the correct information to the sproc:

exec proc_Add_Product_Files @FileName = N'This is just a test.doc', @FileDescription = N'test', @FileSize = 24064, @LanguageID = NULL, @DocumentCategoryID = 1, @ComplianceID = NULL, @SubmittedBy = N'Kevin McPhail', @SubmittedDate = 'Jan 18 2006 12:00:00:000AM', @ProductID = N'10,11,8'

Here is the stored proc. It is based on an article posted in another newsgroup on handling lists in a stored proc. Obviously there was something in the article I did not understand correctly, or the author left something out that most people probably already know (I am fairly new to stored procs).

CREATE PROCEDURE proc_Add_Product_Files_v2
/*
Declare variables for the stored procedure. ProductID is a varchar because it will receive a
comma-delimited list of values from the webform and then insert a row into ProductFiles for
each product that the file being uploaded pertains to.
*/
@FileName varchar(150),
@FileDescription varchar(150),
@FileSize int,
@LanguageID int,
@DocumentCategoryID int,
@ComplianceID int,
@SubmittedBy varchar(50),
@SubmittedDate datetime,
@ProductID varchar(150)
AS
BEGIN
    DECLARE @FileID INT
    SET NOCOUNT ON

    /* Insert into the Files table and retrieve the primary key of the new record using @@identity */
    INSERT INTO Files (FileName, FileDescription, FileSize, LanguageID, DocumentCategoryID,
                       ComplianceID, SubmittedBy, SubmittedDate)
    VALUES (@FileName, @FileDescription, @FileSize, @LanguageID, @DocumentCategoryID,
            @ComplianceID, @SubmittedBy, @SubmittedDate)

    SELECT @FileID = @@Identity

    /* Uses dynamic SQL to insert the comma-delimited list of product ids into the ProductFiles table. */
    DECLARE @ProductFilesInsert varchar(2000)
    SET @ProductFilesInsert = 'INSERT INTO ProductFiles (FileID, ProductID) SELECT '
        + CONVERT(varchar, @FileID) + ', Product1ID FROM Products WHERE Product1ID IN ('
        + @ProductID + ')'
    exec(@ProductFilesInsert)
END
GO
I have a doubt and want to be sure my thinking is correct.
Let's consider 2 tables, one with fixed-length columns (char) and the other with variable-length columns (varchar).
The table with fixed-length columns will always allocate the same size within a page; however, the table with variable-length columns will allocate only the actual length of the data within a page.
I think that updates happening on the table with fixed-length columns will have more possibility of in-place updates, at least from the data-length perspective, whereas updates on the table with variable-length columns will produce more split updates from the data-length perspective.
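For concreteness, a throwaway pair of tables that shows the difference (names are made up):

CREATE TABLE FixedWidthDemo (Code char(20)    NOT NULL)   -- always stores 20 bytes per row
CREATE TABLE VarWidthDemo   (Code varchar(20) NOT NULL)   -- stores the actual length plus 2 bytes of overhead

INSERT INTO FixedWidthDemo VALUES ('AB')   -- padded to 20 characters on the page
INSERT INTO VarWidthDemo   VALUES ('AB')   -- only 2 characters of data on the page

-- Growing the varchar value forces the row to expand, which is what makes a
-- non-in-place update more likely; the char row never changes size
UPDATE VarWidthDemo   SET Code = 'ABCDEFGHIJKLMNOPQRST'
UPDATE FixedWidthDemo SET Code = 'ABCDEFGHIJKLMNOPQRST'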
Is there any way, or a tool, to identify whether a procedure's parameter length was declared smaller than the table column length it is compared against?
I have a table:

CREATE TABLE TEST001 (KeyName varchar(100))

and a procedure:

CREATE PROCEDURE SpFindNames (@KeyName varchar(40))
AS
BEGIN
    SELECT KeyName
    FROM TEST001
    WHERE KeyName = @KeyName
END

Here the table column "KeyName", declared as 100 characters, is compared with the SP parameter "@KeyName", declared as only 40 characters.
Is there any way to find all such usages across all the procedures in the database?
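There is no exact built-in check, since matching a parameter to the column it is compared against requires parsing the procedure text. As a rough first pass on SQL Server 2005 or later, the catalog views can flag parameters whose name matches a column name but whose declared length is shorter (this will produce false positives and will miss parameters that do not share the column's name):

SELECT OBJECT_NAME(p.object_id) AS procedure_name,
       p.name                   AS parameter_name,
       p.max_length             AS parameter_length,
       OBJECT_NAME(c.object_id) AS table_name,
       c.name                   AS column_name,
       c.max_length             AS column_length
FROM sys.parameters p
JOIN sys.columns    c ON c.name = SUBSTRING(p.name, 2, 128)   -- strip the leading '@'
WHERE OBJECTPROPERTY(p.object_id, 'IsProcedure') = 1
  AND p.max_length < c.max_length                              -- lengths are in bytes
ORDER BY procedure_name, parameter_name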
For those of you who would like to reference my exact issue, I'm dealing with the RSExecution SSIS package at the "Update Parameters" data flow task, at the Script Component.
The script tries to split the parameter data into name and value. Unfortunately, I have several reports that are passing parameters that are very large. One example has over 65,000 characters, all in the normal "&paramname=value&parm2=value..." format.
The code in the script works fine until it gets to one of these very large parameter sets. I have figured out what is causing the issue. Here's some code:
Dim paramBlob As Byte()
paramBlob = Row.BlobColumn.GetBlobData(0, Row.BlobColumn.Length)
The second parameter of the .GetBlobData function takes an INTEGER as its count! Therefore, no matter what kind of datatype I pass to the string that the script will later split, it will be limited to 32767 characters.
THIS IS A PROBLEM!!!
Does anyone know a workaround for this issue? I need all of the parameter data to be reported, and I would hate to have to skip over rows like this. Also, if I'm missing something, please fill me in!
I am trying to narrow down this problem. Basically, I added 3 columns to my article table. It holds the article id, article text, author and so on. I tested my program before adding the additional field to the program. The program works fine and I can add an article, and edit the same article, even though it skips over the 3 new fields in the database. It just puts nulls into those columns. So, now I have added one of the column names I added in the database to the code. I changed my businesslogic article.vb code and the addarticle.aspx, as well as the New article area in the addarticle.aspx.vb page. The form now has an additional textbox field for ShortDesc, which is a short description of the article.

This is the problem now: commandParameters.Length is 9 and there are 10 parameter values. Right in the middle of the 10 values is the #4 value which I inserted into the code. It says Nothing when I hover my mouse over the code after my program throws the exception at line 17 below. Why is commandParameters.Length set to 9 instead of 10? Why isn't it reading the information for value 4 like all the other values, placing its value there, and calculating 10 instead of 9? Where are these set in the program? It sounds to me like they are hard-coded somewhere and I need to change them to match everything else.

1   ' This method assigns an array of values to an array of SqlParameters.
2   ' Parameters:
3   ' -commandParameters - array of SqlParameters to be assigned values
4   ' -array of objects holding the values to be assigned
5   Private Overloads Shared Sub AssignParameterValues(ByVal commandParameters() As SqlParameter, ByVal parameterValues() As Object)
6
7       Dim i As Integer
8       Dim j As Integer
9
10      If (commandParameters Is Nothing) AndAlso (parameterValues Is Nothing) Then
11          ' Do nothing if we get no data
12          Return
13      End If
14
15      ' We must have the same number of values as we have parameters to put them in
16      If commandParameters.Length <> parameterValues.Length Then
17          Throw New ArgumentException("Parameter count does not match Parameter Value count.")
18      End If
19
20      ' Value array
21      j = commandParameters.Length - 1
22      For i = 0 To j
23          ' If the current array value derives from IDbDataParameter, then assign its Value property
24          If TypeOf parameterValues(i) Is IDbDataParameter Then
25              Dim paramInstance As IDbDataParameter = CType(parameterValues(i), IDbDataParameter)
26              If (paramInstance.Value Is Nothing) Then
27                  commandParameters(i).Value = DBNull.Value
28              Else
29                  commandParameters(i).Value = paramInstance.Value
30              End If
31          ElseIf (parameterValues(i) Is Nothing) Then
32              commandParameters(i).Value = DBNull.Value
33          Else
34              commandParameters(i).Value = parameterValues(i)
35          End If
36      Next
37  End Sub ' AssignParameterValues
I have the following situation: I have one table (tblA) in which a new record has just been inserted. Once this insert is completed successfully, I want to insert a variable number of records into another table (tblB). The primary key of tblA is being used inside tblB as one of the columns in each insert. I’ve already been able to transfer the primary key generated by the insert into tblA pretty easily. But to make things a bit more complicated, the number of records to add is decided by the outcome of a query that is based on an entry inside tblA (after the insert) and is then run against another table (tblC). The SELECT statement from tblC, combined with the select parameter from tblA, will then decide how many records I have to insert. Sorry for the (perhaps) confusing way of writing this down, but I’ve been struggling with this for a couple of days now and I really need to get it working. Anybody who can help? Thanks in advance, Sunny Guam
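A rough sketch of the pattern described (all table and column names below are guesses at the structure): capture the new key with SCOPE_IDENTITY(), then let an INSERT ... SELECT against tblC decide how many tblB rows are created:

DECLARE @NewAId int

INSERT INTO tblA (SomeColumn, FilterValue)
VALUES ('example', 'X')

SET @NewAId = SCOPE_IDENTITY()   -- primary key generated by the tblA insert

-- One tblB row per matching tblC row; the SELECT determines the row count
INSERT INTO tblB (AId, CValue)
SELECT @NewAId, c.SomeValue
FROM tblC AS c
JOIN tblA AS a ON a.Id = @NewAId
WHERE c.FilterColumn = a.FilterValue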
Hi All,
Sorry if the subject line is too obscure -- I couldn't think of a way of describing this request. I have a table that contains approximately 1 million records. I want to be able to select the top x records out of this table matching variable criteria.

Pseudo table records (custid, category, segment):
1,1,1
2,1,1
3,1,1
4,1,1
5,1,2
6,1,2
7,1,2
8,1,2
9,2,1
10,2,1
11,2,1
12,2,1
13,2,2
14,2,2
15,2,2
16,2,2
17,2,3
18,2,3
19,2,3
20,2,3

So, what I'm trying to do is return a recordset, for example, that contains the top 2 of each variation of category and segment, i.e.:
1,1,1
2,1,1
5,1,2
6,1,2
9,2,1
10,2,1
13,2,2
14,2,2
17,2,3
18,2,3

The only way I can think to achieve this is in a WHILE statement, performing individual selects against each combination, feeding the WHERE criteria by variables that I automatically increment. I can't help thinking there's a much more graceful way of achieving this? If anyone can give me any insight into this I'd be incredibly appreciative! Many thanks in advance!
Much warmth,
Murray
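One set-based way to do this on SQL Server 2005 or later, sketched against the pseudo table above (the table name is made up), is ROW_NUMBER() partitioned by category and segment:

SELECT custid, category, segment
FROM (
    SELECT custid, category, segment,
           ROW_NUMBER() OVER (PARTITION BY category, segment
                              ORDER BY custid) AS rn
    FROM dbo.CustomerSegments
) AS ranked
WHERE rn <= 2                      -- "top 2" per category/segment combination
ORDER BY category, segment, custid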
I am facing a strange problem executing a stored procedure. Basically my sproc takes values from a Java application and builds an INSERT statement; see the stored procedure below. Just to give some more background, I am rewriting a procedure that already exists in Oracle.
The problem I am facing now is with the statement below. When I execute the procedure the first time it works fine; however, from the second execution onwards the value is set to empty. I'm not sure what the problem is with my declaration and the way I set the values. For reference I have pasted my complete stored procedure code below.
I'm writing a query for the following: I need to collapse the continuity. If the termdate for an ID is one day less than the effdate of the next record (for the same ID), I need to collapse the records. See the example below. How should I write the query that gives the desired output, i.e. get min(effdate) and max(termdate) whenever a termdate is one day less than the effdate of the next record?
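One way to collapse such adjacent ranges, sketched with a made-up table name (Coverage) and assuming the rows for an ID never overlap: a row whose effdate is not preceded by a termdate one day earlier starts an island, and its end is the first termdate after it that no later row continues.

SELECT t1.ID,
       t1.effdate AS effdate,
       (SELECT MIN(t2.termdate)
        FROM Coverage t2
        WHERE t2.ID = t1.ID
          AND t2.termdate >= t1.effdate
          AND NOT EXISTS (SELECT 1 FROM Coverage t3          -- no row continues this termdate
                          WHERE t3.ID = t2.ID
                            AND t3.effdate = DATEADD(day, 1, t2.termdate))) AS termdate
FROM Coverage t1
WHERE NOT EXISTS (SELECT 1 FROM Coverage t0                  -- t1 is not itself a continuation
                  WHERE t0.ID = t1.ID
                    AND t0.termdate = DATEADD(day, -1, t1.effdate))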
PROCEDURE ListFilteredEvents
    @FilterList varchar(200)   -- contains '3,5'
AS
SELECT EventID
FROM Events
WHERE (any value in Categories) IN @FilterList

Result:

EventID
----------
2
How can I select all records where any value in the Categories column matches a value in @FilterList. In this example, record 2 would be selected since it belongs to category 3, which is also in @FilterList.
I’ve looked at the table of numbers approach, which works when selecting records where a column value is in the parameter list, but I can’t see how to make this work when the column itself also contains a comma delimited list.
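One way to make it work, sketched with a Numbers auxiliary table (sequential integers 1 through 8000) and assuming the Categories column holds a comma-delimited list with no spaces: split @FilterList into rows, then match each value against the delimiter-padded Categories column with LIKE.

SELECT DISTINCT e.EventID
FROM Events AS e
JOIN Numbers AS n
  ON  n.n <= LEN(',' + @FilterList + ',') - 1
  AND SUBSTRING(',' + @FilterList + ',', n.n, 1) = ','
WHERE ',' + e.Categories + ',' LIKE
      '%,' + SUBSTRING(',' + @FilterList + ',', n.n + 1,
                       CHARINDEX(',', ',' + @FilterList + ',', n.n + 1) - n.n - 1) + ',%'

DISTINCT keeps an event from appearing twice when more than one filter value matches its Categories list.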
I am trying to process an XML document that contains the attribute 'from_x'. However, an OPENXML query can't seem to find any column with a '_x' suffix. For example, if I were to execute the following fragment: