Normalizing Flat Row Of Data

Nov 7, 2005

I have a view with patient data. It looks like below

patid date pulmdc pulmstatus endodc endostatus
100 4/1/05 10 Good null null
100 5/1/05 10 Good 12 Poor

I want to create SQL which, for each patient and date, unpivots these four fields to get:

patid 4/1/05 pulmdc-10 pulmstatus-good
patid 5/1/05 pulmdc-10 pulmstatus-good
patid 5/1/05 endodc-12 endostatus-poor
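
One hedged way to get that shape on SQL Server 2005 is to unpivot each specialty pair with a UNION ALL query against the view; the view name PatientView below is an assumption, and the column names are taken from the sample:

-- Sketch: one output row per non-null (dc, status) pair.
SELECT patid, [date],
       'pulmdc-' + CAST(pulmdc AS varchar(10)) AS dc,
       'pulmstatus-' + pulmstatus AS status
FROM PatientView
WHERE pulmdc IS NOT NULL
UNION ALL
SELECT patid, [date],
       'endodc-' + CAST(endodc AS varchar(10)),
       'endostatus-' + endostatus
FROM PatientView
WHERE endodc IS NOT NULL
ORDER BY patid, [date];

The WHERE clauses drop the null endo columns from the 4/1/05 row, which matches the expected output above.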

View 1 Replies



Normalizing Flat Data / Extract One Of Each Type

Nov 6, 2007

I'm new to SSIS and have run into a problem I'm hoping someone can help me with.

Basically, I have a flat file that looks something like:

ID,Type,Description,Results
1,Test1,This is a test,5
2,Test1,This is also a 1 test,7
3,Test1,This is also a 1 test,13
4,Test2,This is a second test,14
5,Test2,This is also a second test,18


I'm trying to normalize the data by extracting out individual rows that have the same "Type" column value. So what I want is to extract each unique type and description into a separate table. This would give me two new rows, one for a type of Test1, and one for a type of Test2, with the descriptions. Does this make sense? Then I could relate the individual results to these test types. In my scenario, I don't care which description is used; I just want to take the first description that shows up with the associated "Type."

Does anyone have any idea of how I could go about doing this? I could pull out all unique "Types" from the rows with the Aggregate transformation, but I'm trying to figure out how to get the description that goes along with it.
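
If the flat file is staged into a SQL Server table first rather than handled entirely inside the data flow, one hedged option is to keep the first description per Type with ROW_NUMBER; the staging table name StagedTests below is an assumption:

-- Sketch: one row per Type, taking the description from the lowest ID.
WITH ranked AS (
    SELECT ID, [Type], Description,
           ROW_NUMBER() OVER (PARTITION BY [Type] ORDER BY ID) AS rn
    FROM StagedTests
)
SELECT [Type], Description
FROM ranked
WHERE rn = 1;

Inside SSIS itself, a Sort transformation that sorts on Type with "remove rows with duplicate sort values" enabled keeps one arbitrary row (and its description) per Type, which also satisfies the "don't care which description" requirement.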

Thanks,

Brian

View 1 Replies View Related

Normalizing The Data

Jun 9, 2006

Hi,

I have a table like this:

Col1 First_Year Last_Year
a 1990 1993

I want this data to be converted to
a 1990
a 1991
a 1992
a 1993

Is there any simple way to do it in SSIS without using Script Component?
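
If the expansion can be pushed into the source query instead of the data flow, a hedged T-SQL alternative is to join the range against a small number sequence; YearRanges and its columns below are assumed from the sample:

-- Sketch: one row per year between First_Year and Last_Year (inclusive).
SELECT r.Col1, r.First_Year + n.i AS [Year]
FROM YearRanges AS r
JOIN (SELECT TOP (1000)
             ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS i
      FROM sys.all_objects) AS n
  ON n.i <= r.Last_Year - r.First_Year
ORDER BY r.Col1, [Year];

The TOP (1000) just caps the sequence; any reliable row source (a numbers table, a tally CTE) works in its place.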

Thx

View 3 Replies View Related

Set Primary Key When Normalizing Data?

Apr 27, 2006

Greetings all,

I have created an SSIS package that takes data from a very large table (301 columns) and puts it in a new database in smaller tables. I am using views to control what data goes to the new tables. I also specified that it drop the destination table and recreate it prior to copying the data. The reason for this is so that old data removed from the larger database will get removed from the normalized databases.

I have two things I am trying to figure out:

1. I would like to have the package set a specific column in each new table as the primary key (this will allow us to use relationships when querying the data).

2. I decided I wanted to sort the data as it copies. I am using the BI Visual Studio for my editing. In the Data Flow view I cannot seem to disconnect the output from the Source block so I can connect it to the Sort block and then feed that to the output block. What am I missing here?
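
For item 1, one hedged approach is an Execute SQL Task placed after each table is recreated and loaded, adding the constraint explicitly; the table and column names below are placeholders:

-- Sketch: add a primary key to a freshly recreated destination table.
ALTER TABLE dbo.Patients
ADD CONSTRAINT PK_Patients PRIMARY KEY CLUSTERED (PatientID);

The chosen column has to be NOT NULL and unique for the constraint to succeed. For item 2, deleting the existing path (select the green arrow between the source and the destination and press Delete) frees the source output so it can be reconnected through the Sort transformation.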



Thanks

View 7 Replies View Related

SQL Server 2008 :: Normalizing Data Prior To Migration (Update String To Remove Special Characters)

Aug 21, 2015

I'm presented with a problem where I have a database table which must be migrated via a "custom tool", moving the data into a new table which has special character requirements that didn't exist in the source database. My data resides in a SQL Server 2008 R2 instance.

I envision a one-time query which will loop through selected records and replace the offending characters with --, however I'm having trouble understanding how this works.

There are roughly 2500 records which meet the criteria of "contains bad characters", frequently containing multiple separate bad chars, and the table contains roughly 100000 rows.

Special Characters are defined as #%&*:<>?/{}|~ and ..

While the field is called "Filename" it isn't always so, it is a parent/child table where foldernames are also stored.

Example data:
Tablename = Items
ItemID Filename ListID
1 Badfile<2015>.docx 15
2 Goodfile.docx 15
3 MoreBad#.docx 15
4 Dog&Cat#17.pdf 15
5 John's "Special" Folder 16

The examples I'm finding are all oriented around SELECT statements, changing the output of what is returned, but I'd rather just fix the entire column using an UPDATE. Initial testing using REPLACE fails because a value often contains more than one distinct bad character rather than a single one.

In a better solution, I found an example using a User Defined Function to modify the output of a select, but I cannot use that UDF in an UPDATE.

My alternative is to learn enough C# to modify the "migration tool" to do this in-transit, but I know even less about C# than I do of SQL.

I gather I want to use @@ROWCOUNT to loop through the rows but I really can't put it all together in a cohesive way.
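
One hedged, set-based way to do this without looping over rows is to loop over the character list instead and issue one UPDATE per bad character; the table and column names follow the example above, and the script assumes SQL Server 2008 or later syntax:

-- Sketch: replace each bad character in Items.Filename with '--'.
-- The loop walks the character list; each UPDATE itself is set-based.
DECLARE @bad varchar(20) = '#%&*:<>?/{}|~';
DECLARE @i int = 1, @c char(1);

WHILE @i <= LEN(@bad)
BEGIN
    SET @c = SUBSTRING(@bad, @i, 1);

    UPDATE dbo.Items
    SET [Filename] = REPLACE([Filename], @c, '--')
    WHERE CHARINDEX(@c, [Filename]) > 0;

    SET @i += 1;
END;

-- If the ".." case means a literal double dot, one more pass handles it:
UPDATE dbo.Items
SET [Filename] = REPLACE([Filename], '..', '--')
WHERE CHARINDEX('..', [Filename]) > 0;

CHARINDEX matches characters literally (no wildcard meaning for % or *), so no per-row REPLACE or @@ROWCOUNT-driven loop is needed.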

View 3 Replies View Related

Normalizing My Database

Nov 8, 2006

I have created some tables: Delivary, with the columns (DelivaryId, DelivaryNo, QtyRecieved, DelivaryDate, ProductId), and Product, with the columns (ProductId, ProductCode, ProductName, ProductPrice). As you can see, the Product table keeps a record of products while the Delivary table keeps a record of stock supplied. I would like to create another table (an Invoice table) that will keep a record of stock sold, based on the quantity received from the Delivary table.
Please help
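
A minimal sketch of what an Invoice table could look like, assuming sales are recorded per product and that Product.ProductId is the primary key of the Product table (names and data types here are assumptions, not an existing schema):

-- Sketch: a simple Invoice table referencing the existing Product table.
CREATE TABLE Invoice (
    InvoiceId   int IDENTITY(1,1) PRIMARY KEY,
    InvoiceNo   varchar(20) NOT NULL,
    ProductId   int         NOT NULL REFERENCES Product (ProductId),
    QtySold     int         NOT NULL,
    UnitPrice   money       NOT NULL,
    InvoiceDate datetime    NOT NULL DEFAULT GETDATE()
);

Stock on hand per product can then be derived by comparing SUM(QtyRecieved) from Delivary with SUM(QtySold) from Invoice.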

View 6 Replies View Related

I Need Help Normalizing This Table.

Aug 10, 2005

So I'm creating an administrative back end for a site that's already been created, and whoever made the tables the site uses didn't know much about database design. I need to normalize this table of Links so that it's easier for someone to make changes and updates to it, but then I need to put all my normalized tables back together to create a View exactly like the old table, which the old site can select from. Basically the stipulation is that I can't change the code for the old site, so I have to make it think it's still selecting from the same table with the same type of parameters. Is it worth doing all this? Or should I just tough it out with this really ugly table?

Here's the table: and here's the site that uses this table: http://waahp.byu.edu/links.asp

Thanks!

~Cattrah~
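
The compatibility-view idea is workable. As a hedged illustration only (the real Links columns aren't shown, so the table and column names below are invented), the old site keeps selecting from a view that reassembles the normalized tables into the original shape:

-- Sketch: expose the normalized tables under the old flat table's name.
-- dbo.Link and dbo.LinkCategory are hypothetical normalized tables.
CREATE VIEW dbo.Links
AS
SELECT l.LinkId,
       l.Title,
       l.Url,
       c.CategoryName
FROM dbo.Link AS l
JOIN dbo.LinkCategory AS c
  ON c.CategoryId = l.CategoryId;

As long as the view keeps the old table's name and column names, the existing site's queries keep working unchanged, which usually makes the normalization effort worth it.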

View 3 Replies View Related

Normalizing Using Sql Commands

Aug 28, 2006

Hi

Can someone please point me in the right direction? I built a very badly designed database consisting of only one huge table when I first started with databases. Since learning about normalization, I have designed and set up a new database which consists of many more tables instead of just the one. My question is: where do I start in transferring the data from the old single-table database to my new multi-table database?

I have MS SQL Server 2005 Management Studio if that helps, and I want to transfer around 200,000 rows of data into the new database. Both the new and old databases are on the same server.
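
Since both databases are on the same server, plain INSERT ... SELECT statements in Management Studio are usually enough: load the parent/lookup tables first, then the child tables, joining back to pick up the new keys. A hedged sketch with invented names:

-- Sketch: move data from the old single-table database into normalized tables.
-- OldDB.dbo.BigTable and the NewDB objects are placeholders.

-- 1. Parent/lookup table: the distinct values.
INSERT INTO NewDB.dbo.Customer (CustomerName, Phone)
SELECT DISTINCT CustomerName, Phone
FROM OldDB.dbo.BigTable;

-- 2. Child table: join back on the natural key to pick up the new surrogate key.
INSERT INTO NewDB.dbo.[Order] (CustomerId, OrderDate, Amount)
SELECT c.CustomerId, b.OrderDate, b.Amount
FROM OldDB.dbo.BigTable AS b
JOIN NewDB.dbo.Customer AS c
  ON c.CustomerName = b.CustomerName;

At around 200,000 rows this can be done in a single pass; wrapping each step in a transaction and spot-checking row counts afterwards is usually sufficient validation.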

thanks in advance

View 11 Replies View Related

Primary Key, Normalizing?

Feb 24, 2004

I am a beginner, so please bear with me. I get very confused about how to normalize my database.

Firstly: the employees in the company I work for are in various departments and can have more than one title and work in more than one department.

Example: John Smith can work in the engineering department as a detailer and an engineer and at the same time work as a project manager for the management department.

How do I setup this table structure?


Employees Table
Login (PK) | First | Last | Extension.......
---------------------------------------------
jsmith | John | Smith | 280

Department Title Breakdown
Department | Title
--------------------------
Engineering | Detailer
Engineering | Engineer
Management | ProjectManager

Job Description
Login | Title
-------------------------
jsmith | Engineer
jsmith | Detailer
jsmith | ProjectManager


This is important to break this down because for each project the following is saved:


Project Listing
Project | Detailer | Estimator | Sales | Engineer |....... | Location
10001 | jsmith | jdoe | mslick | sjunk | ...... | Las Vegas


Or should the project be broken down as well?

Project Listing
Project | Location
10001 | Las Vegas

Project Team
Project | Member | Activity
10001 | jsmith | Engineer
10001 | mstevens | Detailer


Any thoughts on how to normalize this?
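
A hedged sketch of the more normalized second layout (names and types are illustrative only):

-- Sketch: employees, titles, and project teams as separate tables.
CREATE TABLE Employee (
    Login     varchar(20) PRIMARY KEY,
    FirstName varchar(50),
    LastName  varchar(50),
    Extension varchar(10)
);

CREATE TABLE Title (
    Title      varchar(30) PRIMARY KEY,
    Department varchar(30) NOT NULL
);

-- Many-to-many: which titles each employee holds.
CREATE TABLE EmployeeTitle (
    Login varchar(20) REFERENCES Employee (Login),
    Title varchar(30) REFERENCES Title (Title),
    PRIMARY KEY (Login, Title)
);

CREATE TABLE Project (
    ProjectId int PRIMARY KEY,
    Location  varchar(50)
);

-- Who does what on each project.
CREATE TABLE ProjectTeam (
    ProjectId int         REFERENCES Project (ProjectId),
    Login     varchar(20) REFERENCES Employee (Login),
    Activity  varchar(30) REFERENCES Title (Title),
    PRIMARY KEY (ProjectId, Login, Activity)
);

This follows the second Project Listing idea: the wide project row with one column per role becomes rows in ProjectTeam, so adding a new role never requires a schema change.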

Mike B

View 6 Replies View Related

De-normalizing Query

Jul 13, 2006

I have this table...

CREATE TABLE #Test (ID char(1), Seq int, Ch char(1))
INSERT #Test SELECT 'A',1,'A'
INSERT #Test SELECT 'A',2,'B'
INSERT #Test SELECT 'A',3,'C'
INSERT #Test SELECT 'B',1,'D'
INSERT #Test SELECT 'B',2,'E'
INSERT #Test SELECT 'B',3,'F'
INSERT #Test SELECT 'B',4,'G'

...and am searching for this query...

SELECT ID, Pattern=...?? FROM #Test....??

...to give this result, where Pattern is the ordered concatenation of Ch for each ID:

ID Pattern
A ABC
B DEFG

Thanks for any help!

Jim
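
A hedged sketch of one way to get that result on SQL Server 2005 and later is correlated FOR XML PATH concatenation:

-- Sketch: ordered concatenation of Ch per ID.
SELECT t.ID,
       Pattern = (SELECT t2.Ch AS [text()]
                  FROM #Test AS t2
                  WHERE t2.ID = t.ID
                  ORDER BY t2.Seq
                  FOR XML PATH(''))
FROM #Test AS t
GROUP BY t.ID;

On SQL Server 2017 and later, STRING_AGG(Ch, '') WITHIN GROUP (ORDER BY Seq) with GROUP BY ID gives the same result more directly.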

View 2 Replies View Related

Normalizing A Crosstab

Aug 24, 2006

I re-designed a predecessor's database so that it is more properly normalized. Now, I must migrate the data from the legacy system into the new one. The problem is that one of the tables is a CROSSTAB TABLE. Yes, the actual table is laid out in a cross-tabular fashion. What is a good approach for moving that data into normalized tables?

This is the original table:

CREATE TABLE [dbo].[Sensitivities](
[Lab ID#] [int] NULL,
[Organism name] [nvarchar](60) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Source] [nvarchar](20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[BACITRACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CEPHALOTHIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CHLORAMPHENICOL] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CLINDAMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[ERYTHROMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[SULFISOXAZOLE] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[NEOMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[OXACILLIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[PENICILLIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TETRACYCLINE] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TOBRAMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[VANCOMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TRIMETHOPRIM] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CIPROFLOXACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[AMIKACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[AMPICILLIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CARBENICILLIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CEFTAZIDIME] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[GENTAMICIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[OFLOXACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[POLYMYXIN B] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[MOXIFLOXACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[GATIFLOXACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[SENSI NOTE] [nvarchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
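
One hedged way to fold the antibiotic columns into rows is UNPIVOT (SQL Server 2005 or later; on older versions a UNION ALL of one SELECT per column does the same job). Only a few columns are shown here, and the target table name is invented:

-- Sketch: turn each antibiotic column into an (Antibiotic, Result) row.
-- Extend the derived-table column list and the IN (...) list with the
-- remaining antibiotic columns.
INSERT INTO dbo.SensitivityResult ([Lab ID#], [Organism name], [Source], Antibiotic, Result)
SELECT [Lab ID#], [Organism name], [Source], Antibiotic, Result
FROM (SELECT [Lab ID#], [Organism name], [Source],
             [BACITRACIN], [CEPHALOTHIN], [CHLORAMPHENICOL], [CLINDAMYCIN]
      FROM dbo.Sensitivities) AS s
UNPIVOT (Result FOR Antibiotic IN
         ([BACITRACIN], [CEPHALOTHIN], [CHLORAMPHENICOL], [CLINDAMYCIN])) AS u;

UNPIVOT silently skips NULL cells, which is usually what is wanted when normalizing a sparse crosstab like this.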

View 5 Replies View Related

Normalizing Address Information...

Dec 27, 2003

THE LAYOUT:
I have two tables: "Applicant_T" and "StreetSuffix_T"

The "Applicant_T" table contains fields for the applicant's current address, previous address and employer address. Each address is broken up into parts (i.e., street number, street name, street suffix, etc.). For this discussion, I will focus on the street suffix. For each of the addresses, I have a street suffix field as follows:

[Applicant_T]
CurrSuffix
PrevSuffix
EmpSuffix

The "StreetSuffix_T" table contains the postal service approved street suffix names. There are two fields as follows:

[StreetSuffix_T]
SuffixID <-----this is the primary key
Name

For each of the addresses in the Applicant_T table, I input the SuffixID of the StreetSuffix_T table.


THE PROBLEM:
I have never created a view that would require the primary key of one table to be associated with multiple fields of another table (i.e., SuffixID-->CurrSuffix, SuffixID-->PrevSuffix, SuffixID-->EmpSuffix). I want to create a view of the Applicant_T table that will show the suffix name from the StreetSuffix_T table for each of the suffix fields in the Applicant_T table. How is this done?
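
This is just three separate joins to the same lookup table, one per suffix column, each under its own alias. A hedged sketch (only a few Applicant_T columns are shown, and ApplicantID is an assumed key column):

-- Sketch: join StreetSuffix_T once per suffix field via table aliases.
CREATE VIEW dbo.ApplicantAddress_V
AS
SELECT a.ApplicantID,
       cs.Name AS CurrSuffixName,
       ps.Name AS PrevSuffixName,
       es.Name AS EmpSuffixName
FROM Applicant_T AS a
LEFT JOIN StreetSuffix_T AS cs ON cs.SuffixID = a.CurrSuffix
LEFT JOIN StreetSuffix_T AS ps ON ps.SuffixID = a.PrevSuffix
LEFT JOIN StreetSuffix_T AS es ON es.SuffixID = a.EmpSuffix;

The LEFT JOINs keep applicants whose previous or employer address happens to be empty.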

View 6 Replies View Related

Need Help Normalizing Multivalue Column

Jan 23, 2006

Hi!


I have a table with the following columns:

account_nr, account_totaling_members, account_type



the account_totaling_members column contains a pipe separated list of accounts in a varchar: "1001|1002|1003"

I need to normalize this so that I get records like:

"10", "1001", "sum"
"10", "1002", "sum"
"10", "1003", "sum"

...and so forth



Does anyone have any idea how to accomplish this?
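
A hedged sketch, assuming SQL Server 2005 or later and an invented table name of accounts: a recursive CTE can peel one pipe-separated member off the front of the list at a time:

-- Sketch: split account_totaling_members ('1001|1002|1003') into rows.
WITH split AS (
    SELECT account_nr,
           CAST(LEFT(list, CHARINDEX('|', list + '|') - 1) AS varchar(50)) AS member,
           CAST(STUFF(list, 1, CHARINDEX('|', list + '|'), '') AS varchar(8000)) AS rest
    FROM (SELECT account_nr, account_totaling_members AS list FROM accounts) AS a
    WHERE list <> ''
    UNION ALL
    SELECT account_nr,
           CAST(LEFT(rest, CHARINDEX('|', rest + '|') - 1) AS varchar(50)),
           CAST(STUFF(rest, 1, CHARINDEX('|', rest + '|'), '') AS varchar(8000))
    FROM split
    WHERE rest <> ''
)
SELECT account_nr, member
FROM split
OPTION (MAXRECURSION 0);

Adding account_type to the select lists carries the third column ("sum" in the example) through to the output.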

View 3 Replies View Related

How To Populate Foreign Key In Normalizing Import?

Jun 21, 2006

I am copying data from one denormalized table to a COUPLE of normalized ones.
I am using a Multicast, following advice from the forum.

The problem I have is that the two destination tables (A and B) share a foreign key relationship. Filling in A is no problem, but when I want to fill in B, I don't know how to populate its foreign key, since the multicast doesn't know the corresponding primary key in table A.
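
One hedged pattern is to load table A first, then resolve B's foreign key by joining back to A on a business (natural) key that both sides carry from the source. Expressed in T-SQL with invented names, the second load looks like this:

-- Sketch: populate B's foreign key by looking up the surrogate key
-- just generated in A, matched on a natural key from the staged source.
INSERT INTO dbo.B (A_Id, DetailColumn)
SELECT a.A_Id, s.DetailColumn
FROM dbo.StagedSource AS s
JOIN dbo.A AS a
  ON a.NaturalKey = s.NaturalKey;

Inside SSIS, a Lookup transformation in a second data flow (run after A has been loaded) plays the same role: it fetches A's key on the shared natural key before the rows reach B's destination.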

View 9 Replies View Related

Integration Services :: Best Way To Value Data Column In Data Pump From A Flat File

Aug 28, 2015

I have to populate [CreateDate] in the data pump from my Flat File Source into my OLE DB Destination SQL Server table. Should I do this with a variable within the SSIS package, or with a Derived Column transformation within the data flow between the Flat File Source and the OLE DB Destination?

View 2 Replies View Related

Exported Flat File Data Will Not Import To Same Table Without Extensive Data-type Manipulation

Jul 13, 2007

I'm moving data between identical tables and have to use a flat file as an intermediary. I thought: "No problem, SSIS can do a quick export to a file, then move the file to another server, then use SSIS to import the data to the new server."



Seems simple, right?



I'm hitting all sorts of surprising data conversion errors. I used the export wizard to create the export package. This works fine. However using the same flat file definition, the import package fails -- even when I have no destination. That is I have just one data flow task that contains only one control: the Flat File source. When I run the package the flat file definition fails with data type conversion and truncation errors. One of the obvious errors is for boolean types. The SQL field is a bit, SSIS defined the column as DT_BOOL, the output of the data are literal text values "TRUE" and "FALSE". So SSIS converts a sql datatype of bit to "TRUE" and "FALSE" on export, but can't make the reverse conversion on import?



Does anyone else find this surprising? I would expect that what SSIS exports, it can import given all the same table and flat file definitions. Is SSIS the wrong tool to do such simple bulk copies? I'd like to avoid using BCP because this process will need to run automatically within SQL Agent so we can leverage all the error tracking and system monitoring.



View 12 Replies View Related

SQL Server 2012 :: Normalizing A Column Containing Lists

Aug 20, 2015

CREATE TABLE CATEGORIES(CATEGORYID VARCHAR(10), CATEGORYLIST VARCHAR(200))

INSERT INTO CATEGORIES(CATEGORYID, CATEGORYLIST) VALUES('1000', 'S01:S03, S09:S20, S22:S24')
INSERT INTO CATEGORIES(CATEGORYID, CATEGORYLIST) VALUES('1001', 'S11:S12')
INSERT INTO CATEGORIES(CATEGORYID, CATEGORYLIST) VALUES('1002', 'S30:S32, S34:S35, S60')
INSERT INTO CATEGORIES(CATEGORYID, CATEGORYLIST) VALUES('1003', 'S40')

The CATEGORYLIST strings are composed of value ranges separated by a colon (:) and multiple value ranges separated by a comma.

The results I need are:

CATEGORYID STARTRANGE ENDRANGE
1000 S01 S03
1000 S09 S20
1000 S22 S24
1001 S11 S12
1002 S30 S32
1002 S34 S35
1002 S60 S60
1003 S40 S40

I have tried taking the original data and parsing it out as an XML file. Is there a less cumbersome way to do this in TSQL?
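
A hedged, set-based sketch that avoids building an XML file by hand: split on the comma using the XML nodes() trick, then split each piece on the colon with string functions (this assumes the list values never contain < or &):

-- Sketch: expand CATEGORYLIST into (CATEGORYID, STARTRANGE, ENDRANGE) rows.
-- Single values such as 'S60' become a range with identical start and end.
SELECT c.CATEGORYID,
       LTRIM(RTRIM(CASE WHEN CHARINDEX(':', x.piece) = 0 THEN x.piece
                        ELSE LEFT(x.piece, CHARINDEX(':', x.piece) - 1) END)) AS STARTRANGE,
       LTRIM(RTRIM(CASE WHEN CHARINDEX(':', x.piece) = 0 THEN x.piece
                        ELSE STUFF(x.piece, 1, CHARINDEX(':', x.piece), '') END)) AS ENDRANGE
FROM CATEGORIES AS c
CROSS APPLY (
    SELECT n.value('.', 'varchar(50)') AS piece
    FROM (SELECT CAST('<i>' + REPLACE(c.CATEGORYLIST, ',', '</i><i>') + '</i>' AS xml) AS doc) AS d
    CROSS APPLY d.doc.nodes('/i') AS t(n)
) AS x;

Run against the sample INSERTs above, this should produce the eight rows listed, including S60 and S40 as single-value ranges.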

View 1 Replies View Related

DB Design :: Normalizing Personal Contact Information

May 22, 2015

I have a large data set with 10s of millions of rows of contact information.  The data is in CSV format and contains 48 columns of information (First name, MI, last name, 4 part address, 10+ demographic points, etc.) and I'm struggling with how I should design the database and normalize this data, or if I should normalize this data.

My 2 thoughts for design were either:

1. Break the columns into logical categorical tables (i.e. BasicContactInfo, Demographics, Financials, Interests, etc.)
2. Keep the entire row in one table, and pull out the "Objects" into another table (i.e. ContactInformation, States, ZIPCodes, EmployementStatus, EthnicityCodes, etc.)

The data will be immutable for the most part, and when I get new data, I'll just create a new database and replace the old one.

The reason I like option 1 is because it makes importing easier, since I can just insert the appropriate columns from each row into the appropriate tables.  Option number 2 feels like it would be faster to get metrics on the data, like how many contacts live in which states, or what is the total number of unique occupations in the data set.  Plus I'll be able to make relationships between the tables, like which state is tied to which zipcode, which city is tied with which county, etc.  Importing that data might be more tricky, since I don't think SQL Bulk Copy will allow for inserting into normalized tables like that.

The primary use for this data is to allow our sales force to create custom lists of contact information based on a faceted search page.  The sales person would create the filter, and then I will provide them with the resulting data so they can start making business contacts.  Search performance needs to be good.  Insert, update, and deletes won't happen once the data has been imported.

What should I look for in designing this database?  Any good articles on designing tables around wide data sets like my contact information? 

View 6 Replies View Related

T-SQL (SS2K8) :: Load Data From Flat File Source Into OleDB Destination By Changing Data Types In SSIS

Apr 16, 2014

I have a source file and I have to load it into the database, changing the data types of the columns in SSIS.

View 1 Replies View Related

Normalizing Comma Separated String To Multiple Records

Oct 17, 2012

I need to normalise comma separated strings of tags (SQL Server 2008 R2).

E.g. (1, 'abc, DEF, xyzrpt') should become
(1, 'abc')
(1, 'DEF')
(1, 'xyzrpt')

I have written a procedure in T-SQL that can handle this. But it is slow and it would be better if the solution was available as a view, even a slow view would be better.

Most solutions I found go the other way round: from (1, 'abc'), (1, 'DEF') and (1, 'xyzrpt'), generate (1, 'abc, DEF, xyzrpt').

If memory serves, it used "FOR XML PATH". But it's been a while and I may be totally wrong.
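
A hedged sketch of a view-friendly, set-based split for SQL Server 2008 R2; the table and column names are invented, and it assumes the tag strings never contain < or &:

-- Sketch: one row per comma-separated tag.
-- dbo.TagSource(Id, Tags) is a placeholder for the table holding the strings.
CREATE VIEW dbo.TagList
AS
SELECT s.Id,
       LTRIM(RTRIM(n.value('.', 'varchar(100)'))) AS Tag
FROM dbo.TagSource AS s
CROSS APPLY (SELECT CAST('<t>' + REPLACE(s.Tags, ',', '</t><t>') + '</t>' AS xml) AS doc) AS d
CROSS APPLY d.doc.nodes('/t') AS x(n);

Because it is a view it can be joined like any other table, and it is usually much faster than a procedural splitter. The FOR XML PATH technique the poster recalls is indeed the reverse operation (rows back into one comma-separated string).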

View 2 Replies View Related

SQL Server 2012 :: Formatting XML Output - Avoid Normalizing Structure On Client

May 28, 2015

I have a script that resolves data into XML like this, for example:

<root>
<title>A</title>
<id>1</id>
<nodes>
<node>
<id>2</id>
<title>A.1</title>
</node>
</nodes>
</root>

And it works perfectly, but... how do I make sure every item has a "nodes" element? The case here is the child leaves, obviously. This is because on the client I have to inject this "nodes" element into a JSON version of this XML, and I just wanted to avoid normalizing the structure on the client.

For the root I am using

FOR XML PATH('root'),TYPE; and for the hierarchy that follows
FOR XML RAW ('node'), root('nodes'), ELEMENTS

View 0 Replies View Related

Exporting SQL Data To A Flat File

Mar 19, 2002

SQL 6.5
NT 4.3

Can someone assist me with the following....

1. I am attempting to export data from a SQL DB (single table using a query) to a "flat file".
2. I would then like to take this "flat file" and import the data into a different SQL DB (same schema structure as first DB).

Unfortunately this has to be done in two steps.


Thank you for your help.

RPowid

View 3 Replies View Related

Archiving Data To Flat File?

May 24, 2002

How do I put data into a text or Excel file before I attempt a deletion from a large table? I know how to select the necessary data, but I'm not sure about the T-SQL required to put it into a file.

Are there any better methods of archiving?
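
One common approach, hedged because it needs xp_cmdshell enabled and write access to the target path, is to shell out to bcp with queryout before running the delete; server, database, table, and path names below are placeholders:

-- Sketch: export the rows to a text file, then delete them.
EXEC master..xp_cmdshell
    'bcp "SELECT * FROM MyDb.dbo.BigTable WHERE CreatedDate < ''20020101''" queryout C:\archive\BigTable_2001.txt -c -T -S MYSERVER';

DELETE FROM MyDb.dbo.BigTable
WHERE CreatedDate < '20020101';

Archiving into a separate table first (SELECT ... INTO ArchiveDb.dbo.BigTable_2001) is another option, and keeps the data queryable instead of flat.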

thanks

View 3 Replies View Related

Append Data To A Flat File

Nov 20, 2007

Hello,

I was wondering if there was a way for me to append data to a flat file. The reason why I ask is that I need to create a header for the report that I am exporting.

The way I imagined this working would be to create a DTS package that exports the header information to a flat file, and then create another DTS package that exports the report data and appends it to the same file the header package created. This might not be the correct approach, so I was hoping I could get some guidance on how I can accomplish this.

I am using SQL Server 2000.

Thank you!

View 4 Replies View Related

Get Header Data Of Flat File

Jul 4, 2007

hi everyone!

I am currently creating a package which involves getting data from CSV files. I can successfully get the data from the files; my problem is that I need to get data from the header of the CSV files, which I am currently skipping. The format of the CSV files is as follows:

-----------------------------------------------------------------------------------
Date, 20070704
Store Code, storeCode1

data row.....
data row.....
data row.....
-----------------------------------------------------------------------------------

Technically, I also need the date from the header row, but since it is also indicated in the data rows, I have no problem with that. What I need is the Store Code, which is not indicated in the data rows. I need to store the data in a database in the following format:

-----------------------------------------------------------------------------------
StoreCode Date column1 column2 column3 ......
storeCode1 20070704 ...
storeCode2 20070704 ...
storeCode3 20070704 ...

-----------------------------------------------------------------------------------


Any idea how SSIS can handle this? Thanks a lot!

View 4 Replies View Related

Exporting Data To Flat File

Aug 17, 2006

I'm using an SSIS package to export some data to a comma-delimited CSV file. The problem is that some of the fields have commas in them. Is there a way to deal with this other than changing the delimiter?

View 2 Replies View Related

Exporting Data To A Flat File

Aug 29, 2006

The "flat file" destination is missing from the choices when attempting to export data.

View 1 Replies View Related

Flat File With Nested Data

Nov 1, 2006

I am looking to import data into SQL Server 2005 using SSIS. I want to take data that is contained in a flat file and place it into the various appropriate tables in my system. The flat file contains nested data. For example...

Bob,Smith,555-5555,123~3.33|245~1.99,Active

So I want to build a package that brings in the records as follows

Client Table: First Name, Last Name, Phone, and Status (Bob, Smith, 555-5555, Active)

Order Table: OrderID, Amount (ID 123 @ $3.33 and another row ID 245 @ $1.99). If possible I would also like to tie the orders to the client record that was inserted.

My first question is whether SSIS supports nested fields as in my example. Can it break a file by commas, then break a field by other delimiters? If so, how do I do this, and if not, what is the recommended way to accomplish this sort of task?

My second question is, if it can do that, can it tie the Client and Order data together on the fly?
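
The Flat File source applies only one set of column delimiters, so one hedged alternative is to land each row in a staging table with the whole order string kept in a single column, then split it in T-SQL and tie it back to the client by a natural key. The staging names below are invented, and this assumes SQL Server 2005's CROSS APPLY and xml support:

-- Sketch: split the staged Orders string ('123~3.33|245~1.99') into rows.
-- dbo.StagedClient(FirstName, LastName, Phone, Orders, Status) is a
-- hypothetical table loaded as-is from the flat file.
SELECT s.Phone,                                        -- key back to the client row
       LEFT(o.pair, CHARINDEX('~', o.pair) - 1)     AS OrderID,
       STUFF(o.pair, 1, CHARINDEX('~', o.pair), '') AS Amount
FROM dbo.StagedClient AS s
CROSS APPLY (
    SELECT n.value('.', 'varchar(50)') AS pair
    FROM (SELECT CAST('<o>' + REPLACE(s.Orders, '|', '</o><o>') + '</o>' AS xml) AS doc) AS d
    CROSS APPLY d.doc.nodes('/o') AS x(n)
) AS o;

A Script Component in the data flow can do the same splitting row by row if staging is not an option, and it can also carry the client key alongside each order so the two tables stay tied together.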



Thanks.

View 11 Replies View Related

Importing Data From A Flat File

Mar 11, 2007

I have a flat file data source - call it "order". It's a text file that looks something like this:

ORDERNAME| Example1

CUSTOMER|Acme Industries
COST|11611
ITEMS
B1|550S162-43(33)|35.708|1|636
T1|550S162-43(33)|20.967|1|636
T2|550S162-43(33)|20.967|1|636
W1|350S162-43(33)|1.330|2|501
W21|350S162-43(33)|1.330|1|911
W2|350S162-43(33)|3.044|2|501
W20|350S162-43(33)|3.044|1|911

I would like to write the metadata to an [order header] table and the ITEMS to an [order detail] table. Can someone direct me to an example of something similar?

View 11 Replies View Related

Flat File Data Flow

Apr 17, 2007

Any suggestions on dealing with a flat file in the format below? I only want to process the data rows in the middle of the file and ignore all other rows. This was a very simple task in DTS with a small amount of VBScript in the transformation, but it doesn't seem as straightforward in SSIS. Thanks.



......... file example ......

start-of-file

header1

header2

...

start-of-data

col0|col1|col2|col3|....

col0|col1|col2|col3|....

col0|col1|col2|col3|....

end-of-data

end-of-file

View 3 Replies View Related

Flat File Data Source

Aug 29, 2006

Is there a way to use a wildcard in the file name for the flat file data source?

Like //servername/directory/*.txt

View 5 Replies View Related

Copy Data From A DB To A Flat File Using Bcp

Nov 1, 2007



I'm trying to export some data from a database table to a flat file. But I get an error message when I run the BCP command on the command line.
My bcp command is as follows.

C:\Documents and Settings\shamen>bcp databasename.tablename out -c-f flatfile.txt -T MSSQLSERVER

When I run this, I get "an error has occurred when connecting to the server. This failure may be caused by the fact that by default SQL Server does not allow remote connections."


Then I changed it as follows.

C:\Documents and Settings\shamen>bcp databasename.tablename out flatfile.txt -T-c
Then I get "Unknown argument 'BCPfile.txt' on command line".

Can anyone tell me how to fix this?

Thanks

View 9 Replies View Related

Bringing Data Into A Flat File

Apr 18, 2007

I am trying to bring in the result of a query to a flat file. The source and destination connections are OK, and the mapping is also correct. However, I get the following error during execution:



[Flat File Destination [507]] Error: No column was specified to allow the component to advance through the file.

followed by this:

[DTS.Pipeline] Error: component "Flat File Destination" (507) failed the pre-execute phase and returned error code 0xC02020F0.



What is the fix?

View 7 Replies View Related






