T-SQL (SS2K8) :: Create Separate MS Excel Files By Looping Through Large Table

Jun 24, 2014

I have a master table containing details of over 800000 surveys made up of approximately 400 distinct document names and versions. Each document can have as few as 10 questions but as many as 150. Each question represents one row.

My challenge is to create a separate spreadsheet for each of the 400 distinct document names and versions containing all the rows and columns present in the master table. The largest number of rows would be around 150 and therefore each spreadsheet will not be very big.

E.g. in my sample data below, I will need to create individual Excel files named as follows...
"Document1Version1.xlsx" containing all the column names and 6 rows for the 6 questions relating to Document 1 version 1
"Document1Version2.xlsx" containing all the column names and 8 rows for the 8 questions relating to Document 1 version 2
"Document2Version1.xlsx" containing all the column names and 4 rows for the 4 questions relating to Document 2 version 1

I assume that one of the first steps is to create a lookup of the distinct document names and versions, assign some variables, and then use this lookup to loop through and sequentially filter the master table data ready for creating the individual Excel files.

--CREATE TEMP TABLE FOR EXAMPLE

IF OBJECT_ID('tempdb..#excelTest') IS NOT NULL DROP TABLE #excelTest
CREATE TABLE #excelTest (
[rowID] [nvarchar](10) NULL,
[docName] [nvarchar](50) NULL,

[Code] .....

--Output

rowID  docName    docVersion  question  blankField
1      document1  1           q1        NULL
2      document1  1           q2        NULL
3      document1  1           q3        NULL
4      document1  1           q4        NULL
5      document1  1           q5        NULL
6      document1  1           q6        NULL

[Code] .....
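A minimal T-SQL sketch of that loop, assuming the sample data sits in a permanent table and xp_cmdshell is enabled. Note that bcp writes delimited text rather than a true .xlsx workbook, and the server, database and export path below are placeholders:

-- Sketch: loop over the distinct document name/version pairs and export each
-- filtered set to its own file. File names/paths and server name are illustrative.
DECLARE @docName nvarchar(50),
        @docVer  nvarchar(10),
        @cmd     varchar(4000);

DECLARE doc_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT DISTINCT docName, docVersion
    FROM dbo.excelTest;                -- assumes the sample data lives in a permanent table

OPEN doc_cursor;
FETCH NEXT FROM doc_cursor INTO @docName, @docVer;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- One file per document/version, e.g. "Document1Version1.csv"
    SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.excelTest WHERE docName = ''' + @docName
             + ''' AND docVersion = ''' + @docVer + '''" queryout "C:\Export\'
             + @docName + 'Version' + @docVer + '.csv" -c -t, -T -S MyServer';

    EXEC master..xp_cmdshell @cmd;     -- requires xp_cmdshell to be enabled

    FETCH NEXT FROM doc_cursor INTO @docName, @docVer;
END

CLOSE doc_cursor;
DEALLOCATE doc_cursor;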

View 9 Replies



SSIS Create Large Temp Files!!!

Oct 22, 2007



Hello,

I created a SSIS solution for reading data from dbase and storing them in SQL Server. In a ForEachDirectory-Loop up to one thousand dbase files are read and stored. The system where the packages are running has 16 GB RAM.
For the first few hundred dbase files everything goes fine, but then the RAM no longer seems to be sufficient and a temp file is created (I changed the path in BufferTempStoragePath).

How can it be that there is a need to create temp files if there is so much RAM available?
Why is the RAM filled more and more during the SSIS package execution?
Is there anything I can do to release some of it? (it is running in a loop and there is no need to store all the data)
Could it be caused by dbase?? (I use Microsoft Jet 4.0 OLE DB Provider)

Another thing is that the temp file is not stored in the path I set in BufferTempStoragePath.
There are sufficient permissions set, but the temp file is still created in the user's temp folder...

Any kind of help is very much appreciated!

Best Regards,
Stefan

View 5 Replies View Related

SQL 2012 :: Create Script That Will Import Large XML Files?

Jul 28, 2014

I need to create script that will import large XML files (500 - 7GB) on a daily basis and store the data in a relational db structure.

What is the best and fastest way of importing such files? I have played around with smaller files and found the following.

1. SSIS XML Data Source: It doesn't seem to like the complex elements types and throws out the file.
2. Using Bulk File Import, storing the file in an XML variable and using XQuery to parse the file: This works, but it can't take a file more than 2GB in size, so I can't use this method.
3. C# + XML Serialization: This also works, but seems to be terribly slow. I open the DB connection once, so it doesn't open and close for each db call, but still seems like it takes a long time.

How can I import large XML files quickly into a relational table structure?
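For files that fit under the 2 GB limit of the xml data type, the bulk-load + XQuery pattern looks roughly like the sketch below (element names, target columns and the path are invented). For the multi-gigabyte files, a streaming approach such as SQLXML Bulk Load or a .NET XmlReader that inserts in batches is usually needed instead:

-- Sketch: load the whole file as a blob, cast to xml, shred with nodes()/value()
DECLARE @x xml;

SELECT @x = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\Import\orders.xml', SINGLE_BLOB) AS src;

INSERT INTO dbo.OrderStaging (OrderID, CustomerName, OrderDate)
SELECT  n.value('(OrderID/text())[1]',      'int'),
        n.value('(CustomerName/text())[1]', 'nvarchar(100)'),
        n.value('(OrderDate/text())[1]',    'datetime')
FROM @x.nodes('/Orders/Order') AS t(n);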

View 9 Replies View Related

T-SQL (SS2K8) :: Create Batch File To Selectively Run Files

Apr 24, 2014

I have about 1200 sql files in one of my folders. Almost all of these files do data inserts and updates, so they should be run only once. As and when required, I have manually run around 150 of them already. Whenever I run any of these scripts, I log that file name into a log table in my SQL Server, including the execution time. Since running the 1000+ remaining files takes a lot of time, I want to automate running these files through a batch file. But I also want to filter out the files that have already been run. My file list looks as follows.

InsertToOrderTypes.sql
UpdateClientAddress.sql
DeleteDuplicateOrders.sql
InsertToEmailAddress.sql
ConsolidateBrokerData.sql
UpdateInventory.sql
EliminateInvalidOfficeLocations.sql

My log table in the database looks like this.

select * from sqlfileexecutionlog
FileName                    RunTime                   Result
--------------------------  ------------------------  -------
DeleteDuplicateOrders.sql   03/12/2014 14:23:45:091   Success
UpdateInventory.sql         04/06/2014 08:44:17:176   Success

Now I want to create a batch file to run the remaining files from my directory against my SQL Server. I also want to wrap each of these sql file executions in a transaction and log success/failure along with the runtime and filename into the sqlfileexecutionlog table. As I add new sql files into this directory, I should be able to run the same batch file and execute only the sql files that have not been run.
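One way to drive the batch file is to let T-SQL produce the to-run list. A sketch the batch could call via sqlcmd (the script folder path is a placeholder and xp_cmdshell must be enabled); each returned file can then be run with sqlcmd inside a transaction and its outcome inserted back into sqlfileexecutionlog:

-- Sketch: compare the folder contents with the execution log
IF OBJECT_ID('tempdb..#DirListing') IS NOT NULL DROP TABLE #DirListing;
CREATE TABLE #DirListing (FileName nvarchar(260));

-- Load the folder contents (requires xp_cmdshell to be enabled)
INSERT INTO #DirListing (FileName)
EXEC master..xp_cmdshell 'dir /b "C:\SqlScripts\*.sql"';

SELECT d.FileName
FROM #DirListing AS d
LEFT JOIN dbo.sqlfileexecutionlog AS l
       ON l.FileName = d.FileName
WHERE d.FileName IS NOT NULL          -- xp_cmdshell returns a trailing NULL row
  AND l.FileName IS NULL;             -- keep only scripts with no log entry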

View 9 Replies View Related

T-SQL (SS2K8) :: Getting Minimum And Maximum Values In A Large Table

May 23, 2014

Table definition:

Create table code (
    id int identity(1,1),          -- data types below are assumed; the original omits them
    code nvarchar(50),
    parentcode nvarchar(50),
    internalreference int)

There are other columns but I have omitted them for clarity.

The clustered index is on the ID.

There are indexes on the code, parentcode and internalreference columns.

The problem is that the table stores a parentcode with an internalreference and around 2000 codes which are children of the parentcode. I realise the table is very badly designed, but the company loves ORMs!

Example:
ID| Code| ParentCode| InternalReference|
1 | M111| NULL | 1|
2 | AAA | M111 | 2|
3 | .... | .... | ....|
4 | AAB | M111 | 2000|
5 | M222 | NULL | 2001|
6 | ZZZ | M222 | 2002|
7 | .... | .... | .... |
8 | ZZA | M222 | 4000|

The table currently holds around 300 millions rows.

The application runs the following two queries to find the first internalreference of a code and the last internalreference of a code:

-- Find first internalreference
SELECT TOP 1 ID, InternalReference
FROM code
WHERE ParentCode = 'M222'
Order By InternalReference

-- Find last internalreference
SELECT TOP 1 ID, InternalReference
FROM code
WHERE ParentCode = 'M222'
Order By InternalReference DESC

These queries are running for a very long time, only because of the sort. If I run the query without the sort, then they return the results instantly, but obviously this doesn't find the first and last internalreference for a parentCode.

I realize the best way to fix this is to redesign the table, but I cannot do that at this time.

Is there a better way to do this so that two queries which individually run very slowly, can be combined into one that is more efficient?
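A minimal T-SQL sketch of one way to combine the two lookups, assuming an index can be added; the index name is invented, and the join-back is only needed if the matching IDs are required:

-- Covering index so both extremes become index seeks (one-off cost on 300 million rows)
CREATE NONCLUSTERED INDEX IX_code_ParentCode_InternalReference
    ON dbo.code (ParentCode, InternalReference) INCLUDE (ID);

-- Both ends in a single statement
SELECT MIN(InternalReference) AS FirstInternalReference,
       MAX(InternalReference) AS LastInternalReference
FROM dbo.code
WHERE ParentCode = 'M222';

-- If the matching IDs are also needed, join back on the two extremes
;WITH extremes AS (
    SELECT MIN(InternalReference) AS MinRef, MAX(InternalReference) AS MaxRef
    FROM dbo.code
    WHERE ParentCode = 'M222'
)
SELECT c.ID, c.InternalReference
FROM dbo.code AS c
JOIN extremes AS e
  ON c.InternalReference IN (e.MinRef, e.MaxRef)
WHERE c.ParentCode = 'M222';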

View 7 Replies View Related

Create, Insert And Format Into 1,000 Excel Files

Feb 24, 2008

Hello,

I need to create about 1,000 (literately) Excel files that each contain 5 tabs. The data being placed on the tabs will always be the same (meaning the columns are static).
I am fairly advanced at Excel VBA so I can write code that does all the following in Excel (looped 1,000 times):


Open an Excel template

Bring data in from the tables

Filter, then copy-paste the appropriate rows into each tab.

Save the new Excel file.

Email the file to appropriate individual (it is a Microsoft Exchange Server).
As I started this in VBA, I thought that I might be able to do it with SSIS. My concern is that I need to have the rows formatted (font, border, etc.) and the number of rows changes.

My questions are:
Is it possible to format Excel with SSIS?
Can I email the files even if it is not with an SMTP protocol?
Would SSIS process this data faster than Excel?
Does this approach even make sense? Am I better just doing it with VBA?


Thank you for the help.

-Gumbatman

View 4 Replies View Related

SQL Server Admin 2014 :: Separate Data Files / Log Files / TempDB / Backups

Jan 9, 2015

I proposed on a new server that we separate Data Files, Log Files, tempDB, Backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid state SAN technology this would decrease performance, and that it did not work the same way as it did when you had RAID 5s, etc. I thought that if things were carried out correctly by a SAN Administrator, they would know how to configure for optimal performance.

View 2 Replies View Related

How Do Create Table From Excel (based On Excel Column Name) And Import Data From It?

Jun 14, 2006

I've the following situation:

I've some Excel files controlled by the vendor which change frequently. The only thing that does not change is the header name of each column.

So my question is, is there any way to create a new table based on the selected Excel file, including the column names, in SSIS? So that I can use the data reader as a source to select those columns I am interested in and start the integration.


Thanks.

Regards,
Yong Boon, Lim


P.S.: The Excel header is at row 7.

View 3 Replies View Related

CREATE INDEX On Large Table

Jul 23, 2005

SQL Server 7/2000: We have reasonably large tables (3,000,000 rows) that we need to add some indexes for. In a test, it took over 12 hours to CREATE a new INDEX against this table. One of us suggested that we create a temp table with the new index and copy the data from the old table into the new one, then rename it. I understand this took 15 minutes. Why the heck would it be faster to move the data and build multiple indexes incrementally vs adding an index?

View 11 Replies View Related

SQL 2012 :: Create Clustered Index On A Very Large Table (500 GB)

May 7, 2014

I need to create a Clustered Index (CI) on a very large SQL Server 2012 database table. This table has approximately 10 billion rows and is 500 GB in size. The job ran for about 20 hours and then failed with the error: "Out of disk space in tempdb". My tempdb size is 1.8 TB, but that is still not enough.

Here is my script:

CREATE CLUSTERED INDEX CI_IndexName
ON TableName(Column1,Column2)
WITH (MAXDOP= 4, ONLINE=ON, SORT_IN_TEMPDB = ON, DATA_COMPRESSION=PAGE)
ON sh_WeekDT(Day_DT)
GO

View 9 Replies View Related

T-SQL (SS2K8) :: Procedure That Create Views With Table Name And A Table Field Parameter?

Aug 4, 2015

I would like to create a procedure which creates views by taking as parameters the table name and a field value (@Dist).

However, I still receive the "must declare the scalar variable "@Dist"" error message, although I use sp_executesql for executing the parameterized query.

Below code.

ALTER Procedure [dbo].[sp_ViewCreate]
/* Input Parameters */
@TableName Varchar(20),
@Dist Varchar(20)
AS
Declare @SQLQuery AS NVarchar(4000)
Declare @ParamDefinition AS NVarchar(2000)

[code]....
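One thing to note is that a view definition cannot depend on a variable, so binding @Dist through @ParamDefinition will not help inside the CREATE VIEW text. A hedged sketch in which the value is embedded as a quoted literal instead; the view name pattern and the filtered column name are assumptions:

ALTER PROCEDURE dbo.sp_ViewCreate
    /* Input Parameters */
    @TableName varchar(20),
    @Dist      varchar(20)
AS
BEGIN
    DECLARE @SQLQuery nvarchar(4000);

    SET @SQLQuery =
          N'CREATE VIEW dbo.vw_' + @TableName + N'_' + @Dist + N' AS '
        + N'SELECT * FROM dbo.' + QUOTENAME(@TableName)
        + N' WHERE Dist = ' + QUOTENAME(@Dist, '''');   -- value embedded as a quoted literal; column name assumed

    EXEC sys.sp_executesql @SQLQuery;
END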

View 9 Replies View Related

T-SQL (SS2K8) :: Import Values From Excel Into Table?

Nov 18, 2014

I've already created a table and I want to insert more than five hundred rows into it. The values are stored in Excel files. My doubt is: is it possible to insert values from an Excel sheet? My current database is MS SQL 2000; if it is possible, how do I insert the values using a query?

ex:

create table test
(
item_code int,
item_name varchar(50),
bill_qty float,
bil_date datetime
)

The Excel file also has the same column names.
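A hedged sketch of the usual approach on SQL Server 2000, using the Jet provider through OPENROWSET; the file path and sheet name are placeholders, and ad hoc distributed queries must be allowed on the server:

INSERT INTO dbo.test (item_code, item_name, bill_qty, bil_date)
SELECT item_code, item_name, bill_qty, bil_date
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Import\items.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');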

View 4 Replies View Related

SQL 2012 :: Generating CREATE TABLE Scripts For Large Number Of Tables

Feb 11, 2014

Other than right-clicking on each individual table in SSMS and generating a CREATE script, is there a simple way to generate CREATE TABLE scripts for tables within a given database?

Background: I have a bunch of tables in one database, and I would like to add tables to a second database that have the same names and basic structures of some of the tables from the first database.

I do not need to transfer any data from the tables; this is a separate project that will use a similar data structure. I just want to generate the CREATE TABLE scripts for 30-ish tables within the first database, and then I'll tweak the scripts as appropriate and run them against the new database.

[URL] ....
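The SSMS Generate Scripts wizard (right-click the database > Tasks > Generate Scripts) handles this in bulk. As a rough T-SQL alternative, a sketch that builds bare-bones CREATE TABLE statements from the catalog views; only string lengths are handled, and precision/scale, identity, keys and constraints are not:

SELECT
    'CREATE TABLE ' + QUOTENAME(t.TABLE_SCHEMA) + '.' + QUOTENAME(t.TABLE_NAME) + ' (' + CHAR(10) +
    STUFF((
        SELECT ',' + CHAR(10) + '    ' + QUOTENAME(c.COLUMN_NAME) + ' ' + c.DATA_TYPE +
               CASE WHEN c.DATA_TYPE IN ('varchar','nvarchar','char','nchar')
                    THEN '(' + CASE WHEN c.CHARACTER_MAXIMUM_LENGTH = -1 THEN 'max'
                                    ELSE CAST(c.CHARACTER_MAXIMUM_LENGTH AS varchar(10)) END + ')'
                    ELSE '' END +
               CASE WHEN c.IS_NULLABLE = 'NO' THEN ' NOT NULL' ELSE ' NULL' END
        FROM INFORMATION_SCHEMA.COLUMNS AS c
        WHERE c.TABLE_SCHEMA = t.TABLE_SCHEMA AND c.TABLE_NAME = t.TABLE_NAME
        ORDER BY c.ORDINAL_POSITION
        FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 1, '') + CHAR(10) + ');' AS CreateScript
FROM INFORMATION_SCHEMA.TABLES AS t
WHERE t.TABLE_TYPE = 'BASE TABLE';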

View 7 Replies View Related

Large Amount Of Text Pushing To Separate Page After PDF Render

May 25, 2006

I am having a small problem.

When I generate a report with a large amount of text data (Paragraphs) the report will not split the text between two pages.

It will move the entire text box to the next page leaving a large amount of space on the previous page.

I tried every control on the page to render this. I tried textboxes, tables, lists, rectangles, subreports, etc..

The data is stored in a SQL table using a text data type (there can be a large amount of data entered into this database).

Any help on this would be great. I like reporting services, but there are just a few bugs.

I am using Reporting Services 2000 with Service Pack 2 and Hotfix

http://support.microsoft.com/?kbid=912424 (that fixed a different PDF rendering issue), and this problem was occurring prior to this hotfix.

Thanks in advance

View 5 Replies View Related

T-SQL (SS2K8) :: Import To Table From Varying Tab Delimited Text Files

Feb 10, 2014

I need to import data to a MS SQL table from massive (read: a million and a half rows, every single day) logs that come in .txt format, separated by tabs and a ";" symbol, and then have some stored procedures analyze that data to generate some reports in an Excel file with that info. The text files include the column headers in the first row and the data starts on the second one.

The challenge is that the text files differ in column order and count every single day.

The analysis that I need to do only needs about 15 columns from the nearly 90-120 that those files include, and those columns sadly happen to be in a different order in those files.

View 8 Replies View Related

Looping Through Files

Jan 19, 2008

HI:

I am trying to loop through files in a directory. The tricky part here is that the last part of the filename has seconds, and I never know what that is. I need to use some sort of wildcard I think here. Basically I have a directory with these files:

Export20080112_01_00_08.csv
Export20080113_01_00_06.csv
Export20080114_01_00_03.csv

Once you have got the YYYYMMDD part you have a unique file, so the rest I don't care about. If I could just open the files the way you use a wildcard when you do Windows file searches I could store these in a table:

Export20080112*.csv
Export20080213*.csv
Export20080114*.csv


then use the For Each Container in SSIS for ADO and loop through the filenames. The other option is to use the For Each File but that has to loop through all the files.

Is there a good way to do this using a wildcard?

Thanks,
Kayda

View 3 Replies View Related

T-SQL (SS2K8) :: Create Table Dynamically?

Sep 5, 2014

I have an SP which returns two result sets. The columns which come from the result sets are also dynamic,
i.e. sometimes 5 columns and sometimes 10 columns.

Now I want to load this output into 2 different tables on daily basis. This would be truncate/delete table and load again.

Now my problem is that, as I am not sure about the columns, is it possible to create a physical table based on the output of the SP, and afterwards load data into it?

During each load we can drop the table, no issue, and we can handle this through an SSIS package.
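One commonly cited workaround is a loopback OPENROWSET call, so SELECT ... INTO creates the table with whatever columns the SP happens to return on that run. This is only a sketch: server, SP and table names are placeholders, it captures just the first result set, and ad hoc distributed queries must be enabled:

IF OBJECT_ID('dbo.DailyLoad_ResultSet1', 'U') IS NOT NULL
    DROP TABLE dbo.DailyLoad_ResultSet1;

SELECT *
INTO dbo.DailyLoad_ResultSet1
FROM OPENROWSET('SQLNCLI',
                'Server=MyServer;Trusted_Connection=yes;',
                'SET FMTONLY OFF; EXEC MyDb.dbo.usp_DailyExtract;') AS rs;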

View 2 Replies View Related

Looping Over Files Not Available In My Ssis

May 22, 2006

I'm downloading zipped files and would like to loop through each file that was downloaded. I'd also like to unzip each file and append all of them to one file. I have a DOS batch file that is fairly simple and would like to emulate it using SSIS. Here is what the DOS batch file looks like.



DATE /T >%TEMP%\D.txt

FOR /F "usebackq tokens=2,3,4 delims=/, " %%i IN ("%TEMP%\D.txt") DO SET fname=TAMF_162%%i%%j%%k-%%k.zip

ECHO xxx>zzzzz
ECHO xxxxx>>zzzzz
ECHO BINARY>>zzzzz
ECHO GET %fname%>>zzzzzz
ECHO QUIT>>zzzzz

FTP -s:zzzzzzz ftp.aaaaa.com

PKUNZIP -o -xxxxxxx downloadedfile_1~1.ZIP

DEL TAMF_1~1.ZIP
DEL zzzzzzzz

EXIT

View 11 Replies View Related

Looping Thru Files By Dates

Mar 31, 2008

How can I loop through files by date using a For Each Loop container? For example, if I have files with these names:
filex_20080405103050
filex_20080405103055
filex_20080405103520
then the file filex_20080405103050 should be picked up and loaded first.

View 4 Replies View Related

T-SQL (SS2K8) :: Create A Table Valued Function?

Oct 20, 2014

I would like to create a table valued function using the following data:

create table #WeightedAVG
(
Segment varchar(20),
orders decimal,
calls int
);
insert into #WeightedAVG

[code].....

I would like to create a function from this where I can input columns and two numbers to get an average output in a table, i.e.,

CREATE FUNCTION WeightedAVG(@divisor int, @dividend int, @table varchar, @columns varchar)
returns @Result table
(
col1 varchar(25),
WeightedAVG float

[Code] .....
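Table and column names cannot be parameters inside a function (no dynamic SQL is allowed there), so the sketch below is an inline table-valued function over a fixed, permanent copy of the table above; for truly dynamic table/column names a stored procedure building dynamic SQL would be needed instead:

-- Sketch: assumes a permanent dbo.WeightedAVG table with the same columns as #WeightedAVG
CREATE FUNCTION dbo.fn_WeightedAVG ()
RETURNS TABLE
AS
RETURN
(
    SELECT  Segment AS col1,
            -- each segment's ratio weighted by its share of total calls
            CAST(SUM(orders) / NULLIF(SUM(calls), 0) AS float) AS WeightedAVG
    FROM dbo.WeightedAVG
    GROUP BY Segment
);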

View 4 Replies View Related

T-SQL (SS2K8) :: Create Dynamics View Which Contain Data Of All Table

Apr 16, 2014

I have view something like

Create view All_employee
AS
SELECT Emp_Name, Emp_code FROM dbo.Employee
UNION ALL
SELECT Emp_Name, Emp_code FROM Emp_201402.Employee

But we have a different schema for the same table because we have archive tables with the same table name but different schema names. Now we have a requirement to make a view which contains the data of all the tables, but I can't seem to figure out how to do it in a view.

SET NOCOUNT ON
DECLARE @Count INT, @TotalCount INT, @SQL VARCHAR( MAX )
DECLARE @Schema TABLE ( ID INT, NAME VARCHAR(512) )
INSERT INTO @Schema
SELECT ROW_NUMBER() OVER (ORDER BY SCHEMA_ID), Name FROM sys.schemas where name like '%emloyee%' ORDER BY schema_id ASC

[Code] ....

Don't think that works.

Is this possible with a view, or is there another way to do it?
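A hedged sketch that builds the UNION ALL body from the catalog views and recreates the view with dynamic SQL; the table-name filter and view name are assumptions based on the example above:

DECLARE @SQL nvarchar(max);

SELECT @SQL = STUFF((
    SELECT ' UNION ALL SELECT Emp_Name, Emp_code FROM '
           + QUOTENAME(s.name) + '.Employee'
    FROM sys.schemas AS s
    JOIN sys.tables  AS t
      ON t.schema_id = s.schema_id
     AND t.name = 'Employee'
    ORDER BY s.schema_id
    FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 11, '');

SET @SQL = N'CREATE VIEW dbo.All_employee AS ' + @SQL;

IF OBJECT_ID('dbo.All_employee', 'V') IS NOT NULL
    DROP VIEW dbo.All_employee;

EXEC sys.sp_executesql @SQL;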

View 7 Replies View Related

Looping Through An Excel Spreadsheet

Feb 23, 2006

Being new to SSIS, I wish to loop through a series of Excel spreadsheets and, within each workbook, loop through each sheet. I am aware of the For Each container, but how can each sheet in the workbook be referenced?

Steve

View 42 Replies View Related

T-SQL (SS2K8) :: How To Cut Certain Values In A String To Separate Columns

Jun 5, 2014

I have a column containing values for different languages. I want to cut out the values per language into a separate column.

The syntax is a 2-letter country code followed by a colon; the value is contained in double quotes. Each language is separated by a ";" (except for the last one).

Example for English, Dutch and Swedish: US:"Project/Prescription sale";NL:"Project/specificatie";SW:"Objektsförsäljning"

The result would be:
column header US
with value Project/Prescription sale

next column header NL
with value Project/specificatie etc.

Here are table examples:

IF OBJECT_ID('[#SALETYPE]','U') IS NOT NULL
DROP TABLE [#SALETYPE]

CREATE TABLE [#SALETYPE](
[SaleType_Id] [int] NOT NULL,
[name] [nvarchar](239) NOT NULL,

[Code] ....
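A small sketch of pulling one language's quoted value out of the packed string with CHARINDEX/SUBSTRING; repeating the expression per country code (US, NL, SW, ...) yields one column per language. The name column of #SALETYPE is assumed to hold the packed string:

DECLARE @lang char(2) = 'NL';

SELECT  SaleType_Id,
        SUBSTRING(name,
                  CHARINDEX(@lang + ':"', name) + 4,
                  CHARINDEX('"', name, CHARINDEX(@lang + ':"', name) + 4)
                    - (CHARINDEX(@lang + ':"', name) + 4)) AS LangValue
FROM #SALETYPE
WHERE CHARINDEX(@lang + ':"', name) > 0;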

View 9 Replies View Related

Looping Through Data And Outputting Text Files

Mar 20, 2008

I have the following code here...

Code Snippet

SET @SQL = 'Select * FROM IdentipassNew.dbo.CBORD_Interface_Final'
SET @BCPBody = 'bcp "' + @SQL + '" queryout "d:smartcardcbordudfcbordbody.txt" -T -fc:cpbody.fmt'

Problem is, there are over 85,000 records in that set and that is too big for the text file, so I was wondering if it would be possible to select, say, 30,000 records, output those to a text file, then select the next 30,000 and create another file, then finally get the remaining records and put those in another text file. Can someone point me in the right direction as to how to accomplish this?


Thanks in advance.
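A sketch of one way to do it, assuming xp_cmdshell is enabled: number the rows once into a work table, then emit one bcp command per 30,000-row slice (paths, database and server names are placeholders):

DECLARE @rows int, @batch int, @i int, @cmd varchar(4000);

IF OBJECT_ID('dbo.CBORD_Export_Work', 'U') IS NOT NULL
    DROP TABLE dbo.CBORD_Export_Work;

SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS rn, src.*
INTO dbo.CBORD_Export_Work
FROM IdentipassNew.dbo.CBORD_Interface_Final AS src;

SELECT @rows = COUNT(*) FROM dbo.CBORD_Export_Work;
SET @batch = 30000;
SET @i = 0;

WHILE @i * @batch < @rows
BEGIN
    SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.CBORD_Export_Work WHERE rn > '
             + CAST(@i * @batch AS varchar(12)) + ' AND rn <= '
             + CAST((@i + 1) * @batch AS varchar(12))
             + '" queryout "d:\export\cbordbody_' + CAST(@i + 1 AS varchar(4))
             + '.txt" -T -c -S MyServer';
    EXEC master..xp_cmdshell @cmd;
    SET @i = @i + 1;
END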

View 3 Replies View Related

T-SQL (SS2K8) :: Identify Columns Which Will Create Unique Record In A Table

Sep 15, 2014

I am looking to create a script that will go through a table and pick out the necessary columns to create a unique record. Some of the tables that I am working with have 200-plus columns, and I am not sure if I would have to list every column name in the script or if they could be dynamically referenced. I am working with a SQL Server that has next to no documentation, and every time I try to merge some tables, I get too many rows back.
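A small sketch of the uniqueness test itself; table and column names are placeholders, and the column list can be built dynamically from sys.columns if listing them by hand is impractical:

-- If this returns no rows, the listed columns uniquely identify every record
SELECT Col1, Col2, Col3, COUNT(*) AS DuplicateCount
FROM dbo.SomeWideTable
GROUP BY Col1, Col2, Col3
HAVING COUNT(*) > 1;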

View 4 Replies View Related

T-SQL (SS2K8) :: Create UDF To Query A Table On A (random) Remote Server

Jan 30, 2015

I'm trying to write a function that will retrieve the last backup date/time of a particular database on a remote server (i.e. by querying msdb.dbo.backupset). Unfortunately, you can't use sp_executesql in a function, so I can't figure out a way to pass the server name to the query and still be able to return the datetime value back to the calling T-SQL code (and that rules out using EXEC()).
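A common workaround is a stored procedure with an OUTPUT parameter instead of a function, with the server name injected as a linked-server prefix. A sketch, assuming the target server is already configured as a linked server:

CREATE PROCEDURE dbo.usp_GetLastBackupDate
    @ServerName   sysname,
    @DatabaseName sysname,
    @LastBackup   datetime OUTPUT
AS
BEGIN
    DECLARE @sql nvarchar(max);

    -- Build the four-part name dynamically, then bind the rest as real parameters
    SET @sql = N'SELECT @LastBackup = MAX(backup_finish_date) '
             + N'FROM ' + QUOTENAME(@ServerName) + N'.msdb.dbo.backupset '
             + N'WHERE database_name = @DatabaseName';

    EXEC sys.sp_executesql @sql,
         N'@DatabaseName sysname, @LastBackup datetime OUTPUT',
         @DatabaseName = @DatabaseName,
         @LastBackup   = @LastBackup OUTPUT;
END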

View 3 Replies View Related

T-SQL (SS2K8) :: Create Numerous Directories On Server Using Data From Table

Apr 15, 2015

SQL 2008

I have a table that has company id, attachment file name, folderexists columns.

First, what I need to do is create a series of folders or directories on a networked server, using the company id as the folder name, where the folder does not already exist.

Second I need to move files based on attachment file name and company id to the proper folder.

For those who want to know, this is a remediation project because of a bug in our application.

The application is supposed to create the folder based on company id and then put the attachment in that folder.
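A sketch of the folder-creation half using xp_cmdshell; the UNC path and table/column names are placeholders based on the description, and moving the attachments is the same pattern with a move command instead of mkdir:

DECLARE @companyId varchar(50), @cmd varchar(4000);

DECLARE folder_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT DISTINCT CompanyId
    FROM dbo.CompanyAttachments
    WHERE FolderExists = 0;

OPEN folder_cursor;
FETCH NEXT FROM folder_cursor INTO @companyId;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'if not exist "\\FileServer\Attachments\' + @companyId
             + '" mkdir "\\FileServer\Attachments\' + @companyId + '"';
    EXEC master..xp_cmdshell @cmd;       -- requires xp_cmdshell to be enabled

    UPDATE dbo.CompanyAttachments
    SET FolderExists = 1
    WHERE CompanyId = @companyId;

    FETCH NEXT FROM folder_cursor INTO @companyId;
END

CLOSE folder_cursor;
DEALLOCATE folder_cursor;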

View 9 Replies View Related

Excel 2010 - Import Multiple Files To Table?

Sep 23, 2014

I have over 600 Excel .xlsx files that I have been trying to import to a SQL database table. I've been trying to complete this task with SSIS but no luck yet. I have seen several videos and read articles, but when I run the package the source is validated, yet I always get an error in the destination. I am using Excel 2010 and SQL Server 2012.

View 3 Replies View Related

T-SQL (SS2K8) :: OR Query - Search For Items And Separate Each Word

Oct 17, 2014

I have this query currently:

select updatedb.callref, updatedb.updatetxt, updatedb.udsource, opencall.suppgroup
from updatedb
left join opencall
on updatedb.callref=opencall.callref

where udindex = '0'
and suppgroup = 'SUPPORT'
and (updatetxt like '%' + @Word + '%')

And opencall.status <> '17'

This means that when they search for items and separate each word, the condition between each one is "and".

They would like it to be more fuzzy with "and" and "or". How can I adapt this?
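One way is to split @Word on spaces and build an OR'd set of LIKE predicates with dynamic SQL. A sketch below; the sample search string is invented, and the splitting uses the XML trick since SQL 2008 has no STRING_SPLIT:

DECLARE @Word nvarchar(200) = N'printer error toner';   -- would come from the existing parameter
DECLARE @pred nvarchar(max), @sql nvarchar(max);

-- Turn "word1 word2 word3" into: updatetxt LIKE '%word1%' OR ... OR updatetxt LIKE '%word3%'
SELECT @pred = STUFF((
    SELECT ' OR updatedb.updatetxt LIKE ''%' + REPLACE(w.value('.', 'nvarchar(100)'), '''', '''''') + '%'''
    FROM (SELECT CAST('<w>' + REPLACE(@Word, ' ', '</w><w>') + '</w>' AS xml) AS x) AS src
    CROSS APPLY src.x.nodes('/w') AS t(w)
    FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 4, '');

SET @sql = N'SELECT updatedb.callref, updatedb.updatetxt, updatedb.udsource, opencall.suppgroup
FROM updatedb
LEFT JOIN opencall ON updatedb.callref = opencall.callref
WHERE udindex = ''0''
  AND suppgroup = ''SUPPORT''
  AND (' + @pred + N')
  AND opencall.status <> ''17''';

EXEC sys.sp_executesql @sql;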

View 8 Replies View Related

How To Move The Bad Data Into Another Directory While Looping Through A Set Of FLAT FILES ?

Sep 1, 2007

Currently I am looping through a set of flat files like CHK0604, CHK0611, CHK0618, and CHK0625 from the source folder C:\SOURCE.

OBJECTIVE: within the flat file, if any records/rows cause an error I have to move the bad data into a separate folder C:\ERROR.

STEPS TAKEN

1) In the FOREACH LOOP component I specified the variable User::sourceFilePath for my source files CHK0604 etc. in location C:\SOURCE. The loop walks through each file in C:\SOURCE and, if there is no error, moves the flat file into another folder C:\ARCHIVED. This task is working perfectly.

2) Within the data flow I am diverting the bad rows from the "conditional" component into a "Flat File Destination" component.

3) In the "Flat File Destination" connection manager I set the expression as @[User::sourceFilePath] + "_Error.TXT".

ISSUE

Because of point (3) the error file is created in the SOURCE flat file location C:\SOURCE.

QUESTION

1) My error file names should be CHK0604_Error, CHK0611_Error, CHK0618_Error, CHK0625_Error, created in another folder C:\ERROR.

2) How do I move the bad data into another directory while looping through a set of flat files?

3) If I have to create another variable like @[User::ErrorFilePath], where do I create it? How do I use the source file title as the title of the error file?

Thanks for the help

View 4 Replies View Related

Looping Through Several Excel Data Sources In SSIS

May 10, 2006

I am attempting to use the foreach loop structure in an SSIS package to loop through however many Excel files are placed in a directory and then perform an import operation into a SQL table on each of these files sequentially. The closest model for this that I was able to find in the MS tutorial used a flat file source rather than Excel. That involved adding a new expression to the Connection Manager that set the connection string to the current filename, as provided by the foreach component. That works just fine, but when I attempt to apply the same method to an Excel source, rather than a flat file source, I cannot get it to work.

I see the following error associated with the Excel source on the Data Flow page: "Validation error. Data Flow Task: Excel Source [1]: The AcquireConnection method call to the connection manager "Excel Connection Manager 1" failed with error code 0xC020200." I think that it's just a matter of getting the right expression, and I thought that perhaps I should be constructing an expression for ExcelFilePath rather than the Connection String, but I have fiddled with it for hours and haven't come up with something that will be accepted. Has anybody out there been able to do this, or can perhaps refer me to some documentation that contains an example of what I am trying to do? Thanks for any help you can give.

View 1 Replies View Related

SQL 2012 :: Import Excel XLSX Files Into Temp Table

Feb 18, 2014

I am having trouble trying to import XLSX files into SQL 2012 64-bit.

I have installed the Access driver (AccessDatabaseEngine_x64.exe)

I have configured the script to run the following sp_configure commands:

sp_configure 'show advanced options', 1
GO
RECONFIGURE WITH OverRide
GO
sp_configure 'Ad Hoc Distributed Queries', 1

[Code] ....

So I first create my temp table.

Then I run the commands above, and then I run the insert into the temp table defined:

INSERT INTO tempdb.dbo.TempTRBZ (IsNew,CoID, Zip, City, County,StateCode,Rate,Taxable,TaxShip,TaxLab,CountryID,StateID)

SELECT * FROM OPENROWSET( 'Microsoft.ACE.OLEDB.12.0','EXCEL 12.0;Database=C:TempNotInTrbzJan.xlsx;HDR=YES','SELECT * FROM [Data$]')

[Code] ....

The error message I get back is

Msg 7303, Level 16, State 1, Line 4
Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".

What have I set wrong on the import? Using SSIS at this point is not a real option.
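Two commonly suggested settings for the ACE provider when error 7303 comes up are shown below (sp_MSset_oledb_prop is an undocumented helper; run as sysadmin). A 32/64-bit mismatch between SQL Server and the installed Access Database Engine produces the same error, so that is worth checking too:

EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess',    1;
EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'DynamicParameters', 1;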

View 0 Replies View Related

Transact SQL :: Importing Bulk Excel Files Into A Table In 2008?

Nov 2, 2015

I have around 100 Excel files in a folder. I want to import all the files dynamically and load all the data into a single table in SQL Server 2008. Without using SSIS, I want to query using OPENROWSET.
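A sketch of one way to do it without SSIS, assuming xp_cmdshell and 'Ad Hoc Distributed Queries' are enabled; the folder, sheet name and target table are placeholders:

IF OBJECT_ID('tempdb..#files') IS NOT NULL DROP TABLE #files;
CREATE TABLE #files (FileName nvarchar(260));

-- List the workbooks in the folder
INSERT INTO #files (FileName)
EXEC master..xp_cmdshell 'dir /b "C:\Import\*.xlsx"';

DECLARE @file nvarchar(260), @sql nvarchar(max);

DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT FileName FROM #files WHERE FileName LIKE '%.xlsx';

OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @file;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'INSERT INTO dbo.AllWorkbookData
                 SELECT * FROM OPENROWSET(''Microsoft.ACE.OLEDB.12.0'',
                        ''Excel 12.0;Database=C:\Import\' + @file + N';HDR=YES'',
                        ''SELECT * FROM [Sheet1$]'')';
    EXEC sys.sp_executesql @sql;

    FETCH NEXT FROM file_cursor INTO @file;
END

CLOSE file_cursor;
DEALLOCATE file_cursor;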

View 11 Replies View Related






