SSIS Creates Large Temp Files

Oct 22, 2007



Hello,

I created an SSIS solution for reading data from dBase files and storing them in SQL Server. In a ForEachDirectory loop, up to one thousand dBase files are read and stored. The system where the packages are running has 16 GB of RAM.
For the first few hundred dBase files everything goes fine, but then the RAM no longer seems to suffice and a temp file is created (I changed the path in BufferTempStoragePath).

How can it be that temp files are needed when so much RAM is available?
Why does the RAM fill up more and more during the SSIS package execution?
Is there anything I can do to release some of it? (It is running in a loop and there is no need to keep all the data.)
Could it be caused by dBase? (I use the Microsoft Jet 4.0 OLE DB Provider.)

Another thing is that the temp file is not stored in the path I set in BufferTempStoragePath.
Sufficient permissions are set, but the temp file is still created in the user temp folder...

Any kind of help is very much appreciated!

Best Regards,
Stefan

View 5 Replies



SQL 2012 :: Create Script That Will Import Large XML Files?

Jul 28, 2014

I need to create a script that will import large XML files (500 MB - 7 GB) on a daily basis and store the data in a relational DB structure.

What is the best and fastest way of importing such files? I have played around with smaller files and found the following.

1. SSIS XML Data Source: it doesn't seem to like the complex element types and throws out the file.
2. Using Bulk File Import, storing the file in an XML variable and using XQuery to parse the file: this works, but it can't take a file more than 2 GB in size, so I can't use this method.
3. C# + XML Serialization: this also works, but seems terribly slow. I open the DB connection once, so it doesn't open and close for each DB call, but it still seems to take a long time.

How can I import large XML files quickly into a relational table structure?
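For reference, a minimal sketch of the bulk-import-and-shred pattern from option 2 above (file path, element names, and target table are all hypothetical). It is subject to the 2 GB XML limit already mentioned, so for larger files a streaming client or SQLXML Bulk Load is the more usual route:

DECLARE @xml XML

SELECT @xml = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK 'C:\Import\DailyFeed.xml', SINGLE_BLOB) AS src    -- hypothetical path

INSERT INTO dbo.OrderLine (OrderID, ItemCode, Qty)                     -- hypothetical target table
SELECT  n.value('(OrderID)[1]',  'INT'),
        n.value('(ItemCode)[1]', 'NVARCHAR(50)'),
        n.value('(Qty)[1]',      'INT')
FROM @xml.nodes('/Orders/Order/Line') AS t(n)                          -- hypothetical element names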

View 9 Replies View Related

T-SQL (SS2K8) :: Create Separate MS Excel Files By Looping Through Large Table

Jun 24, 2014

I have a master table containing details of over 800000 surveys made up of approximately 400 distinct document names and versions. Each document can have as few as 10 questions but as many as 150. Each question represents one row.

My challenge is to create a separate spreadsheet for each of the 400 distinct document names and versions containing all the rows and columns present in the master table. The largest number of rows would be around 150 and therefore each spreadsheet will not be very big.

e.g. in my sample data below, I will need to create individual Excel files named as follows:
"Document1Version1.xlsx" containing all the column names and 6 rows for the 6 questions relating to Document 1 version 1
"Document1Version2.xlsx" containing all the column names and 8 rows for the 8 questions relating to Document 1 version 2
"Document2Version1.xlsx" containing all the column names and 4 rows for the 4 questions relating to Document 2 version 1

I assume that one of the first things is to create a lookup of the distinct document names and versions, assign some variables, and then use this lookup to loop through and sequentially filter the master table data ready for creating the individual Excel files (a rough sketch of this step follows the sample data below).

--CREATE TEMP TABLE FOR EXAMPLE

IF OBJECT_ID('tempdb..#excelTest') IS NOT NULL DROP TABLE #excelTest
CREATE TABLE #excelTest (
[rowID] [nvarchar](10) NULL,
[docName] [nvarchar](50) NULL,

[Code] .....

--Output

rowID  docName    docVersion  question  blankField
1      document1  1           q1        NULL
2      document1  1           q2        NULL
3      document1  1           q3        NULL
4      document1  1           q4        NULL
5      document1  1           q5        NULL
6      document1  1           q6        NULL

[Code] .....
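A rough sketch of the lookup-and-loop step mentioned above, using the sample #excelTest table; the actual write of each filtered resultset to its .xlsx file (for example via an SSIS ForEach loop or the ACE OLE DB provider) is not shown here, and the docVersion column type is assumed:

IF OBJECT_ID('tempdb..#docList') IS NOT NULL DROP TABLE #docList

SELECT DISTINCT docName, docVersion
INTO #docList
FROM #excelTest

DECLARE @docName NVARCHAR(50), @docVersion NVARCHAR(10), @fileName NVARCHAR(200)

DECLARE curDocs CURSOR LOCAL FAST_FORWARD FOR
    SELECT docName, docVersion FROM #docList

OPEN curDocs
FETCH NEXT FROM curDocs INTO @docName, @docVersion

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @fileName = @docName + 'Version' + @docVersion + '.xlsx'

    -- filter the master data for this document/version; exporting this
    -- resultset to @fileName is the part handled outside T-SQL
    SELECT * FROM #excelTest
    WHERE docName = @docName AND docVersion = @docVersion

    FETCH NEXT FROM curDocs INTO @docName, @docVersion
END

CLOSE curDocs
DEALLOCATE curDocs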

View 9 Replies View Related

Large Fixed Width Text Files Using SSIS

Aug 13, 2007

What is the easiest way to get a large fixed-width text file (200 columns) definition into SSIS? Having to define each column with the ruler would be very cumbersome.

View 5 Replies View Related

ActiveX Script In A SSIS Package - Calling An FSO To Create/manipulate Files

Jul 3, 2007

I have a SQL 2000 DTS package that executes VBScript to loop through a recordset which:

- runs a stored procedure and populates tables

- builds a recordset from the populated tables to write records to an Excel file

- writes status to text files with either the error or success notices



I use FSO to set up the success and error files, but the scheduled job in SQL2005 which calls the SSIS package returns the following error:

"Retrieving the file name for a component failed with error code 0x0015F74C"



I can successfully run this (VBScript) both in the SSIS package via the BI Development Studio and in MS Access (exactly the same code in both) - but not as an SSIS package called in a scheduled job in SQL 2005.



I am at an impasse with this ... any and ALL assistance would be GREATLY appreciated.



TIA,



Bob

View 1 Replies View Related

Unable To Extend Temp Segment By 64 In Tablespace TEMP (SSIS Error While Copying Data From Oracle)

Oct 22, 2007

I am transferring data from Oracle and getting the error message below.

I am using 4 data flow tasks within a single control flow, and all 4 tasks query the same table but populate data into different SQL tables based on the WHERE condition.

[OLE DB Source 1 [853]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80004005 Description: "ORA-01652: unable to extend temp segment by 64 in tablespace TEMP ".

View 4 Replies View Related

Temp Tables Vs. Large Table

Aug 4, 2005

I have a few hundred users, maybe a dozen or two active at any given time, accessing the same database via ASP. The database has many tables, one being a very large orders table with a few million records, against which I have created a view. A view only because I need to allow the user to filter quite extensively against the results. The users typically only need to view records for the last 30 days, and results for each user might be five thousand records or less.

My question is this: would I be better off writing each user's resultset to a temp table for that user's session, letting the user's filtering and sorting go against that temp table, and increasing my hardware requirements to accommodate that, possibly to the point of creating a database cluster? Or would I be better off leaving it as is, where each user uses the same view?

FYI... each user may need visibility to only a handful of fields, but overall the view must maintain many fields.

Any thoughts on this would be greatly appreciated. Thanks in advance.

Dave

View 2 Replies View Related

Transact SQL :: Select 1000 Rows At A Time From / Into A Large Temp Table?

May 12, 2015

I am using SQL Server 2008 R2, not Denali, so I cannot use the OFFSET/FETCH clause.

In my stored procedure, I am doing a SELECT INTO #tblTemp FROM... Working fine. This resultset is going to be used in an SSIS package which will generate a pipe-delimited .txt file... Working fine.

For recoverability sake, I am trying to throttle back on the commit chunks to 1000 rows per commit until there are no more rows. I am trying to avoid large rollbacks.

Q: Am I supposed to handle the transactions (begin/commit/rollback/end trans) when the records are being inserted into the temp table? Or when they are being selected from the temp table?

Q: Or can I handle this in my SSIS package for a flat file destination? I don't see options for a flat file destination like I do for an OLE DB destination (such as Rows per batch and Maximum insert commit size).
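One way to sketch the chunked commits, assuming the rows are loaded with INSERT...SELECT rather than SELECT INTO and that the source has an ascending unique key (all names here are hypothetical). Handling the transaction at insert time keeps any rollback to at most 1000 rows:

DECLARE @batch INT, @lastID INT, @rows INT
SET @batch  = 1000
SET @lastID = 0
SET @rows   = 1

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION

    INSERT INTO #tblTemp (ID, Col1, Col2)
    SELECT TOP (@batch) s.ID, s.Col1, s.Col2
    FROM dbo.SourceTable AS s          -- hypothetical source table
    WHERE s.ID > @lastID               -- only rows past the last committed chunk
    ORDER BY s.ID

    SET @rows = @@ROWCOUNT
    IF @rows > 0
        SELECT @lastID = MAX(ID) FROM #tblTemp

    COMMIT TRANSACTION
END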

View 6 Replies View Related

Temp DB Files

Jan 2, 2000

Using SQL Server 7, I have the following files hanging out and I'm not sure what they are for or whether I can delete them.

My data and logs are on the same RAID 5 drives. They are located in the E:\DATA and E:\LOGS directories.
I have the following files on the root of my E: drive

tempdbData.ndf
tempdbLog.ndf

tempdbData1.ndf
tempdbLog1.ndf

This goes on up to tempdbData4.ndf. The data files are large, up to 2 GB. The dates are not current except for the last 2 sequential files.

Does anybody know what they are there for, and may I delete any of these files?

Thanks and Thanks again......

View 1 Replies View Related

Log Files Too Large.

Oct 20, 2000

Hi,

I have inherited some databases with extremely large log files.
I tried to truncate the transaction log but it did not work.
Can somebody please tell me how to truncate these log files?

Thanks in advance.

Attaullah
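For a SQL Server 2000-era database, the usual sequence is something like the sketch below. The database and logical log file names are hypothetical (check the real logical name first with sp_helpfile), and note that TRUNCATE_ONLY breaks the log backup chain and was removed in SQL Server 2008, where switching to the SIMPLE recovery model is the equivalent:

USE MyDB
GO
BACKUP LOG MyDB WITH TRUNCATE_ONLY       -- mark the inactive log as reusable (2000/2005 only)
GO
DBCC SHRINKFILE (MyDB_Log, 100)          -- shrink the log file to roughly 100 MB
GO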

View 2 Replies View Related

Cannot Shrink Temp DB Files

Nov 6, 2015

I re-started the SQL Service.

I did numerous commands with no luck.

Shrink DB, Shrink files and shrink DB.

I tried the GUI but it bombs out.
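A sketch of the usual checks before shrinking tempdb, assuming the default logical file names. If internal objects or open transactions are still holding the space, the shrink will keep failing, and a service restart (which re-creates tempdb at its configured size) may be the only option:

USE tempdb
GO
SELECT name, size / 128 AS size_mb FROM sys.database_files     -- current file sizes
GO
DBCC SHRINKFILE (tempdev, 1024)    -- target sizes in MB; logical names assumed to be the defaults
DBCC SHRINKFILE (templog, 512)
GO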

View 2 Replies View Related

Best Way To Clean Up Temp Files

Mar 13, 2006

I have a custom Data Flow task that creates temp files in the system temp directory during processing. A lot of times we'll use SSIS to do one data transformation, running and tweaking the package along the way. We do this in the designer: if we notice something that's incorrect in the data view, we just hit the stop button and fix it. However, when we do this, the Cleanup() function isn't called, and my temp files are left in the temp directory when they really ought to be disposed of.

Is there a method that gets called every time the DtsDebugHost quits, whether it finished, didn't finish properly, or was stopped in the middle? What would be a good way (other than having some service that monitors which temp files are used by which processes) to clean up temp files after we don't need them?

~Steve

View 1 Replies View Related

DTS Large Flat Files

Dec 5, 2000

I have some large flat files that I need to export to my SQL Server database. The file sizes range from 16 MB to 116 MB. I've tried to save the files to an Excel spreadsheet and then export them in that format, but that didn't work. Does anyone have any suggestions?

View 1 Replies View Related

Large Transaction Log Files

Apr 4, 2007

I have a few tables using SQL Server 2005 Express. Currently they are holding roughly 30-40k records. I have my log files set at restricted growth to 90 MB. While I'm not close to reaching that, I would like my tables to be able to scale up to possibly millions of records. Based on that, I figure the transaction log file will probably need a higher threshold (unrestricted growth). For those with experience: for tables that have millions of records, what average log file size could I expect?
Is it a bad idea to just shrink the log file every night during off-peak hours so that, regardless of the number of records I have, I'll always start the day with a minimal log file?
Do large log files have any effect on SQL performance?

View 3 Replies View Related

Extremely Large Log Files?

Aug 16, 2006

We have SQL Server running on a Windows 2003 server, only because Backup Exec requires it. At the location C:\Program Files\Microsoft SQL Server\MSSQL\Data
there is this file: SuperVISorNet_log.LDF, which is 15 GB and is accessed daily. I apologize because I don't know what this is!

My question is: can this file be 'pruned' (for want of a better word)? It's taking up a lot of backup space.

View 17 Replies View Related

Log Files On Large Transactions

Jan 25, 2008

I am trying to run a query that deletes duplicate records on a table with 24m records. The problem is that each time I run it the log file fills up and I get an error saying the log file is full. For this reason the query never ends.

Is there any way to turn off logging when running a query?

I think it also has to do with the disk drive running out of space, as the log file grows to over 12 GB.

It is running in simple mode already.
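Logging per statement can't be switched off, but since the database is already in simple recovery the log can be kept small by deleting in batches and letting each batch's log space be reused. A rough sketch, assuming SQL Server 2005 or later and hypothetical table and duplicate-key names:

DECLARE @rows INT
SET @rows = 1

WHILE @rows > 0
BEGIN
    ;WITH dups AS (
        SELECT ROW_NUMBER() OVER (PARTITION BY KeyCol1, KeyCol2 ORDER BY ID) AS rn
        FROM dbo.BigTable                        -- hypothetical table and duplicate definition
    )
    DELETE TOP (10000) FROM dups WHERE rn > 1    -- remove duplicates 10,000 rows at a time

    SET @rows = @@ROWCOUNT
    CHECKPOINT                                   -- in simple recovery this lets the log space be reused
END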

View 11 Replies View Related

Why Have SQLCE... Files Under Temp Folder

Mar 14, 2007

I am developing an application with SQL CE 2.0, .NET CF 1.0 SP1, and VS2003.

I found that some files are created under the "Temp" folder by the system with size "0B".

Why/when are these files created? Do I need to clean them up periodically? If not, will this cause an exception like "Not enough storage is available to complete this operation"?



Thanks.

View 10 Replies View Related

Optimize For Many Tables And Temp Files

Mar 8, 2006

We are using the Import/Export wizard to create some simple packages to transfer tables. When doing a large number of tables, the 'Optimize for many tables' option is automatically selected (as noted in BOL). What we've found is that the package creates a bunch of temp files in the creator's Documents and Settings....Temp folder. Needless to say, this package cannot be re-run later, nor scheduled, since the path referenced doesn't necessarily exist on the server. Is there a way to specify where these files should be created so that the package is re-usable and still optimized?

Steve

View 1 Replies View Related

Linq To SQL Varbinary For Large Files

Jun 11, 2008

Hello,
I have decided to use Linq for my current ASP.NET project and so far it has been good, but now I am implementing a system that will allow users to upload binary content such as pictures and  videos. For ease of management and security, I have decided to store this content directly in the database. The performance hit is a minor concern because very few user-uploaded images/videos will be seen on any given page (usually just one).
From the limited tutorials I have seen on the internet, Linq supports the SQL Server varbinary column through its System.Linq.Binary class. This class does not appear to support STREAMS and instead opts to load all of the contents into memory. This content can then be converted to an array of bytes, which can then be output to the browser via the response stream. This is not good. What if I am sending a video that is very large? Varbinary supports up to 2 GB. I can't have a 2 GB video sitting in memory. It makes a lot more sense to stream it via a small buffer.
Obviously, I am going to limit the size of the content that users can upload, but the core problem remains. If I limit content size to 2 MB and I have 2 GB of memory on the server, then I can only serve 1000 users concurrently. In reality, that number would be much less because of other processes running on the server.
Is there no way to stream data from a varbinary column with Linq using a small buffer of bytes?
Do I need to implement some custom logic on my Linq classes? Since these classes are automatically generated, how would I do such a thing?
Thanks.
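One server-side alternative (not LINQ-specific) is to read the varbinary column in fixed-size chunks with SUBSTRING, so the web tier only ever holds a small buffer; the table and column names below are hypothetical, and in practice the client would issue one such chunk query per buffer rather than looping on the server:

DECLARE @offset BIGINT, @chunk INT, @len BIGINT
SET @offset = 1
SET @chunk  = 65536          -- 64 KB buffer

SELECT @len = DATALENGTH(Content) FROM dbo.MediaFiles WHERE MediaID = 1   -- hypothetical table

WHILE @offset <= @len
BEGIN
    SELECT SUBSTRING(Content, @offset, @chunk) AS Buffer   -- one 64 KB slice of the blob
    FROM dbo.MediaFiles
    WHERE MediaID = 1

    SET @offset = @offset + @chunk
END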

View 1 Replies View Related

Insert Large Text Files Into Db

Apr 9, 2008

How do I insert a large chunk of text into a table column? My project is to build a news website where people can go and read news articles. The articles are provided by the author in Word format, so how do I insert that news article into the table's column? Any help would be appreciated.


thanks

View 2 Replies View Related

Sort Transformation Makes A Lot Of Temp Files

May 29, 2007

Hi all,



I have a problem with a Sort transformation. I have a CSV file with 200,000 rows; the CSV file is about 30 MB. When the rows are processed in the Sort transformation, SSIS generates around 160 temp files of about 10 MB each.



How can I avoid so many temp files being generated?

View 4 Replies View Related

Storing Large Files In The SQL Server 2005

Feb 9, 2006

I have a table that I'm inserting a file into, using the Image data type to store the binary object.  Now the code below works fine for files around 1.5 MB, but for anything larger it's as if the code won't even execute and I get a Page Not Found error.
I'm in the process of running some traces to find out what's going on in the backend, but I'm assuming there's something amiss with my code.  The Image data type should handle files that size with no problem, but for some reason it isn't.
Does anyone see anything wrong?
Thanks
Dim iLength As Integer = CType(File1.PostedFile.InputStream.Length, Integer)
If iLength = 0 Then Exit Sub 'not a valid file
Dim sContentType As String = File1.PostedFile.ContentType
Dim sFileName As String, i As Integer
Dim bytContent As Byte()
ReDim bytContent(iLength - 1) 'byte array sized to the file (VB upper bound is length - 1)

'strip the path off the filename
i = InStrRev(File1.PostedFile.FileName.Trim, "\") 'position of the last backslash, so the path can be stripped
If i = 0 Then
sFileName = File1.PostedFile.FileName.Trim
Else
sFileName = Right(File1.PostedFile.FileName.Trim, Len(File1.PostedFile.FileName.Trim) - i)
End If
conn = New SqlConnection(eco)
conn.Open()
cmd = New SqlCommand("INSERT INTO ECO_Attachments (ECOID, FromType, DocName,OldRev,NewRev,NtLogin,DisplayName, FileName, FileSize, FileData, ContentType) VALUES (@ECOID, @FromType,@DocName,@OldRev,@NewRev,@NtLogin,@DisplayName, @FileName, @FileSize, @FileData, @ContentType) ")
cmd.Connection = conn
Try
File1.PostedFile.InputStream.Read(bytContent, 0, iLength)
With cmd
.Parameters.Add("@ECOID", SqlDbType.Int)
.Parameters.Add("@FromType", SqlDbType.NVarChar, 50)
.Parameters.Add("@DocName", SqlDbType.NVarChar, 250)
.Parameters.Add("@OldRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NewRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NTLogin", SqlDbType.NVarChar, 100)
.Parameters.Add("@DisplayName", SqlDbType.NVarChar, 200)
.Parameters.Add("@FileName", SqlDbType.NVarChar, 255)
.Parameters.Add("@FileSize", SqlDbType.Real)
.Parameters.Add("@FileData", SqlDbType.Image)
.Parameters.Add("@ContentType", SqlDbType.NVarChar, 50)
.Parameters("@ECOID").Value = ECOID
.Parameters("@FromType").Value = From
.Parameters("@DocName").Value = DocName
.Parameters("@OldRev").Value = OldRev
.Parameters("@NewRev").Value = NewRev
.Parameters("@NTLogin").Value = NTLogon
.Parameters("@DisplayName").Value = DisplayName
.Parameters("@FileName").Value = sFileName
.Parameters("@FileSize").Value = iLength
.Parameters("@FileData").Value = bytContent
.Parameters("@ContentType").Value = sContentType
.ExecuteNonQuery()
'.ExecuteScalar()
End With
conn.Close() 'close the connection on the success path as well
Catch ex As Exception
Response.Write(ex)
'Handle your database error here
conn.Close()
End Try

View 1 Replies View Related

Loading Large Flat Files Into A SQL Database Using DTS

Dec 7, 2000

Here's my dilemma: I have a file that's 308 bytes wide by 5.7 million records. The record length is fixed, and the position and width of the fields within the record are known. When I run DTS I receive this error from the MS DTS flat file provider - Error Description: "error creating file mapping view: not enough storage is available to process this command." Then, when I try to continue with the wizard, it will not allow me to separate the data into the format that I need. Is there any other way to import this file using DTS?

View 1 Replies View Related

C Partition Running Out Of Room, Large MDF, LDF Files

Sep 20, 2005

Hi, my data files sit in the default directories and I think they are causing my partition to run out of space. I mainly use one DB that I created and don't use the others (i.e. master, model, tempdb, etc.). Yet I see their MDF and LDF files are growing. What can I do to shrink them down, or perhaps move them off to a larger partition after shrinking?

View 6 Replies View Related

Reading Large Text Files With 2005 CE?

Dec 19, 2007

Hi...
During my web search looking for a solution I ran across SQL CE 3.5 articles. My questions about SQL CE 3.5 are:
1) Can SQL CE 3.5 handle a 4-6 GB file
- Read
- Parse (SQL)
2) Can SQL CE 3.5 act as a standalone client that a user can view a large (4-6 GB) text file?
- Will I need a .NET (small) client to read the large (4-6 GB) text file?
More info:
The text file will reside on the machine where the SQL CE 3.5 is installed. There is no pull to get the data.

Thank you (in advance)...

SQL CE 3.5

View 3 Replies View Related

SQL 2012 :: Import Excel XLSX Files Into Temp Table

Feb 18, 2014

I am having trouble trying to import XLSX files into SQL 2012 64-bit.

I have installed the Access driver (AccessDatabaseEngine_x64.exe)

I have configured the script to run the following SP

sp_configure 'show advanced options', 1
GO
RECONFIGURE WITH OverRide
GO
sp_configure 'Ad Hoc Distributed Queries', 1

[Code] ....

So I first create my Temp Table

Then I run the SP above, and then run the insert into the temp table defined:

INSERT INTO tempdb.dbo.TempTRBZ (IsNew,CoID, Zip, City, County,StateCode,Rate,Taxable,TaxShip,TaxLab,CountryID,StateID)

SELECT * FROM OPENROWSET( 'Microsoft.ACE.OLEDB.12.0','EXCEL 12.0;Database=C:TempNotInTrbzJan.xlsx;HDR=YES','SELECT * FROM [Data$]')

[Code] ....

The error message I get back is

Msg 7303, Level 16, State 1, Line 4
Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".

What have I set wrong on the import? Using SSIS at this point is not a real option.
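Msg 7303 with the ACE provider is often a provider-option or permissions issue rather than the query itself. A commonly suggested configuration to test, in addition to the sp_configure settings above (sp_MSset_oledb_prop is undocumented, so treat this as a sketch to verify):

EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess', 1
EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'DynamicParameters', 1

-- also worth checking: the SQL Server service account needs read access to the
-- .xlsx file and its folder, and the workbook must not be open in Excel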

View 0 Replies View Related

I Can't Create A Temp Table

Jul 20, 2005

Hi all!

I have a problem with a temp table. I start creating my table:

bdsqlado.execute ("CREATE TABLE #MyTable ...")

There is no error. The sql string has been tested and when it's executed in the sql query analyzer it really creates the table.

After creating the table, I execute an insert statement:

bdsqlado.execute ("INSERT INTO #MyTable VALUES(...) ")

It returns an error like this: "Invalid Object Name #MyTable"

I don't understand what's wrong. If I execute both sql sentences in the SQL Query Analyzer it works perfectly.

I use the same connection to execute both statements and I don't close it before the INSERT is executed.

I think it may be something related to dynamic properties of the connection, but I'm not sure. It's just an idea.

Please I need help.

Thanks a lot,

View 1 Replies View Related

Create Temp Table

Jul 20, 2005

I'm trying to create a temp table in a stored procedure. The syntax is the following:

create st_proc
variables declared
if something > 0
    create temp table #Table1
if something < 0
    create temp table #Table1

The SQL compiler complains about the second create.

Thanks
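The parser rejects two CREATE TABLE statements for the same temp table name in one procedure, even in mutually exclusive branches. The usual workaround is to create the table once with a superset of the columns and only branch on how it is populated; a minimal sketch with hypothetical columns and branch logic:

DECLARE @something INT
SET @something = 1

CREATE TABLE #Table1 (Col1 INT, Col2 VARCHAR(50))    -- created once, outside the branches

IF @something > 0
    INSERT INTO #Table1 (Col1) VALUES (1)             -- populate one way
ELSE IF @something < 0
    INSERT INTO #Table1 (Col1, Col2) VALUES (2, 'x')  -- populate the other way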

View 1 Replies View Related

Dividing A Large Flat File Into Small Files

Jul 16, 2007

Hi ,

Is there any method by which I can divide the large flat file into a certain number of small files, keeping the header in each of the sub-files?

Regards,

Prash

View 4 Replies View Related

How To Distribute A Large Single File Database Into Multiple Files?

Aug 29, 2007

I have several databases that have grown to 300 GB and would like to distribute the data into multiple files across multiple drives. Can I create a new database that is spread across the new drives and use a full backup to restore it, or am I stuck with unloading the data table by table?
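A restore of a full backup recreates the original file layout, so it won't spread the data by itself. One approach is to add files/filegroups on the new drives and then move tables onto them by rebuilding their clustered indexes; a sketch with hypothetical names, paths, and sizes:

ALTER DATABASE BigDB ADD FILEGROUP FG2

ALTER DATABASE BigDB
    ADD FILE (NAME = BigDB_Data2, FILENAME = 'F:\Data\BigDB_Data2.ndf', SIZE = 50GB) TO FILEGROUP FG2

ALTER DATABASE BigDB
    ADD FILE (NAME = BigDB_Data3, FILENAME = 'G:\Data\BigDB_Data3.ndf', SIZE = 50GB) TO FILEGROUP FG2

-- rebuilding a table's clustered index onto FG2 moves that table's data to the new drives
-- (assumes the table already has a clustered index named IX_BigTable on column ID)
CREATE CLUSTERED INDEX IX_BigTable ON dbo.BigTable (ID)
    WITH (DROP_EXISTING = ON) ON FG2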

View 3 Replies View Related

RESTORE DATABASE Timeout In SQL 2000 With Large Backup Files

Sep 11, 2007



Hello,

I am attempting to restore a database from within a VB.NET application. I am making the following 3 calls:

RESTORE FileListOnly FROM DISK = 'C:\MyDatabase.dat'

USE Master RESTORE DATABASE MyDatabase FROM DISK = 'C:\MyDatabase.dat' WITH NORECOVERY, MOVE 'MyDatabase' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\MyDatabase.mdf', MOVE 'MyDatabase_log' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\LDF\MyDatabase.ldf', REPLACE

RESTORE DATABASE MyDatabase FROM DISK = 'C:\MyDatabase.dat'


using SMO. This logic works fine with small *.dat files; however, when using a *.dat file of about 4 GB I get an error on the 3rd RESTORE DATABASE call:



ExecuteNonQuery failed for Database 'master'.

An exception occurred while executing a Transact-SQL statement or batch.

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

Operator aborted backup or restore. See the error messages returned to the console for more details.

ExecuteNonQuery failed for Database 'master'.

An exception occurred while executing a Transact-SQL statement or batch.

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

Operator aborted backup or restore. See the error messages returned to the console for more details.



The same program/logic also works fine when I use MS SQL 2005, and it runs fine from the MS SQL 2005 Query Analyzer for both 2005 and 2000 databases. There seems to be a problem only with MS SQL 2000 from within VB.NET. Does anybody have any idea? I'd appreciate any response. Thanks

Eugene

View 6 Replies View Related

Using A Variable To Create Temp Table

Mar 3, 2003

Can someone send me an example of creating a variable to use instead of a temp table? I cannot find an example in Books Online, but I know it is possible in SQL 2000.

Thanks,
Dianne
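A minimal example of a table variable in place of a temp table (this works on SQL Server 2000; note that table variables have no statistics, so they suit small row counts best):

DECLARE @MyTable TABLE (
    ID   INT,
    Name VARCHAR(50)
)

INSERT INTO @MyTable (ID, Name) VALUES (1, 'First')
INSERT INTO @MyTable (ID, Name) VALUES (2, 'Second')

SELECT * FROM @MyTable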

View 2 Replies View Related

CREATE A Temp Table Via EXEC (@SQL)

Jan 23, 2006

I need to create a dynamic temporary table in a SP. Basically, I am using the temp table to mimic a crosstab query result. So, in my SP, I have this:

-------------------------------------------------------------------------------------
-- Get all SubquestionIDs for this concept
-------------------------------------------------------------------------------------
DECLARE curStudySubquestions CURSOR LOCAL STATIC READ_ONLY FOR
    SELECT QGDM.SubquestionID, QGDM.ShortName, QGDM.PosRespValues
    FROM RotationMaster AS RM
    INNER JOIN RotationDetailMaster AS RDM ON RM.Rotation = RDM.Rotation
    INNER JOIN QuestionGroupMaster AS QGM ON RDM.QuestionGroupNumber = QGM.QuestionGroupNumber
    INNER JOIN QuestionGroupDetailMaster AS QGDM ON QGM.QuestionGroupNumber = QGDM.QuestionGroupNumber
    WHERE RM.Study = @Study
    GROUP BY QGDM.SubquestionID, QGDM.ShortName, QGDM.PosRespValues
    HAVING QGDM.SubquestionID <> 0

-------------------------------------------------------------------------------------
-- Dynamically create a Temp Table to store the data, simulating a pivot table
-------------------------------------------------------------------------------------
SET @Count = 2
SET @SQL = 'CREATE TABLE #AllSubquestions (Col1 VARCHAR(100)'

OPEN curStudySubquestions
FETCH NEXT FROM curStudySubquestions INTO @SubquestionID, @ShortName, @PosRespValues

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @SQL = @SQL + ', Col' + CAST(@Count AS VARCHAR(5)) + ' VARCHAR(10)'
    SET @Count = @Count + 1
    FETCH NEXT FROM curStudySubquestions INTO @SubquestionID, @ShortName, @PosRespValues
END

SET @SQL = @SQL + ', ShowOrder SMALLINT)'
CLOSE curStudySubquestions

PRINT 'Create Table SQL:'
PRINT @SQL

EXEC (@SQL)

SET @ErrNum = @@ERROR
IF (@ErrNum <> 0)
BEGIN
    PRINT 'ERROR!!!!!!!!!!!!!!!!!!!!!!!!!'
    RETURN
END

PRINT '*** Table Created ***'

-- Test that the table was created
SELECT *, 'TEST' AS AnyField FROM #AllSubquestions

The line PRINT @SQL produces this output in Query Analyzer (I added the line breaks for forum formatting):

CREATE TABLE #AllSubquestions (Col1 VARCHAR(100), Col2 VARCHAR(10), Col3 VARCHAR(10),
Col4 VARCHAR(10), Col5 VARCHAR(10), Col6 VARCHAR(10), Col7 VARCHAR(10), ShowOrder SMALLINT)

However, the SELECT statement to test the creation of the table produces this error:

*** Table Created ***
Server: Msg 208, Level 16, State 1, Procedure sp_SLIDE_CONCEPT_AllSubquestions, Line 73
Invalid object name '#AllSubquestions'.

It appears that the statement to create the table works, but once I try to access it, it doesn't recognize its existence. Any ideas?
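This looks like a scope issue rather than a failed CREATE: a local temp table created inside EXEC() belongs to that child batch and is dropped when the EXEC returns, so the outer procedure can't see it. A small demonstration, plus the usual workaround of using a global ## table, which survives the child batch for the life of the connection:

EXEC ('CREATE TABLE #Scoped (Col1 INT)')
-- SELECT * FROM #Scoped                -- would fail here: Invalid object name '#Scoped'

EXEC ('CREATE TABLE ##AllSubquestions (Col1 VARCHAR(100), ShowOrder SMALLINT)')
SELECT * FROM ##AllSubquestions         -- works: global temp tables outlive the EXEC() batch
DROP TABLE ##AllSubquestions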

View 4 Replies View Related






