SQL 2012 :: Generating CREATE TABLE Scripts For Large Number Of Tables

Feb 11, 2014

Other than right-clicking on each individual table in SSMS and generating a CREATE script, is there a simple way to generate CREATE TABLE scripts for tables within a given database?

Background: I have a bunch of tables in one database, and I would like to add tables to a second database that have the same names and basic structures of some of the tables from the first database.

I do not need to transfer any data from the tables; this is a separate project that will use a similar data structure. I just want to generate the CREATE TABLE scripts for the 30 or so tables in the first database, and then I'll tweak the scripts as appropriate and run them against the new database.

[URL] ....
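One approach that comes up often for this, offered only as a sketch: use the database-level "Tasks > Generate Scripts..." wizard in SSMS Object Explorer, which lets you pick many tables at once, and feed it the list of candidate tables. A query like the following (the source database name SourceDB is an assumption) can produce that list:

-- A minimal sketch, assuming the source database is named SourceDB; this only
-- lists the tables to select in the Generate Scripts wizard, it does not script them.
USE SourceDB;
SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
ORDER BY s.name, t.name;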

View 7 Replies



SQL 2012 :: Create Clustered Index On A Very Large Table (500 GB)

May 7, 2014

I need to create a clustered index (CI) on a very large SQL Server 2012 database table. The table has approximately 10 billion rows and is 500 GB in size. The job ran for about 20 hours and then failed with the error: "Out of disk space in tempdb". My tempdb is 1.8 TB, yet that was still not enough.

Here is my script:

CREATE CLUSTERED INDEX CI_IndexName
ON TableName(Column1,Column2)
WITH (MAXDOP= 4, ONLINE=ON, SORT_IN_TEMPDB = ON, DATA_COMPRESSION=PAGE)
ON sh_WeekDT(Day_DT)
GO
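One variation often suggested in this situation, sketched here and not guaranteed for any particular environment: with SORT_IN_TEMPDB = OFF the index sort spills into the destination filegroup instead of tempdb, which sidesteps the tempdb space limit provided the target filegroup has room for the sort.

-- A minimal sketch of the same build with the sort kept out of tempdb.
CREATE CLUSTERED INDEX CI_IndexName
ON TableName (Column1, Column2)
WITH (MAXDOP = 4, ONLINE = ON, SORT_IN_TEMPDB = OFF, DATA_COMPRESSION = PAGE)
ON sh_WeekDT(Day_DT)
GO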

View 9 Replies View Related

Large Number Of Tables And Performance

Jan 25, 2008

Hi gurus, I'm creating a web application that will have a large number of tables (between 10k and 20k). This is done for the sake of scalability, as tables will be moved to different database servers as the application grows, and also for performance (smaller indexes). I'm worried, though, about how having a large number of tables could affect the performance of SQL Server, since the application will start on a single database server. I tried to find some resources on this on the internet but couldn't find any.

I would really appreciate if you can give me some advice and if you have any good links that would be great...

View 10 Replies View Related


SQL Server 2012 :: Delete Large Number Of Records?

Sep 8, 2015

I have the following scenario:

SQL database on SQL 2012

Large Production table 15 Million record

The table has 3 years of data

New monthly data is being added every month.

The new monthly data is loaded, checked, and finally approved after 6 or 7 iterations. Because of these iterations, the monthly data set is added, deleted, added, and deleted several times. Because the table is big, this process takes time. Any thoughts on how to make the delete/insert process faster? Keep in mind that I cannot do much because it is a production table and is being accessed by other users for other analysis.

Delete is done based on trx_date which is a year/month combo, like 201508.

The table has monthly sales by customer aggregated.

The table structure is:

CREATE TABLE [dbo].[Sales](
[batch_key] [int] NOT NULL,
[Company_key] [int] NOT NULL,
[customer_key] [char](22) NOT NULL,
[Trx_Date] [int] NOT NULL,
[account] [nvarchar](35) NOT NULL,

[code].....
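One commonly suggested way to soften the impact, sketched here under the assumption that only the month being reloaded is touched: delete that month in small batches, so locks stay short and the log can be backed up between batches. The month value and batch size below are placeholders to tune.

DECLARE @TrxMonth int = 201508       -- month being reloaded (assumed value)
DECLARE @BatchSize int = 50000       -- tune to the environment

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize)
    FROM dbo.Sales
    WHERE Trx_Date = @TrxMonth

    IF @@ROWCOUNT = 0 BREAK          -- month fully removed
END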

View 9 Replies View Related

How To Manage IDENTITY In Tables With Large Number Of Rows?

May 23, 2002

Table structure: col1 IDENTITY (seed=1 increment=1) + few other columns (col2...col7) + one text column (col 8)
I have around 50,000,000 rows per day inserted into table T1. At the end of the day, 40,000,000 rows are deleted. I have to keep the records for 12 months and then archive them. The database serves a 24/7 web application and no downtime is allowed. The IDENTITY column will go out of range (overflow) after less than two years unless the identity seed is reset to the start value (seed=1, increment=1).
At the end of the 12th month the data is archived into another table and only the last month is kept in T1, so T1 enters the new year with data from the last month of the previous year. A few other tables refer to this table using their own fields that hold values from T1's IDENTITY column (referential integrity is not enforced). The identity column in T1 is needed as a unique id for some search actions. Performance is an issue, therefore the bigint data type is used for this identity column rather than decimal.
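For the seed-reset part, a minimal sketch, assuming the reset happens right after the year-end archive: reseeding is only safe if no surviving row, in T1 or in the tables that reference it, still carries an ID in the range about to be reused.

-- Next insert after this gets identity value 1 (assuming T1 still contains rows).
DBCC CHECKIDENT ('T1', RESEED, 0)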

Another problem I have is how to update one column of this table (1 million rows to be updated out of 2 million) with minimum impact on the users who are querying the table heavily. Again, it is a 24/7 web application with no downtime.
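For the one-column mass update with minimal user impact, the usual suggestion is small batches; a sketch in the SQL 2000-era SET ROWCOUNT style, with the column name and the "not yet updated" predicate as placeholders:

SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    UPDATE T1
    SET col2 = 'new value'
    WHERE col2 <> 'new value'   -- hypothetical predicate identifying rows still to update

    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0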

Thank you in advance.


Goran

View 1 Replies View Related

SQL 2012 :: Create Script That Will Import Large XML Files?

Jul 28, 2014

I need to create a script that will import large XML files (500 - 7GB) on a daily basis and store the data in a relational DB structure.

What is the best and fastest way of importing such files. I have played around with smaller files and found the following.

1. SSIS XML Data Source: it doesn't seem to like the complex element types and throws out the file.
2. Using bulk file import, storing the file in an XML variable and using XQuery to parse it: this works, but it can't take a file of more than 2 GB, so I can't use this method.
3. C# + XML serialization: this also works, but seems terribly slow. I open the DB connection once, so it doesn't open and close for each DB call, but it still takes a long time.

How can I import large XML files quickly into a relational table structure?
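For reference, a minimal sketch of approach 2 as described above (bulk-loading into an xml variable and shredding with XQuery); the file path, element names, and target columns are placeholders, and as noted this route caps out at the 2 GB xml limit:

DECLARE @x xml

SELECT @x = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\feeds\daily.xml', SINGLE_BLOB) AS src

INSERT INTO dbo.TargetTable (ItemId, ItemName)
SELECT n.value('(Id)[1]', 'int'),
       n.value('(Name)[1]', 'nvarchar(100)')
FROM @x.nodes('/Root/Item') AS t(n)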

View 9 Replies View Related

T-SQL (SS2K8) :: Generating One Table From Two Tables

Jan 14, 2015

There are two tables as below:

Table 1
ID      Value
F001    A,B,C
F002    B,C,D
F003    A,C

Table 2
ID      Value
D001    A
D002    B
D003    C
D004    D

What is the best way to generate one table as below?

New table
F001    D001
F001    D002
F001    D003
F002    D002
F002    D004
F002    D003
F003    D001
F003    D003
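One way to produce that result without a splitter function, sketched under the assumption that the tables are named Table1 and Table2 with columns ID and Value: wrap the delimited list in commas and match each single value against it.

SELECT t1.ID AS Table1_ID, t2.ID AS Table2_ID
FROM Table1 AS t1
JOIN Table2 AS t2
  ON ',' + t1.Value + ',' LIKE '%,' + t2.Value + ',%'
ORDER BY t1.ID, t2.ID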

View 4 Replies View Related

SQL 2012 :: Index Maintenance For Large Tables?

Mar 8, 2014

We have some very big tables (TBs in size) and want to set up a strategy for index maintenance.
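A minimal sketch of the usual starting point, run per database: check fragmentation and page counts from the index physical stats DMV, then reorganize moderately fragmented indexes and rebuild heavily fragmented ones (the commonly cited 10%/30% thresholds are guidance, not hard rules).

SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id = ips.index_id
WHERE ips.page_count > 1000          -- ignore tiny indexes
ORDER BY ips.avg_fragmentation_in_percent DESC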

View 3 Replies View Related

Managing Large Number Of Objects (table,sp,view)

Jul 28, 2007

Hi,
Our database has a very large number of objects. We have a naming convention by module, subproject, etc., but, for example, when we need to open a specific table it still takes time to find it. If we could create custom folders under the Tables folder or the Stored Procedures folder, it would be easier to find an object. We could create subfolders by module and subproject and classify our objects with these folders. Will the next version, SQL Server 2008, support this kind of functionality?

View 8 Replies View Related

Add Column To Existing Table With Large Number Of Rows

Dec 24, 2007

Hey Guys

I need to add a datetime column to an existing table that has about 1.2 million records and is being accessed frequently, but I can't afford to stop the DB at all.

Whenever I run: alter table mytable add Updated_date datetime

it just takes too long and I have to stop executing the query after a couple of minutes. I am running SQL Express 2005 SP2. The DB size is over 3 GB but still under the 4 GB limit.

Can you please advise on how to add this column? It's urgent!

thanks in advance
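For what it's worth, a sketch of the point usually made on this one: in SQL Server 2005, adding a NULLable column with no default is a metadata-only change, so the ALTER itself should be near-instant; what typically makes it appear to hang is waiting for the schema-modification lock behind other active sessions, so running it at a quiet moment and keeping the batch short is the usual advice.

-- Nullable, no default: a metadata-only change in SQL Server 2005, but it still
-- needs a brief schema-modification lock, so it waits behind long-running readers.
ALTER TABLE dbo.mytable
ADD Updated_date datetime NULL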

View 5 Replies View Related

SQL Server 2012 :: Select Large Data From Multiple Tables

May 10, 2014

In a Library Management database we have these tables

1) Document (DocNo, Doc_type, permalink, inDate)
2) Title (id, DocNo, Main_Title, Other_Title)
3) Author (id, Author_Name, Author_Family, Type -- like: main author, translator, ...)
4) Publisher (id, DocNo, Name, Publisedate, address)
5) Subject (id, DocNo, Subject)
6) Description (id, DocNo, ISBN, description) -- one document may have several ISBNs, etc.

In document table I have 500,000 records.

I want to search for a word in these tables. For example, I want to search for 'Computer'; this word may be in the subject, the title, the description, etc. How can I do this with the best performance?
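One answer that usually comes up for keyword search at this scale is full-text indexing rather than LIKE '%Computer%'. A minimal sketch against the Title table; the catalog name and the key index name (the table's unique index) are assumptions, and the same pattern would be repeated for Subject and Description.

CREATE FULLTEXT CATALOG LibraryCatalog;
CREATE FULLTEXT INDEX ON dbo.Title (Main_Title, Other_Title)
    KEY INDEX PK_Title ON LibraryCatalog;   -- PK_Title assumed to be the table's unique key index

SELECT t.DocNo
FROM dbo.Title AS t
WHERE CONTAINS((t.Main_Title, t.Other_Title), 'Computer');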

View 3 Replies View Related

SQL Server 2012 :: Merging Two Large Tables (More Than 100m Rows)

Aug 18, 2014

SQL 2012

I have a source table in the staging database stg.fact and it needs to be merged into the warehouse table whs.Fact.

stg.fact is not a delta feed it is basically an intra-day refresh.

Both tables have a last updated date so its easy to see which have changed.

It will be new (insert) or changed (update) data that I am interested in, there are no deletions.

As this could be in the millions of rows that are inserts or updates then this needs to be efficient.

I expect whs.Fact to go to >150 million rows.

When I have done this before I started with T-SQL Merge statement and that was not performant once I got to this size.

My original option was to do this in SSIS with a lookup task that marks the inserts and updates and deals with them separately. However, when I set up the lookup transformation, the reference data set needs a package variable in the SQL command, and this does not seem possible with the lookup in 2012! I am currently looking at the Merge Join transformation, and at any clever basic T-SQL that could work, as this will need to be fast; that's where I think T-SQL may be the better route.

Both tables will have >100,000,000 rows
Both tables have the last updated date
The Tables are in different databases but on the same SQL Instance
Each table holds 5 integer columns, one varchar and one datetime.

Last time I used Merge it was a wider table with lots of columns, so I don't know if this would be an option.
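A sketch of the plain T-SQL alternative to MERGE that this usually comes down to: two set-based statements driven by the last-updated date, one UPDATE for changed rows and one INSERT for new ones. The key and column names below are placeholders (the post doesn't list them), and since the tables sit in different databases they would need database-qualified names.

DECLARE @LastLoad datetime
SELECT @LastLoad = MAX(LastUpdatedDate) FROM whs.Fact

UPDATE f
SET    f.Col1 = s.Col1,                        -- repeat for the other columns
       f.LastUpdatedDate = s.LastUpdatedDate
FROM   whs.Fact AS f
JOIN   stg.fact AS s ON s.FactKey = f.FactKey
WHERE  s.LastUpdatedDate > @LastLoad

INSERT INTO whs.Fact (FactKey, Col1, LastUpdatedDate)
SELECT s.FactKey, s.Col1, s.LastUpdatedDate
FROM   stg.fact AS s
WHERE  NOT EXISTS (SELECT 1 FROM whs.Fact AS f WHERE f.FactKey = s.FactKey)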

View 6 Replies View Related

CREATE INDEX On Large Table

Jul 23, 2005

SQL Server 7/2000: We have reasonably large tables (3,000,000 rows) that we need to add some indexes to. In a test, it took over 12 hours to CREATE a new INDEX against this table. One of us suggested that we create a temp table with the new index and copy the data from the old table into the new one, then rename it. I understand this took 15 minutes. Why the heck would it be faster to move the data and build multiple indexes incrementally vs adding an index??

View 11 Replies View Related

SQL Server 2012 :: Create Row Number For A Consecutive Action

Nov 26, 2013

How to create a row number for a consecutive action. Example: I have a listing of people who have either completed a goal or not. I need to count by person the number of consecutively missed goals.

My sql table is this:
PersonId, GoalDate, GoalStatus (holds completed or missed)

My first thought was to use the ROW_NUMBER function; however, that doesn't work because someone could complete a goal, miss a goal, then complete one, and when they complete a goal after a missed goal the count has to start over.
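A sketch of the usual "gaps and islands" trick for counting consecutive missed goals, assuming the table is named dbo.Goals with the columns described above: the difference of two ROW_NUMBER sequences is constant within each unbroken run of the same status.

WITH Marked AS
(
    SELECT PersonId, GoalDate, GoalStatus,
           ROW_NUMBER() OVER (PARTITION BY PersonId ORDER BY GoalDate)
         - ROW_NUMBER() OVER (PARTITION BY PersonId, GoalStatus ORDER BY GoalDate) AS grp
    FROM dbo.Goals
)
SELECT PersonId,
       MIN(GoalDate) AS StreakStart,
       COUNT(*) AS ConsecutiveMissed
FROM Marked
WHERE GoalStatus = 'missed'
GROUP BY PersonId, grp
ORDER BY PersonId, StreakStart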

View 9 Replies View Related

SQL Server 2012 :: Adding A Number To A String To Create Series

Sep 3, 2014

I want to add a number to the end of an ID to create a series. For example, I have an EventID that may have many sub-events. If the EventID is 31206 and I want to have sub-events, I would like the following sequence. In this case, let's say I have 4 sub-events, so I want to check the EventID and then produce:

312061
312062
312063
312064

How can I check what the EventID is, then concatenate a sequence number by the EventID?
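A sketch of the string side of it, assuming the sub-event count is known up front (4 here): cast the EventID to text and append a generated sequence number.

DECLARE @EventID int = 31206
DECLARE @SubEventCount int = 4

SELECT CAST(@EventID AS varchar(20)) + CAST(n.seq AS varchar(10)) AS SubEventID
FROM (SELECT TOP (@SubEventCount)
             ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS seq
      FROM sys.all_objects) AS n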

View 9 Replies View Related

Dynamically Generating Large Views

Nov 24, 1999

Hi,

I need to regenerate a large view each month which performs a union across a number of partitioned tables. (DB is SQL Server 7)
The select statement for each table within the union is large and cannot be replaced with 'select *'.
One possible approach is to obtain the current view definition from syscomments and append an additional 'UNION select ... from ...' to the end. However, I'm having difficulties getting the full textual definition back into variables within the stored procedure. Getting the view definition from the information schema returns a truncated version.
Is there a way to get the whole of the definition back (it may be >8K)?

I have similar problems manipulating large text fields within the stored proc. If I try to build the entire view definition each period I hit the problem that EXEC cannot take a TEXT parameter (and a varchar isn't big enough).

There are some (messy) ways around this problem, but has anyone had a similar situation where large statements have had to be built up before passing them to an EXEC statement, and found a neat solution?
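For the specific EXEC limitation, a sketch of the workaround that was common on SQL Server 7/2000: EXEC accepts a concatenation of several varchar variables, so the statement can be assembled in chunks of under 8000 characters each. The view and table names below are placeholders.

DECLARE @s1 varchar(8000), @s2 varchar(8000)
SET @s1 = 'CREATE VIEW dbo.vAllPartitions AS SELECT col1, col2 FROM dbo.Part199901 '
SET @s2 = 'UNION ALL SELECT col1, col2 FROM dbo.Part199902'
EXEC (@s1 + @s2)   -- the concatenation happens before the dynamic batch runs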

View 1 Replies View Related

SQL Server 2012 :: Manage Concurrency When Users Need To Create Invoice Number

May 31, 2014

I have a DB, designed for a web application, that manages the creation of invoice numbers.

My problem is how to manage the concurrency when the users need to create an invoice number.
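One option specific to SQL Server 2012, sketched here: a SEQUENCE object hands out numbers atomically across concurrent sessions. Note that a number taken by a transaction that later rolls back is not reused, so gaps are possible; if the invoice scheme must be strictly gap-free, a one-row counter table updated inside the invoice transaction is the usual alternative.

CREATE SEQUENCE dbo.InvoiceNumber AS int START WITH 1 INCREMENT BY 1

DECLARE @NextInvoice int = NEXT VALUE FOR dbo.InvoiceNumber
SELECT @NextInvoice AS InvoiceNo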

View 9 Replies View Related

4 Seperate Tables Or One Large Table?

May 10, 2008

I have 4 tables with the respective amount of records
1) 6755
2) 2021
3) 2021
4) 355

They all have the same columns. However, they need to be separate, or at least kept separate when I query them. I'll be accessing this database via the web. I was at first afraid that a large table would cause a major slowdown when accessing the DB, so I broke it up into 4 tables. If I combined all 4 tables into one large table and just had a column that differentiated the 4, how significant would the change in speed be when accessing the table? It's not a big deal to keep them separate, it's just that when I have to add or remove a column from one table I have to do it in all the tables. Furthermore, I'm using a module from DevExpress (don't know if anyone has heard of it), and when you use a gridview it loads up the entire table even though you're paging (which I think is ridiculous), so for that reason I was afraid it would slow down my access to the DB. Any thoughts?

View 2 Replies View Related

Temp Tables Vs. Large Table

Aug 4, 2005

I have a few hundred users, maybe a dozen or two active at any given time, accessing the same database via ASP. The database has many tables, one being a very large orders table with a few million records, against which I have created a view. A view only because I need to allow the user to filter quite extensively against the results. The users typically only need to view records for the last 30 days, and the results for each user might be five thousand records or less.

My question is this: would I be better off writing each user's resultset to a temp table for that user's session, letting the user's filtering and sorting go against that temp table, and increasing my hardware requirements to accommodate that, possibly to the point of creating a database cluster? Or would I be better off leaving it as is, where each user uses the same view?

FYI, each user may need visibility to only a handful of fields, but overall the view must maintain many fields.

Any thoughts on this would be greatly appreciated. Thanks in advance.

Dave

View 2 Replies View Related

SQL Server 2012 :: Purge Process On A Large Table

Jan 9, 2014

I am attempting to do a rather simple purge task on a very large table. This task will need to take place daily and delete records older than 6 months out of the database. On first pass this will delete well over 130 million rows. I thought the best way to handle this is create a proc and call the proc from a SQL Agent Job that runs nightly. Here is an example of the script:

CREATE PROCEDURE usp_Purge_WCFLogger
AS
SET NOCOUNT ON;
-- Note: GO is a batch separator and cannot appear inside a procedure body.
EXEC sp_rename 'dbo.logs', 'logs_work';
SELECT * INTO dbo.Logs_Backup FROM dbo.Logs_Work WHERE [TIMESTAMP] < DATEADD(month, -6, GETDATE())

[Code] .....

View 3 Replies View Related

SQL 2012 :: Partitioning Large Table On Nullable Date

May 15, 2014

I have a very large table that I need to partition. Ideally the table will write to three filegroups. I have defined the Partition function and scheme as follows.

CREATE PARTITION FUNCTION vm_Visits_PM(datetime)
AS RANGE RIGHT FOR VALUES ('2012-07-01', '2013-06-01')
CREATE PARTITION SCHEME vm_Visits_PS
AS PARTITION vm_Visits_PM TO (vm_Visits_Data_Archive2, vm_Visits_Data_Archive, vm_Visits_Data)

This should create three partitions of the vm_Visits table. I am having a few issues; the first has to do with adding a new clustered primary key index to the existing table. The main issue here is that the Closed column is nullable (it is a datetime, by the way), so running the following makes SQL Server upset:

ALTER TABLE dbo.vm_Visits
ADD CONSTRAINT [PK_vm_Visits] PRIMARY KEY CLUSTERED
(
VisitID ASC,
Closed
)
ON [vm_Visits_PS](Closed)

I need to define a primary key on the VisitID column, but I need to include the Closed column in order to partition on it. I also need to know how I would move data between partitions on a monthly basis. Would I simply update the partition function, or do I have to do some sort of MERGE, SPLIT, or SWITCH operation?
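One workaround often suggested for the nullable-column restriction, sketched with the names from the post: partition on a non-unique clustered index and keep the primary key as a separate nonclustered constraint on VisitID alone. Note the trade-off that a non-aligned PK blocks partition SWITCH. For the monthly movement, the sliding-window pattern uses ALTER PARTITION FUNCTION ... SPLIT RANGE / MERGE RANGE rather than redefining the function from scratch.

CREATE CLUSTERED INDEX CIX_vm_Visits_Closed
    ON dbo.vm_Visits (Closed)
    ON vm_Visits_PS (Closed)

ALTER TABLE dbo.vm_Visits
    ADD CONSTRAINT PK_vm_Visits PRIMARY KEY NONCLUSTERED (VisitID)
    ON [PRIMARY]   -- non-aligned; keeps the key unique on VisitID alone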

View 2 Replies View Related

SQL Server 2012 :: How To Batch Delete Large Table

Jun 16, 2015

I have a table with about 466 million rows. In this table there is an int column called WeeksToRetain as well as an EventDate column containing the date the row was inserted. I am trying to delete all the rows that should be deleted according to WeeksToRetain. For example, if the EventDate is 5/07/15 with a 1 in the WeeksToRetain column, the row should be removed by 5/14/15. I am not sure what days SQL considers the beginning and end of the week. However, the core issue I am having is the sheer mass of deletions I must do and the resulting log growth.

So I am trying to do the delete in batches. More specifically, I want to load a temporary table with a million rows, then use it to load a sub temporary table with 100,000 rows, and join this temporary table to the table I want to delete from, looping through 10 times to get 1 million. The Logging.EvenLog table, which is the table I'm trying to purge, has a clustered index on EventDate (ASC). I would like to run this in a scheduled job with enough time between executions for log backups to run.

DECLARE @i int
DECLARE @RowCount int
DECLARE @NextBatchDate datetime
CREATE TABLE #BatchProcess
(
EventDate datetime,
ApplicationID int,

[Code] .....
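As a simpler alternative to the nested temp tables, a sketch of a straight batched delete keyed on the retention rule (one caveat: the DATEADD-on-column predicate cannot seek on the EventDate clustered index, so each batch scans; precomputing a cutoff date per WeeksToRetain value would be the refinement).

DECLARE @BatchSize int = 100000

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize)
    FROM Logging.EvenLog
    WHERE DATEADD(week, WeeksToRetain, EventDate) < GETDATE()

    IF @@ROWCOUNT = 0 BREAK   -- nothing left past its retention window
END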

View 9 Replies View Related

T-SQL (SS2K8) :: Create Separate MS Excel Files By Looping Through Large Table

Jun 24, 2014

I have a master table containing details of over 800000 surveys made up of approximately 400 distinct document names and versions. Each document can have as few as 10 questions but as many as 150. Each question represents one row.

My challenge is to create a separate spreadsheet for each of the 400 distinct document names and versions containing all the rows and columns present in the master table. The largest number of rows would be around 150 and therefore each spreadsheet will not be very big.

e.g. in my sample data below, I will need to create individual Excel files named as follows . . .
"Document1Version1.xlsx" containing all the column names and 6 rows for the 6 questions relating to Document 1 version 1
"Document1Version2.xlsx" containing all the column names and 8 rows for the 8 questions relating to Document 1 version 2
"Document2Version1.xlsx" containing all the column names and 4 rows for the 4 questions relating to Document 2 version 1

I assume that one of the first things is to create a lookup of the distinct document names and versions assign some variables and then use this lookup to loop through and sequentially filter the master table data ready for creating the individual Excel files.

--CREATE TEMP TABLE FOR EXAMPLE

IF OBJECT_ID('tempdb..#excelTest') IS NOT NULL DROP TABLE #excelTest
CREATE TABLE #excelTest (
[rowID] [nvarchar](10) NULL,
[docName] [nvarchar](50) NULL,

[Code] .....

--Output

rowID  docName    docVersion  question  blankField
1      document1  1           q1        NULL
2      document1  1           q2        NULL
3      document1  1           q3        NULL
4      document1  1           q4        NULL
5      document1  1           q5        NULL
6      document1  1           q6        NULL

[Code] .....
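A sketch of the lookup step described above, run against the sample temp table: one row per distinct document/version plus the workbook name it should produce, ready to drive a loop (a cursor, or an SSIS Foreach container) that filters the master table and writes each file. The docVersion column is assumed to be text or castable to it.

SELECT docName,
       docVersion,
       docName + 'Version' + CAST(docVersion AS varchar(10)) + '.xlsx' AS targetFile
FROM #excelTest
GROUP BY docName, docVersion
ORDER BY docName, docVersion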

View 9 Replies View Related

SQL 2012 :: Query To Understand Names Of All Available Tables / Number Of Records

Aug 31, 2014

What SQL query will return the names of all the available tables, the number of records in these tables, and the size of these tables?
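A sketch of one common catalog-view query for this: row counts from sys.partitions and approximate sizes from sys.allocation_units, grouped per table.

SELECT s.name AS schema_name,
       t.name AS table_name,
       SUM(CASE WHEN p.index_id IN (0, 1) THEN p.rows ELSE 0 END) AS row_count,
       SUM(a.total_pages) * 8 / 1024 AS total_size_mb
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
JOIN sys.partitions AS p ON p.object_id = t.object_id
JOIN sys.allocation_units AS a ON a.container_id = p.partition_id
GROUP BY s.name, t.name
ORDER BY total_size_mb DESC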

View 4 Replies View Related

SQL Server 2012 :: Maximum Number Of Global Temporary Tables?

Dec 9, 2014

What is the maximum number of global temporary tables that can be created in SQL Server?

View 4 Replies View Related

SQL Server 2012 :: How To Quickly Update / Insert 3M Records In Large Table

Mar 28, 2015

Our system runs a SQL Server 2012 DB. It has a table (table_a) with over 10M records. Our system has to receive a data file from the previous system daily, containing approximately 3M updated or new records for table_a. My job is to update table_a with the new data.

The initial solution is:

1. Create a table (table_b) whose structure is the same as table_a

2. Use BCP to import the updated records into table_b

3. Remove outdated data from table_a:
delete table_a from table_a inner join table_b on table_a.key_fields = table_b.key_fields

4. Append updated or new data into table_a:
insert into table_a select * from table_b

In testing, this solution is very inefficient; step 3 alone costs several hours. How can I improve it?
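The first thing usually checked here, sketched below: step 3 is a join delete between a 10M-row table and a 3M-row table, and without an index on the join key on both sides it degenerates into repeated scans. The column name follows the post.

-- Assumes key_fields is the imported join key; table_a should already be indexed
-- (ideally clustered) on the same key for the join delete to run as a merge or seek.
CREATE CLUSTERED INDEX IX_table_b_key ON dbo.table_b (key_fields)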

View 9 Replies View Related

SQL 2012 :: Create Statistics On Tables?

Apr 17, 2014

How do I create and update statistics on tables in SQL Server?
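A minimal sketch of the basic statements, with placeholder object names; note that SQL Server also creates and updates statistics automatically when AUTO_CREATE_STATISTICS and AUTO_UPDATE_STATISTICS are on.

CREATE STATISTICS st_Sales_CustomerKey ON dbo.Sales (customer_key)

UPDATE STATISTICS dbo.Sales WITH FULLSCAN

EXEC sp_updatestats   -- refreshes statistics database-wide where enough rows have changed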

View 9 Replies View Related

Create Table For Phone Number Formatting?

Jul 15, 2014

I am trying to get my SQL CREATE TABLE statement to work for my phone number formatting and it is not working. When I run the code below, the default is set to 3 numbers only.

CREATE SET TABLE dl_qpt_cqe.contacts, NO FALLBACK ,NO BEFORE JOURNAL,NO AFTER JOURNAL

(contact_id integer not null ,contact varchar(50) , jobtitle varchar(50), dept varchar(50), phone integer format '999-999-9999', phone_ext varchar(10), email varchar(50), constraint pk primary key (contact_id));

Is there some other way I need to format the phone portion so the default is 999-999-9999?
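Worth noting: the posted statement uses Teradata-style options (SET TABLE, FALLBACK, JOURNAL, FORMAT), which SQL Server does not accept, and an integer column cannot carry display formatting in SQL Server anyway. A sketch of one common SQL Server approach, storing the number as text and enforcing the 999-999-9999 pattern with a CHECK constraint:

CREATE TABLE dbo.contacts
(
    contact_id int NOT NULL CONSTRAINT pk_contacts PRIMARY KEY,
    contact    varchar(50),
    jobtitle   varchar(50),
    dept       varchar(50),
    phone      char(12) CONSTRAINT ck_contacts_phone
               CHECK (phone LIKE '[0-9][0-9][0-9]-[0-9][0-9][0-9]-[0-9][0-9][0-9][0-9]'),
    phone_ext  varchar(10),
    email      varchar(50)
)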

View 1 Replies View Related

Create Table With Variable Number Of Columns

Apr 12, 2008



Hi everyone,

I need to create a temporary table in one of the SPs. The problem is that the number of columns in the table will vary depending on the input to the SP.

How can I create a table with a variable number of columns?

Thanks,


Alex
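The usual answer is dynamic SQL, sketched here: build the CREATE TABLE text from the input and execute it. One caveat worth noting: a local #temp table created inside EXEC() disappears when that inner batch ends, so a global ##table (or putting the whole workload into the dynamic batch) is the common workaround. The column count and types below are placeholders.

DECLARE @cols int, @i int, @sql nvarchar(4000)
SET @cols = 3                       -- number of columns requested (assumed SP input)
SET @i = 1
SET @sql = N'CREATE TABLE ##work (id int'

WHILE @i <= @cols
BEGIN
    SET @sql = @sql + N', col' + CAST(@i AS nvarchar(10)) + N' nvarchar(100)'
    SET @i = @i + 1
END
SET @sql = @sql + N')'

EXEC (@sql)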

View 8 Replies View Related

SQL 2012 :: Create View From Multiple Tables That Have 1 To M Relationship

Jul 31, 2014

I have two tables, StudentTable and CourseTable, where each student takes more than one course (1:M). For example, Student1 takes 2 courses (C1, C2) and Student2 takes 3 courses (C1, C2, C3). I need to create a table/view that contains the student information from StudentTable plus all the courses and the score for each course from CourseTable, in one row.

For example: Row1 = Student1_Id, C1_code, C1_name, C1_Score, C2_code, C2_name, C2_Score; Row2 = Student2_Id, C1_code, C1_name, C1_Score, C2_code, C2_name, C2_Score, C3_code, C3_name, C3_Score

Since Student1 has just two courses, I should enter NULL in the 'Course 3' fields. My struggle is with the insert statement; I tried the following but it shows an error:

Insert Into Newtable ( St_ID, C1_code,c1_name, C1_Score ,C2_code ,C2_name,C2_score,C3_code ,C3_name,C3_score)
Select (Select St_ID from StudentTable) , (Select C_code,c_name,c_Score from Coursetable,SudentTable where course.Stid =Studet.stid) , (Select C_code,c_name,c_Score from course ,student where course.Stid =Studet.stid ), (Select C_code,c_name,c_Score from course ,student where course.Stid =Studet.stid );

I'm fully aware that the new table/view will break the rules of normalization, but I need it for a specific purpose. I also tried the PIVOT functionality, but had no luck with it. I also tried writing code in MATLAB (because it is high-level software that is easy to learn for people, like me, who are not expert in programming), but didn't know how to combine the student and course matrices in my loop.
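A sketch of one way to get that shape with conditional aggregation rather than PIVOT, assuming at most three courses per student and using the column names that appear in the attempted insert (St_ID, StId, C_code, C_name, C_Score):

WITH Ranked AS
(
    SELECT c.StId, c.C_code, c.C_name, c.C_Score,
           ROW_NUMBER() OVER (PARTITION BY c.StId ORDER BY c.C_code) AS rn
    FROM CourseTable AS c
)
SELECT s.St_ID,
       MAX(CASE WHEN r.rn = 1 THEN r.C_code  END) AS C1_code,
       MAX(CASE WHEN r.rn = 1 THEN r.C_name  END) AS C1_name,
       MAX(CASE WHEN r.rn = 1 THEN r.C_Score END) AS C1_Score,
       MAX(CASE WHEN r.rn = 2 THEN r.C_code  END) AS C2_code,
       MAX(CASE WHEN r.rn = 2 THEN r.C_name  END) AS C2_name,
       MAX(CASE WHEN r.rn = 2 THEN r.C_Score END) AS C2_Score,
       MAX(CASE WHEN r.rn = 3 THEN r.C_code  END) AS C3_code,
       MAX(CASE WHEN r.rn = 3 THEN r.C_name  END) AS C3_name,
       MAX(CASE WHEN r.rn = 3 THEN r.C_Score END) AS C3_Score
FROM StudentTable AS s
LEFT JOIN Ranked AS r ON r.StId = s.St_ID
GROUP BY s.St_ID

-- Students with fewer than three courses come back with NULL in the unused columns.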

View 5 Replies View Related

Generating Membership Number

Jan 20, 2005

Hi all,
I have a question about generating membership numbers on the fly when someone registers to my website.

Rather than using the auto increment field as a membership number, I would rather keep it just as the ID for the record, and I would like to have a separate membership number that looks something similar to this...

SR357324J

This will then stay with them for the lifetime of their membership and be on their printed loyalty card.

My questions are...
1) Is there a 'good practice' for membership number format and generation?

2) If this was used as a unique field, is there a degradation in performance when looking up records due to it being alphanumeric.

I may be well off base here, however these are my thoughts so far and your opinion/help is greatly appreciated.

Thanks for your contribution.

View 2 Replies View Related

Generating Sequence Number

Aug 29, 2001

Hello,

I need to know how to generate a sequence number, for example from 300,000 to 900,000, without skipping any number due to failure. For example, if user 1 requests a number then he/she will get 300000 in a transaction, and user 2 will get 300001. However, if user 1's transaction fails, then the next request should get 300000. Is it possible to do this in SQL2K? If so, how do I create a table and stored procedure that can do this?

Thank you so much.

NK
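A sketch of the gap-free pattern usually used on SQL 2000 for this: keep the next value in a one-row counter table and take it inside the same transaction as the work that consumes it, so a rollback also returns the number. Table and column names are placeholders.

CREATE TABLE dbo.Counter (NextVal int NOT NULL)
INSERT INTO dbo.Counter (NextVal) VALUES (300000)

BEGIN TRAN
    DECLARE @n int
    UPDATE dbo.Counter
    SET @n = NextVal, NextVal = NextVal + 1   -- atomic read-and-increment; blocks other takers until commit

    -- ... insert the business row that uses @n here; any rollback releases the number again ...
COMMIT TRAN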

View 3 Replies View Related






