SQL Server Admin 2014 :: Storing Very Large Tables?

Jan 24, 2015

We are in the middle of re-designing a few tables (namely transaction tables) that will store very large amounts of data and will be hosted in the cloud (Azure). The old design of this product breaks transaction tables into monthly tables; for example, the ORDERS table is physically split into twelve monthly tables per year, such as ORDERS0115 (mmyy), ORDERS0215, and so on.

We are of the opinion that keeping all transactions in one table is better. What are the best practices for transaction tables like the one mentioned above? Is it better to use one table with partitions? I read somewhere that partitions can slow down SELECT queries if they are not designed and thought through properly. Since this will be hosted in the cloud (Azure), are there additional things to take care of? How does a site like Amazon keep its transaction tables?
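
If a single partitioned table is chosen, a minimal sketch of monthly RANGE RIGHT partitioning might look like the following; the table, column, and filegroup choices are assumptions, and a maintenance job would add new boundaries as months arrive.

-- Hypothetical monthly partitioning for an ORDERS table (names are assumptions)
CREATE PARTITION FUNCTION pf_OrdersMonthly (datetime2(0))
AS RANGE RIGHT FOR VALUES ('2015-01-01', '2015-02-01', '2015-03-01', '2015-04-01');

CREATE PARTITION SCHEME ps_OrdersMonthly
AS PARTITION pf_OrdersMonthly ALL TO ([PRIMARY]);

CREATE TABLE dbo.Orders
(
    OrderID    bigint IDENTITY(1,1) NOT NULL,
    OrderDate  datetime2(0)         NOT NULL,
    CustomerID int                  NOT NULL,
    Amount     decimal(18,2)        NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderDate, OrderID)
) ON ps_OrdersMonthly (OrderDate);

-- Add a new boundary each month so data keeps landing in fresh partitions
ALTER PARTITION SCHEME ps_OrdersMonthly NEXT USED [PRIMARY];
ALTER PARTITION FUNCTION pf_OrdersMonthly() SPLIT RANGE ('2015-05-01');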

View 8 Replies

SQL Server Admin 2014 :: Columnstore Index On Large Tables

Jul 1, 2015

I created a columnstore index on a table with 20 columns and about 1,000,000,000 rows.

About 5 million rows are added every day.

SELECT queries became faster because of batch mode, and the table needs less disk space than before.

I also have 6 similar tables with 5,000,000,000 rows each and plan to move them to columnstore indexes as well.

The server has 128 GB of RAM.

What pitfalls could I face if I have so many columnstore indexes on one server?

How could I spot problems in the DMVs?
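
One place to start is the row-group metadata. A sketch like the one below, using SQL 2014's sys.column_store_row_groups catalog view (clustered columnstore assumed), shows open delta stores, deleted rows, and row-group sizes per table, which is where compression and space problems usually show up first.

-- Row-group health per columnstore table
SELECT  OBJECT_NAME(rg.object_id)        AS table_name,
        rg.partition_number,
        rg.state_description,            -- OPEN delta store vs COMPRESSED row groups
        COUNT(*)                         AS row_group_count,
        SUM(rg.total_rows)               AS total_rows,
        SUM(rg.deleted_rows)             AS deleted_rows,
        SUM(rg.size_in_bytes) / 1048576  AS size_mb
FROM    sys.column_store_row_groups AS rg
GROUP BY rg.object_id, rg.partition_number, rg.state_description
ORDER BY table_name, rg.partition_number;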

View 3 Replies View Related

SQL Server Admin 2014 :: How To Analyze Large Procedure Cache

Jun 15, 2015

I want to analyze the procedure cache to find inefficient plans and parameter issues.

I do it through the DMVs, but my requests against the DMVs are slow and demand resources because the procedure cache is several GB in size. I don't actually need online analysis.

Is it possible to take a fast snapshot of the procedure cache?
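
One low-impact approach is to copy the plan-cache statistics into a snapshot table once and run the heavy analysis against that copy. A minimal sketch; the snapshot table name is an assumption.

-- Take a one-off snapshot of plan-cache statistics into a work table
SELECT  GETDATE()               AS snapshot_time,
        qs.plan_handle,
        qs.sql_handle,
        qs.execution_count,
        qs.total_worker_time,
        qs.total_logical_reads,
        qs.total_elapsed_time,
        st.text                 AS sql_text
INTO    dbo.PlanCacheSnapshot   -- hypothetical snapshot table
FROM    sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st;

-- Analyze the static copy instead of hitting the DMVs repeatedly
SELECT TOP (50) *
FROM   dbo.PlanCacheSnapshot
ORDER BY total_worker_time / NULLIF(execution_count, 0) DESC;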

View 0 Replies View Related

SQL Server Admin 2014 :: Database Mirroring For Large Number Of Databases

Oct 27, 2015

I have a 2-node cluster with 4 cores each, running 3 instances of SQL 2008 R2 Enterprise with 60 databases in total, 20 per instance. I need to set up mirroring for each of the databases to a secondary server that also has 4 cores and 3 instances. What I understand is that in this case the mirror server will provide a maximum of 512 worker threads and the 60 mirrored databases would consume about 240 of them. What needs to be checked to assess the feasibility of going ahead with an asynchronous mirroring setup as described above?
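
To sanity-check thread headroom on the proposed mirror server, the configured worker-thread ceiling can be compared with the workers actually allocated; a quick sketch using the OS-level DMVs:

-- Compare the configured worker-thread ceiling with workers currently allocated
SELECT  si.max_workers_count                      AS configured_max_workers,
        (SELECT COUNT(*) FROM sys.dm_os_workers)  AS workers_allocated,
        (SELECT COUNT(*) FROM sys.dm_os_tasks)    AS tasks_in_flight
FROM    sys.dm_os_sys_info AS si;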

View 0 Replies View Related

SQL Server Admin 2014 :: MDW Data Collector - Large Number Of SPIDs

Nov 6, 2015

I've installed the MDW (Management Data Warehouse) database on our central monitoring SQL Server. I've then added a number of servers to be monitored. The data is collected on the servers being monitored and uploaded to the central MDW monitoring server.

On the servers that are being monitored, I'm seeing a large number (over 1000) of SPIDs being generated by 'SQL Server Data Collector'.

Is this normal behaviour? I've seen more blocking as a result of this.

Is there any way to reduce the number of SPIDs generated?
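
To see exactly which sessions the collector is holding open on a monitored server, and whether any of them are involved in blocking, something like the following can be run there; the program_name filter matches the 'SQL Server Data Collector' sessions mentioned above.

-- List data-collector sessions and whether each one is currently blocked
SELECT  s.session_id,
        s.login_time,
        s.status,
        s.program_name,
        r.blocking_session_id
FROM    sys.dm_exec_sessions AS s
LEFT JOIN sys.dm_exec_requests AS r
       ON r.session_id = s.session_id
WHERE   s.program_name LIKE '%Data Collector%'
ORDER BY s.login_time;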

View 0 Replies View Related

SQL Server Storing Large Amounts Of Data In Multiple Tables

Jul 20, 2005

Hello,

Currently we have a database, and it is our desire for it to be able to store millions of records. The data in the table can be divided up by client, and it stores nothing but about 7 integers:

| table | id | clientId | int1 | int2 | int3 | ... |

Right now, our benchmarks indicate a drastic increase in performance if we divide the data into different tables, for example table_clientA, table_clientB, table_clientC, despite the fact that the tables contain the exact same columns. This, however, does not seem very clean or elegant to me, and rather illogical, since a database exists as a single file on the hard drive.

| table_clientA | id | clientId | int1 | int2 | int3 | ... |
| table_clientB | id | clientId | int1 | int2 | int3 | ... |
| table_clientC | id | clientId | int1 | int2 | int3 | ... |

Is there any way to duplicate this increase in database performance gained by splitting the table, perhaps by using a certain type of index?

Thanks,
Jeff Brubaker
Software Developer
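
A clustered index that leads on clientId usually gives much of the locality benefit of the per-client tables, because each client's rows end up physically contiguous. A minimal sketch; the table and column names are taken from the post, the index name is an assumption.

-- Single table, clustered so each client's rows are stored together
CREATE TABLE dbo.client_data
(
    id       int IDENTITY(1,1) NOT NULL,
    clientId int NOT NULL,
    int1     int NOT NULL,
    int2     int NOT NULL,
    int3     int NOT NULL,
    CONSTRAINT PK_client_data PRIMARY KEY NONCLUSTERED (id)
);

CREATE CLUSTERED INDEX CIX_client_data_clientId
    ON dbo.client_data (clientId, id);

-- Per-client queries then touch only that client's range of the clustered index
SELECT int1, int2, int3
FROM   dbo.client_data
WHERE  clientId = 42;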

View 4 Replies View Related

SQL Server Admin 2014 :: Rebuild CMS Tables From XML File?

Jul 23, 2014

I had to reinstall my local copy of SQL Server a few weeks ago, which naturally overwrote the

msdb.dbo.sysmanagement_shared_server_groups_internal and
msdb.dbo.sysmanagement_shared_registered_servers_internal tables.

However, I still have the local XML file that SSMS reads, so I can still access the groups; I just get odd errors when trying to re-register my install as the new CMS. How can I rebuild those tables from the XML file, or is there another way to repopulate them?

View 3 Replies View Related

SQL Server Admin 2014 :: Partitioning Master And 4 Child Tables

Jul 5, 2014

I have 6 tables which are very large in row count and need to be partitioned for better manageability.

A little info: every day, 300 million records are inserted into and 300 million records are deleted from the tables below. We maintain only 8 days' worth of data in these tables, which is why records older than 8 days are continuously deleted.

Master table, which has [ID], [Timestamp]:
Table name: Sample - 2,578,106 rows

Child tables (the foreign key [ID] is common to all of them; there is no timestamp column in the child tables):
dbo.ConnectionDB - 1,147,578,048
dbo.ConnectionSS - 876,458,321
dbo.ConnectionRT - 118,133,857
dbo.ConnectionSample - 100,038,535
dbo.Command - 100,032,235

I would like to partition the child tables above based on the IDs that are inserted every 4 hours; that is, all IDs inserted within a given 4-hour window should land in one partition.
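
Since the child tables only carry the ID, one pattern is to partition them on ID ranges and have a scheduled job derive each new boundary ID from the master table's timestamp every 4 hours. A rough sketch; the function and scheme names, the boundary values, and the assumption that IDs are ascending bigints are all illustrative.

-- Partition child tables on ID ranges that correspond to 4-hour windows
CREATE PARTITION FUNCTION pf_ConnectionById (bigint)
AS RANGE RIGHT FOR VALUES (100000000, 200000000, 300000000);   -- placeholder boundaries

CREATE PARTITION SCHEME ps_ConnectionById
AS PARTITION pf_ConnectionById ALL TO ([PRIMARY]);

-- Every 4 hours, look up the first ID of the new window in the master table
-- and add it as the next boundary
DECLARE @boundary bigint =
    (SELECT MIN(ID) FROM dbo.Sample WHERE [Timestamp] >= DATEADD(HOUR, -4, GETDATE()));

ALTER PARTITION SCHEME ps_ConnectionById NEXT USED [PRIMARY];
ALTER PARTITION FUNCTION pf_ConnectionById() SPLIT RANGE (@boundary);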

View 1 Replies View Related

SQL Server Admin 2014 :: In-Memory Processing - Existing Tables

Nov 11, 2014

Is there a method of forcing existing tables into the in-memory filegroup so the table data can benefit from in-memory processing?
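
In SQL 2014 a disk-based table cannot simply be altered into the memory-optimized filegroup; the usual route is to create a new memory-optimized table with the same shape and copy the data across. A minimal sketch, assuming the memory-optimized filegroup already exists and using hypothetical table and column names.

-- New memory-optimized copy of an existing disk-based table
CREATE TABLE dbo.Orders_InMem
(
    OrderID   int           NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    OrderDate datetime2(0)  NOT NULL,
    Amount    decimal(18,2) NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

-- Copy the data, then repoint the application at the new table in a maintenance window
INSERT INTO dbo.Orders_InMem (OrderID, OrderDate, Amount)
SELECT OrderID, OrderDate, Amount
FROM   dbo.Orders;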

View 7 Replies View Related

SQL Server Admin 2014 :: Update Statistics On Frequently Updated Tables

Dec 23, 2014

I'm working on databases where the statistics of some indexes (tables) change too frequently. One minute after I update them manually they show a 10-20% change, and five minutes later the change is over 100%. The tables are updated very frequently (multiple times per second).

When I run a query against sys.stats, sys.dm_db_stats_properties and other dynamic views, I see that the statistics were last updated when I did it manually, even though the change rate has passed the 500 + 20% threshold (the tables have tens of thousands of rows). Auto create and auto update statistics are set to true on all databases, and I don't know why SQL Server does not update them automatically.
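
A query like the one below, using sys.dm_db_stats_properties, shows per statistic the last update time and the modification counter, which makes it easy to confirm whether the 500 + 20% threshold has really been crossed since the last update. The table name is a placeholder.

-- Last update time and row modifications per statistic for one table
SELECT  s.name                AS stats_name,
        sp.last_updated,
        sp.rows,
        sp.rows_sampled,
        sp.modification_counter,
        0.20 * sp.rows + 500  AS classic_auto_update_threshold
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   s.object_id = OBJECT_ID('dbo.MyFrequentlyUpdatedTable')
ORDER BY sp.modification_counter DESC;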

View 2 Replies View Related

SQL Server Admin 2014 :: Import Tables From Azure To Premise Database

Oct 26, 2015

We would like to import some tables from a Microsoft Azure data mart into a local database on a server on our premises.

View 4 Replies View Related

SQL Server Admin 2014 :: SSMS - Disable Check For Memory Optimized Tables?

Oct 2, 2014

I have the following setup:

- An MSSQL 2014 Standard server that houses multiple small databases (in excess of a hundred).
- These databases are frequently dropped and restored by an application that uses this SQL Server.
- There is a business need for this setup at this time, so I can't get away from it. Therefore answers like "don't have so many small databases that are frequently dropped and restored" would be somewhat unhelpful.

This is the problem I have:

- When I connect SSMS 2014 to the server and expand the "Databases" node, it takes forever to display. In comparison, SSMS 2008 connected to SQL 2008R2 server with the same number of databases displays the Databases tree very quickly.

I ran a trace to see what exactly SSMS 2014 is doing. When the "Databases" node is expanded, it runs a query that checks each database for memory-optimized tables (a new and wonderful feature of SQL 2014 for sure, but I'm not using it, at least yet). Naturally, when it has to loop through over a hundred DBs, this takes time. Worse yet, if one of these DBs is in the process of being restored, the query sits and waits to time out before proceeding to the next DB. Sometimes this causes outright timeouts. Here is the query:

use [MyDatabase]
SELECT
ISNULL((select top 1 1 from sys.filegroups FG where FG.[type] = 'FX'), 0) AS [HasMemoryOptimizedObjects]

To be sure, this is NOT a SQL Server performance issue. This server processes a rather heavy workload and has been doing so for over a month, and the workload completes within expected time limits or better. Even so I've done some basic performance measuring, and the server itself is quite all right.

Moreover, if I connect SSMS 2008 to it, I get an error message ("Index out of bounds" or some such), but SSMS 2008 does connect and displays the Databases tree much faster than SSMS 2014.

I'd like to turn off the option to check for Memory Optimized Objects altogether, as I'm not using the feature.

View 3 Replies View Related

SQL Server Admin 2014 :: Replication - Subscription Database Created But Tables Not Populated

Nov 6, 2015

As per the attachment, I have created the replication, but nothing is populated in the local subscription. The subscription database has been created, but its tables are not populated from the publication tables.

View 2 Replies View Related

SQL Server Admin 2014 :: Does Security-admin Role Plus Deny Alter Any Login Cancel Each Other Out

Aug 27, 2015

I want to set up a database role so that users can use sp_readerrorlog through SSMS. It does a check on membership in the securityadmin role.

I have tested it and can see you can grant execute on xp_readerrorlog but the SSMS GUI uses sp_readerrorlog.

I thought I could create a user/certificate and add the signature to sp_readerrorlog but it's not permitted (likely because it's not a normal database object).

So the other solution is to add the users to the securityadmin role but then explicitly deny ALTER ANY LOGIN (best done with a custom server role in 2012+, but otherwise just manually in 2008). I tested this out and it works: I'm not able to alter any logins or increase my own permissions. I also checked what's reported from fn_my_permissions(null, null), and it shows minimal permissions, as I'd expect.
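
On 2012 and later the same idea can be written as a custom server role, roughly as follows; the role and login names are hypothetical.

-- Custom server role: error-log access via securityadmin, but no rights over logins
CREATE SERVER ROLE ErrorLogReaders;
ALTER SERVER ROLE securityadmin ADD MEMBER ErrorLogReaders;

DENY ALTER ANY LOGIN TO ErrorLogReaders;

-- Put the relevant users into the custom role
ALTER SERVER ROLE ErrorLogReaders ADD MEMBER [DOMAIN\HelpdeskUser];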

View 0 Replies View Related

Storing Large Files In The SQL Server 2005

Feb 9, 2006

I have a table that I'm inserting a file into, using the Image data type to store the binary object. Now the code below works fine for files around 1.5 MB, but for anything larger it's as if the code won't even execute, and I get a "Page Not Found" error.
I'm in the process of running some traces to find out what's going on in the backend, but I'm assuming there's something amiss with my code. The Image data type should handle files that size with no problem, but for some reason it isn't.
Does anyone see anything wrong?
Thanks
Dim iLength As Integer = CType(File1.PostedFile.InputStream.Length, Integer)
If iLength = 0 Then Exit Sub 'not a valid file
Dim sContentType As String = File1.PostedFile.ContentType
Dim sFileName As String, i As Integer
Dim bytContent As Byte()
ReDim bytContent(iLength - 1) 'byte array sized to the file (VB upper bound is length - 1)

'strip the path off the filename
i = InStrRev(File1.PostedFile.FileName.Trim, "\")
If i = 0 Then
    sFileName = File1.PostedFile.FileName.Trim
Else
    sFileName = Right(File1.PostedFile.FileName.Trim, Len(File1.PostedFile.FileName.Trim) - i)
End If

conn = New SqlConnection(eco)
conn.Open()
cmd = New SqlCommand("INSERT INTO ECO_Attachments (ECOID, FromType, DocName, OldRev, NewRev, NtLogin, DisplayName, FileName, FileSize, FileData, ContentType) VALUES (@ECOID, @FromType, @DocName, @OldRev, @NewRev, @NtLogin, @DisplayName, @FileName, @FileSize, @FileData, @ContentType)")
cmd.Connection = conn
Try
    'read the whole upload into the byte array
    File1.PostedFile.InputStream.Read(bytContent, 0, iLength)
    With cmd
        .Parameters.Add("@ECOID", SqlDbType.Int)
        .Parameters.Add("@FromType", SqlDbType.NVarChar, 50)
        .Parameters.Add("@DocName", SqlDbType.NVarChar, 250)
        .Parameters.Add("@OldRev", SqlDbType.NVarChar, 50)
        .Parameters.Add("@NewRev", SqlDbType.NVarChar, 50)
        .Parameters.Add("@NTLogin", SqlDbType.NVarChar, 100)
        .Parameters.Add("@DisplayName", SqlDbType.NVarChar, 200)
        .Parameters.Add("@FileName", SqlDbType.NVarChar, 255)
        .Parameters.Add("@FileSize", SqlDbType.Real)
        .Parameters.Add("@FileData", SqlDbType.Image)
        .Parameters.Add("@ContentType", SqlDbType.NVarChar, 50)
        .Parameters("@ECOID").Value = ECOID
        .Parameters("@FromType").Value = From
        .Parameters("@DocName").Value = DocName
        .Parameters("@OldRev").Value = OldRev
        .Parameters("@NewRev").Value = NewRev
        .Parameters("@NTLogin").Value = NTLogon
        .Parameters("@DisplayName").Value = DisplayName
        .Parameters("@FileName").Value = sFileName
        .Parameters("@FileSize").Value = iLength
        .Parameters("@FileData").Value = bytContent
        .Parameters("@ContentType").Value = sContentType
        .ExecuteNonQuery()
    End With
Catch ex As Exception
    Response.Write(ex)
    'Handle your database error here
Finally
    conn.Close() 'always release the connection, success or failure
End Try

View 1 Replies View Related

SQL Server Admin 2014 :: Encryption Key Not Known

Apr 14, 2015

I inherited a lot of servers to upgrade to 2014, including an SSRS server.

The encryption key was never backed up, and it seems that no one knows what the password is.

Do I have to manually reload the reports? There are a lot of reports.

[URL]

View 4 Replies View Related

SQL Server Admin 2014 :: BCP Aborts On First FK Violation

Sep 26, 2013

I want to use BCP to load data from a text file.

By default, constraint checking is turned off in bcp, so I use the CHECK_CONSTRAINTS hint.

bcp aborts if ANY of the rows contains an FK violation. No data gets loaded.

If I add the -b 1 batch-size option, it loads all data up to the first FK violation, but nothing after that.

I want to load EVERYTHING except the violations, but bcp won't let me. Is there a way?
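
One workaround is to bcp into an unconstrained staging table and then move only the rows that satisfy the foreign key; the valid rows load and the violations are kept aside. A rough sketch with hypothetical table and column names.

-- 1. bcp loads everything into a staging table that has no constraints:
--    bcp MyDb.dbo.OrderStaging in orders.txt -c -T -S MyServer
-- 2. Move only the rows whose foreign key actually resolves
INSERT INTO dbo.Orders (OrderID, CustomerID, Amount)
SELECT s.OrderID, s.CustomerID, s.Amount
FROM   dbo.OrderStaging AS s
WHERE  EXISTS (SELECT 1 FROM dbo.Customers AS c WHERE c.CustomerID = s.CustomerID);

-- 3. Keep the rejected rows for inspection
SELECT s.*
INTO   dbo.OrderStaging_Rejects
FROM   dbo.OrderStaging AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Customers AS c WHERE c.CustomerID = s.CustomerID);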

View 2 Replies View Related

SQL Server Admin 2014 :: What Is The Default SA Password

Jan 13, 2014

If I install an instance with Windows-only authentication and then change it to Mixed Mode, when I enable the sa login the password has already been set. What is the default? If it's generated, how secure is it? What algorithm is used to generate it?

View 9 Replies View Related

SQL Server Admin 2014 :: How To Restore MDF File

Mar 21, 2014

My SQL databases in SQL Server 2014 have the status "suspect", as I see in SQL Server Management Studio. I can't restore the databases to a serviceable condition through the standard procedures. I need to restore from the .mdf file.
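
If only the .mdf is usable and there is no valid backup, two last-resort approaches are attaching the data file and rebuilding the log, or putting the database into EMERGENCY mode and repairing it. Both can lose data, so the following is only a starting sketch; the database name and path are assumptions.

-- Option 1: attach the data file and let SQL Server rebuild a missing or damaged log
CREATE DATABASE MyDb
ON (FILENAME = N'D:\Data\MyDb.mdf')
FOR ATTACH_REBUILD_LOG;

-- Option 2: if the database is already attached but marked suspect
ALTER DATABASE MyDb SET EMERGENCY;
ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DBCC CHECKDB (N'MyDb', REPAIR_ALLOW_DATA_LOSS);   -- may discard data
ALTER DATABASE MyDb SET MULTI_USER;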

View 9 Replies View Related

SQL Server Admin 2014 :: Return One Field / One Row

Jun 18, 2014

I am using a monitoring system where I can monitor a numeric SQL result, provided the result is one field and one row. I would like to use this to monitor, say, the free space or free-space percentage of the master database. DBCC SQLPERF gives me several columns and results for all databases on the server.
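
One way to get a single scalar is to compute the free space directly from sys.database_files with FILEPROPERTY, which returns exactly one field and one row; a sketch for master's primary data file:

-- Single-value result: percentage of free space in master's primary data file
USE master;
SELECT CAST(100.0 * (size - FILEPROPERTY(name, 'SpaceUsed')) / size AS decimal(5,2)) AS free_space_percent
FROM   sys.database_files
WHERE  file_id = 1;   -- size and SpaceUsed are both in 8 KB pages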

View 2 Replies View Related

SQL Server Admin 2014 :: DNS Pointing To A Listener

Jun 25, 2014

In our environment, applications use a DNS name which points to the physical server's IP address. Now we are planning to move to 2014. We are planning to have servers in different subnets, so the listener will have two IP addresses. How can we point the DNS name to the listener IPs? If a failover happens, can DNS resolve to the listener IP address that belongs to the current primary node?

View 1 Replies View Related

SQL Server Admin 2014 :: How To Schedule A Job To Run At Different Intervals

Jul 31, 2014

Is there a way to schedule a SQL Agent job to run at different intervals?

For example, the job should run at
7:00 AM,
8:00 AM,
and then at 10:00 AM.
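
A single job can carry several schedules, so one approach is to attach one schedule per required time with msdb.dbo.sp_add_jobschedule; a sketch, assuming the job is named 'MyJob'.

-- Three daily schedules attached to the same job (job and schedule names are assumptions)
USE msdb;
EXEC dbo.sp_add_jobschedule @job_name = N'MyJob', @name = N'Daily 07:00',
     @freq_type = 4, @freq_interval = 1, @active_start_time = 070000;
EXEC dbo.sp_add_jobschedule @job_name = N'MyJob', @name = N'Daily 08:00',
     @freq_type = 4, @freq_interval = 1, @active_start_time = 080000;
EXEC dbo.sp_add_jobschedule @job_name = N'MyJob', @name = N'Daily 10:00',
     @freq_type = 4, @freq_interval = 1, @active_start_time = 100000;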

View 3 Replies View Related

SQL Server Admin 2014 :: Non-yielding On Scheduler

Nov 19, 2014

We installed SQL 2014 on a test server.

We frequently observe the following:

"Process 0:0:0 (0x1e10) Worker 0x00000006B6D341A0 appears to be non-yielding on Scheduler 13. Thread creation time: 12906028806348. Approx Thread CPU Used: kernel 0 ms, user 0 ms. Process Utilization 13%. System Idle 84%. Interval: 70189 ms."

Is it better to run Profiler or performance counters?

What filters should we select in Profiler to monitor the SQL Server?
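
Before setting up a trace, the scheduler DMV already shows whether any scheduler is backed up with runnable tasks or pending I/O at the moment the message appears; a lightweight check that can be run repeatedly:

-- Per-scheduler load: long runnable queues or many pending IOs indicate CPU or disk pressure
SELECT  scheduler_id,
        [status],
        current_tasks_count,
        runnable_tasks_count,
        active_workers_count,
        pending_disk_io_count,
        work_queue_count
FROM    sys.dm_os_schedulers
WHERE   scheduler_id < 255          -- visible (user) schedulers only
ORDER BY runnable_tasks_count DESC;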

View 0 Replies View Related

SQL Server Admin 2014 :: Reporting Services Through IIS?

Dec 8, 2014

I have a SQL Server box running 2014 Reporting Services. I have another server running IIS v8.

I would like to be able to connect to the IIS site and be presented with the SSRS report browser.

So externally, if I browse to [URL], I am presented with the Report Server interface, the same as if I browse to http://xxx.xxx.xxx.xxx/reports internally.

What are my options?

View 4 Replies View Related

SQL Server Admin 2014 :: DB Read Only Copy

Dec 11, 2014

What is the best approach for a read-only copy of a database that is about 1 TB? The primary database is fed nightly by an ETL process. We are currently trying to duplicate the ETL onto the read-only server, but that process is not going well, so we are looking at other options that let SQL Server make the copy.

The primary database is on Windows 2012 R2 with SQL 2012 or 2014, a 2-node active/passive failover cluster.

The read-only copy will be on Windows 2012 R2 with SQL 2012 or 2014. It is not a requirement to fail over to the read-only copy if the primary goes down.

What would be the best approach to accomplish the end result?
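
One option that fits "read-only access, no failover required" is log shipping with the secondary restored WITH STANDBY, which leaves the copy readable between restores. The core restore steps, sketched with hypothetical database and path names:

-- On the read-only server: restore the full backup, then each log backup, in STANDBY mode
RESTORE DATABASE MyDb
FROM DISK = N'\\backupshare\MyDb_full.bak'
WITH STANDBY = N'D:\Standby\MyDb_undo.dat', REPLACE;

RESTORE LOG MyDb
FROM DISK = N'\\backupshare\MyDb_log_0800.trn'
WITH STANDBY = N'D:\Standby\MyDb_undo.dat';
-- Users can query MyDb (read-only) whenever a restore is not in progress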

View 3 Replies View Related

SQL Server Admin 2014 :: Possible Errors In AlwaysOn

Jan 12, 2015

I have not worked with AlwaysOn Availability Groups in SQL 2012. What are the possible errors with AlwaysOn in SQL 2012, and how are they resolved?

View 1 Replies View Related

SQL Server Admin 2014 :: Script To Failover

Jan 18, 2015

I have 10 databases which are configured as principals in mirroring. I need to fail over all the databases as part of a planned failover. Instead of writing an ALTER DATABASE ... SET PARTNER FAILOVER statement for each database, is there a script which will generate the failover commands for all databases that are currently principal?
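
A sketch that builds the failover commands from sys.database_mirroring for every database where the local server is currently the principal; print first, then execute once the output has been reviewed.

-- Generate ALTER DATABASE ... SET PARTNER FAILOVER for all principal databases
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'ALTER DATABASE ' + QUOTENAME(DB_NAME(dm.database_id))
             + N' SET PARTNER FAILOVER;' + CHAR(13) + CHAR(10)
FROM   sys.database_mirroring AS dm
WHERE  dm.mirroring_role_desc = 'PRINCIPAL';

PRINT @sql;                       -- review the generated commands
-- EXEC sys.sp_executesql @sql;   -- uncomment to run them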

View 3 Replies View Related

SQL Server Admin 2014 :: Delay In Between Replicas In AG

Feb 18, 2015

How do I check the delay between replicas in an AG?
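
The per-database replica DMV carries the send and redo queue sizes plus the last commit time, from which the lag can be read directly; a sketch to run on the primary:

-- Send/redo queues and last commit time per database and replica
SELECT  ar.replica_server_name,
        DB_NAME(drs.database_id)  AS database_name,
        drs.synchronization_state_desc,
        drs.log_send_queue_size,  -- KB not yet sent to the secondary
        drs.redo_queue_size,      -- KB received but not yet redone
        drs.last_commit_time
FROM    sys.dm_hadr_database_replica_states AS drs
JOIN    sys.availability_replicas AS ar
        ON ar.replica_id = drs.replica_id
ORDER BY ar.replica_server_name, database_name;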

View 3 Replies View Related

SQL Server Admin 2014 :: OST File Is Not Accessible

Feb 26, 2015

My 4 GB .OST file is corrupted, and I am not able to access my mail and contacts.

View 2 Replies View Related

SQL Server Admin 2014 :: How To Identify The Job With The Code

Apr 7, 2015

I got a query from sysprocesses, and the result for one SPID is: SQLAgent - TSQL JobStep (Job 0xF3BDA6DFF9DCF94D81FF97C49EDA7C61 : Step 1).

Which job has this code: 0xF3BDA6DFF9DCF94D81FF97C49EDA7C61?
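
The hex string in the program name is the job_id in binary form, so converting it back to a uniqueidentifier and looking it up in msdb returns the job name; a sketch:

-- Map the hex job id from the program name back to the Agent job name
SELECT  j.job_id,
        j.name AS job_name
FROM    msdb.dbo.sysjobs AS j
WHERE   j.job_id = CONVERT(uniqueidentifier, 0xF3BDA6DFF9DCF94D81FF97C49EDA7C61);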

View 1 Replies View Related

SQL Server Admin 2014 :: SSMS Will Only Run As Administrator

May 28, 2015

After installing SSMS on some computers, the only way we can get SSMS to run correctly is to run it as administrator. Is there a way to avoid that? These end users log in as themselves and have accounts in SQL Server all set up, but SSMS will only launch for them if we right-click and select "Run as administrator". After doing some digging, it seems that this is a common problem out there.

View 3 Replies View Related

SQL Server Admin 2014 :: Old Backups Will Not Delete

Jun 19, 2015

I have a SQL 2014 install and cannot, for the life of me, get the maintenance plan to remove old backups. I've tried everything. Rights to the folder where the backups are stored are adequate, the extension set in the cleanup task is as it should be, etc. The log shows the job ran successfully. Running the command manually shows successful completion, but the backups are still not removed.
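
The Maintenance Cleanup Task ends up calling xp_delete_file, so running it by hand with explicit arguments is a quick way to isolate whether the path, the extension (no leading dot), or the cutoff date is the problem. The argument values below are placeholders.

-- Arguments: 0 = backup files, folder, extension without a dot, delete-before date, 1 = include first-level subfolders
EXECUTE master.dbo.xp_delete_file
        0,
        N'E:\SQLBackups\',
        N'bak',
        N'2015-06-12T00:00:00',
        1;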

View 9 Replies View Related

SQL Server Admin 2014 :: What Does Restore Log Do After The Log Is Restored

Jul 29, 2015

When I execute the RESTORE LOG command, the Messages window shows how many seconds the restore takes; at the same time, the status bar also shows how many seconds the command takes.

The two values are different and can be very different. In the examples below, the restore takes 1.8 seconds but the command takes 4 seconds in total to complete; in the other example it is 8.1 seconds versus 12 seconds.

What does SQL Server or Windows do after the restore?

pic a:

pic b:

I ran an xperf trace. I can see that after the restore is completed, SQL Server does a garbage collect and a log write, which run very quickly, but the storage is busy reading the log file for nearly 2.2 seconds (4 - 1.8) and 4 seconds (12 - 8.1).

pic 1:

pic 2:

See pic 1 above: from 13 to 17 the restore operation is finished, but the storage jumps to 100% active to do some reads, only reads, no writes. Zooming into that period (pic 2) shows it reads 4096 (I don't know the unit size) for about 4 seconds. What does this do?

The data file, log file, and backup file are all on local drives. The interesting point is that the reads jump after the restore; I tested it on a different server with the same result...

View 1 Replies View Related






