Paging Large Results In SQL 2005

May 29, 2006

Let's say we have more than 100,000 rows in Table1, and we want to view only 10 rows at a time; by pressing a NEXT button we would see the next 10 rows.

There are two buttons: NEXT and PREVIOUS.

Can anyone tell me how to do that in SQL 2005, and what this technique is properly called?

I have found code that uses ROW_NUMBER to view results between two numbers,

for example rows between 10 and 50 - but it is not quite what I want, so I need some help. Thank you.
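For reference, the usual name for this is custom (or server-side) paging, and ROW_NUMBER is the standard way to do it in SQL 2005. A minimal sketch, assuming Table1 has an integer key column named ID (a made-up name; use the real key), where NEXT passes @PageIndex + 1 and PREVIOUS passes @PageIndex - 1:

DECLARE @PageIndex int, @PageSize int;
SET @PageIndex = 1;   -- first page
SET @PageSize  = 10;

WITH Numbered AS (
    SELECT ID, ROW_NUMBER() OVER (ORDER BY ID) AS RowNum
    FROM Table1
)
SELECT ID
FROM Numbered
WHERE RowNum BETWEEN (@PageIndex - 1) * @PageSize + 1
              AND     @PageIndex * @PageSize
ORDER BY RowNum;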

By Uncle Sam


SQL 2005 Full-Text Performance On Large Results

May 10, 2006

Hello everybody,
I've got a problem which I have been trying to solve for 1-2 years, and I hoped it would go away with SQL 2005 - but that wasn't the case :(.

Situation:
I've just bought a new Server containing:
SQL 2005
64-bit environment
4 GB RAM
2x AMD Opteron 2 GHz processors (dual core)
2x RAID Controllers (RAID 1) containing
1.1 System
1.2 Data
2.1 Transaction Logs

I've created a full-text table containing all the search terms I need to search.
Table build:
RecID - int - Primary Key
SrcID - varchar(30)
ArticleID - int - referring to an original table
SearchField - varchar(150) - Containing the search terms
timestamp - timestamp field

Fulltext index:
RecID as Primary Key
SearchField as indexed field - Wordbreaker: Neutral (containing several languages), Accent sensitivity off

Now I've got several tables imported in here, resulting in a table size of ~13 million rows.

There is no problem with performance on this catalog if I search for a term that occurs in no more than 200-300 records - but if I search for a term that can occur in 200,000 or more, it gets extremely slow.

On the slow query the first records come in almost immediately, but up to 60 seconds pass before the query finishes.
The problem is that I have to sort by a ranking value which is stored externally - so I need all results in order to sort them...

current (debugging) query:
SELECT ArticleID
FROM fullTextTable AS ft
INNER JOIN CONTAINSTABLE(FullTextCatalog, SearchField, '"term*"') AS ftRes
    ON ftRes.[KEY] = ft.idEntry

Now if I check in Performance Monitor:
As soon as I run the query, the 'Avg. Disk Read Queue Length' counter on disk D (SQL data files) jumps to the top until the query has finished.
There is almost no read/write activity on C:, where the full-text catalog is stored...

If I rerun the query after it has finished once successfully, it completes in under 1-2 seconds - it would be nice to get that result the first time :).

Does anybody know a workaround to this problem?
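One thing that may be worth testing, if an approximate cut-off on the full-text side is acceptable (a sketch only, keeping the same names as the query above): CONTAINSTABLE takes an optional top_n_by_rank argument, which limits how many rows the full-text engine materialises before the join:

SELECT ArticleID
FROM fullTextTable AS ft
INNER JOIN CONTAINSTABLE(FullTextCatalog, SearchField, '"term*"', 10000) AS ftRes
    ON ftRes.[KEY] = ft.idEntry   -- identical to the query above except for the fourth argument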


Paging Large Recordsets

Oct 21, 2007

Here's a question you'll never quit hearing: is there a convenient way to page through large recordsets in SQL Server 2000?

I'm writing some software which, for all intents and purposes, works like a messageboard: users can create threads, leave replies, and so on.

I have about a half million records in a few of my tables, and some of my queries return 1000s of results. I'd prefer not to return 1000s of records all at once, so I don't want to page my records in code; I'd rather page them in SQL Server. Naturally, I want to page replies. However, I don't know of a convenient way to page records in SQL Server 2000.
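For SQL Server 2000, which has no ROW_NUMBER(), the usual workaround is the nested-TOP / NOT IN pattern. A sketch, assuming a Replies table keyed by ReplyID (hypothetical names) and a fixed page size of 20, fetching the third page of one thread:

DECLARE @ThreadID int;
SET @ThreadID = 1;

-- skip the first two pages (40 rows), then take the next 20
SELECT TOP 20 *
FROM Replies
WHERE ThreadID = @ThreadID
  AND ReplyID NOT IN (SELECT TOP 40 ReplyID
                      FROM Replies
                      WHERE ThreadID = @ThreadID
                      ORDER BY ReplyID)
ORDER BY ReplyID;

TOP does not accept a variable in SQL Server 2000, so a variable page size means building the statement as dynamic SQL.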


Handle Paging With Large Datasets

Apr 10, 2008

I'm looking at this article http://www.dotnetjunkies.com/Article/EA868776-D71E-448A-BC23-B64B871F967F.dcik
and it seems like they are selecting the entire Customers table into the temp table - is that correct?


Sorting + Paging A Large Table In Stored Procedure

May 6, 2007

As I said above, how do I put sorting + paging in a stored procedure? My database has approximately 50,000 records, and obviously I can't SELECT all of them and let the GridView / DataView do the work, right? That would use too many resources per request.

So I intend to do sorting + paging at the database level, either as hard-coded SQL or as a stored procedure. With hard-coded SQL I can just change the SQL statement each time the parameters (startRecord, maxRecords, sortColumns) change. But I don't know how to get the same result in a stored procedure: I know how to implement paging in a stored procedure (ROW_NUMBER), but I don't know how to change the ORDER BY clause at runtime inside it.

Thanks in advance.

PS. In case "ask_Scotty", who replied in my previous post, http://forums.asp.net/thread/1696818.aspx, is reading this: please look at my reply to your answer in the last post. Thank you.
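Two common options are folding the sort choice into CASE expressions inside the OVER clause, or building the statement with sp_executesql. A sketch of the dynamic-SQL route (table and column names are made up; the point is that the sort column is validated against a fixed list before being concatenated, while the paging values stay real parameters):

CREATE PROCEDURE dbo.GetEmployeePage
    @startRecord int,
    @maxRecords  int,
    @sortColumns nvarchar(100)
AS
BEGIN
    -- only accept known column names, to avoid SQL injection through the sort parameter
    IF @sortColumns NOT IN ('LastName', 'FirstName', 'HireDate')
        SET @sortColumns = 'LastName';

    DECLARE @sql nvarchar(4000);
    SET @sql = N'SELECT *
                 FROM (SELECT *, ROW_NUMBER() OVER (ORDER BY ' + @sortColumns + N') AS RowNum
                       FROM dbo.Employees) AS t
                 WHERE RowNum BETWEEN @start AND @start + @max - 1';

    EXEC sp_executesql @sql,
         N'@start int, @max int',
         @start = @startRecord,
         @max   = @maxRecords;
END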


SQL Server 2008 :: How To Find Statements That Cause Large Memory Paging

Apr 22, 2015

I am monitoring our production server, and noticed that periodically we have spikes of Memory Paging Rate (pages/sec).

How can I find the particular queries / stored procedures that are causing this?
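One starting point (a sketch; the TOP value is arbitrary) is to ask sys.dm_exec_query_stats which cached statements are doing the most physical reads, since heavy physical I/O is what usually correlates with the Pages/sec spikes:

SELECT TOP 20
       qs.total_physical_reads,
       qs.total_logical_reads,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset END
                   - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_physical_reads DESC;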


SQL Server 2008 :: How To Find Which Queries / Processes Causing Large Memory Paging Rate

Mar 30, 2015

Our monitoring tool shows that our production system periodically experiences a large memory paging rate - up to 800 pages/sec. How can I find out which particular queries, stored procedures, or processes initiate this?


Limiting Large Query Results Sets

May 22, 2000

We are trying to limit a query that returns items from our database. The query currently returns 32,000 records. We are trying to figure out an efficient way to request the 1st 50, the 3rd 50, or the 5th 50 to display on screen. We don't want to return the entire 32,000 and then limit what's displayed in ADO; we want the SELECT statement to only return 50 at a time. Any suggestions?


Inconsistent Performance Results With Large Partitioned Tables.

Dec 5, 2007

I have a query that joins two large partitioned tables and depending on the values in the where clause, I can get dramatically different performance results.

The first query completed in around 7s and has 47,000 logical reads.


select mo.monitor_id, mo.site_id, mo.testtime,
       sum(mo.NumBytes), sum(mo.DNSTime), sum(mo.ConnectTime),
       sum(mo.FirstByteTime), sum(mo.ContentTime), sum(mo.RelocTime)
from monitor_raw mr (nolock), monitor_object mo (nolock)
where mr.monitor_id in (5339, 5341, 5342, 943842, 943866)
  and mr.testtime between 'Oct 31 2007 3:00:00:000PM' and 'Nov 30 2007 3:00:00:000PM'
  and mo.returncode = 200
  and mr.site_id in (101,102,105,109,110,112,115,117,119,122,126,151,132,139,129,135,121,138,143,142,159,148,128,171,176,177,178,111,113,116,118,120,127,133,131,130,174,179,185,205,200,202,203,204,210,211,208,209,212,213,216,199,214,224,225,229,230,232,235,241,245,247,250,254,261,267,264,265,266,268,269)
  and mr.escalationlevel = 0
  and mr.monitor_id = mo.monitor_id
  and mr.testtime = mo.testtime
  and mr.site_id = mo.site_id
group by mo.monitor_id, mo.site_id, mo.testtime


The second query takes 188s to complete and has 1.8m logical reads. The only difference between the two is the value of the monitor_ids in the where clause.


select mo.monitor_id, mo.site_id, mo.testtime,
       sum(mo.NumBytes), sum(mo.DNSTime), sum(mo.ConnectTime),
       sum(mo.FirstByteTime), sum(mo.ContentTime), sum(mo.RelocTime)
from monitor_raw mr (nolock), monitor_object mo (nolock)
where mr.monitor_id in (152682, 5339, 5341, 5342, 268080)
  and mr.testtime between 'Oct 31 2007 3:00:00:000PM' and 'Nov 30 2007 3:00:00:000PM'
  and mo.returncode = 200
  and mr.site_id in (101,102,105,109,110,112,115,117,119,122,126,151,132,139,129,135,121,138,143,142,159,148,128,171,176,177,178,111,113,116,118,120,127,133,131,130,174,179,185,205,200,202,203,204,210,211,208,209,212,213,216,199,214,224,225,229,230,232,235,241,245,247,250,254,261,267,264,265,266,268,269)
  and mr.escalationlevel = 0
  and mr.monitor_id = mo.monitor_id
  and mr.testtime = mo.testtime
  and mr.site_id = mo.site_id
group by mo.monitor_id, mo.site_id, mo.testtime



The two tables have clustered indexes on monitor_id, testtime and site_id. Comparing the execution plan, I can see why there is such a difference in performance. The second query performs a clustered index seek on the monitor_object table starting at the lowest monitor_id, testtime & site_id through the highest monitor_id, testtime & site_id. The first query performs a clustered index seek where the monitor_id, testtime and site_id equals the same values from the monitor_raw table.


My question is, how can I force the second query to use the same execution plan as the first so that I can get better performance?

One possible workaround that I could use is to execute five individual queries, one for each monitor_id and then union the results together but this would require significant code changes to my stored procs.
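A lighter-weight experiment than splitting the query (a sketch only - same tables, filters and site list as above, abbreviated here) is to write the join explicitly and add a query-level hint, which asks the optimizer for the per-row nested-loops seek into monitor_object rather than the wide range seek:

select mo.monitor_id, mo.site_id, mo.testtime,
       sum(mo.NumBytes), sum(mo.DNSTime), sum(mo.ConnectTime),
       sum(mo.FirstByteTime), sum(mo.ContentTime), sum(mo.RelocTime)
from monitor_raw mr with (nolock)
inner join monitor_object mo with (nolock)
        on  mr.monitor_id = mo.monitor_id
        and mr.testtime   = mo.testtime
        and mr.site_id    = mo.site_id
where mr.monitor_id in (152682, 5339, 5341, 5342, 268080)
  and mr.testtime between 'Oct 31 2007 3:00:00:000PM' and 'Nov 30 2007 3:00:00:000PM'
  and mo.returncode = 200
  and mr.site_id in (101, 102, 105 /* ...same site list as above... */)
  and mr.escalationlevel = 0
group by mo.monitor_id, mo.site_id, mo.testtime
option (loop join)

If the hinted plan behaves, a plan guide (or the USE PLAN hint) can pin it down without touching every stored proc.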

Thanks,

Tim


SQL 2005 Paging Using RowNumber()

Apr 22, 2008

I've got a problem using custom paging in SQL 2005.
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
ALTER PROCEDURE [dbo].[searchperson_view_general]
@Search nvarchar(2000)
,@OrderBy nvarchar (2000)
,@PageSize int
,@PageIndex int
AS
DECLARE @PageLowerBound int
DECLARE @PageUpperBound int

SET @PageLowerBound = @PageSize * @PageIndex
SET @PageUpperBound = @PageSize - 1 + @PageLowerBound

--Default order by to first column
IF (@OrderBy is null or LEN(@OrderBy) < 1)
BEGIN
SET @OrderBy = 'p.[person_id]'
END

-- SQL Server 2005 Paging
declare @SQL as nvarchar(4000)
SET @SQL = 'WITH PageIndex AS ('
SET @SQL = @SQL + ' SELECT distinct'
IF @PageSize > 0
BEGIN
SET @SQL = @SQL + ' TOP ' + convert(nvarchar, @PageUpperBound)
END

SET @SQL = @SQL + ' ROW_NUMBER() OVER (ORDER BY ' + @OrderBy + ') as RowIndex '
SET @SQL = @SQL + ', p.[person_id]'
SET @SQL = @SQL + ', p.[userType_id]'
SET @SQL = @SQL + ', p.[fullName]'
SET @SQL = @SQL + ', p.[gender_nm]'
SET @SQL = @SQL + ', p.[dateOfBirth] '
SET @SQL = @SQL + ', p.[positionTitle]'
SET @SQL = @SQL + ' FROM dbo.[person_view] p '

IF LEN(@Search) > 0
BEGIN
SET @SQL = @SQL + @Search
END
SET @SQL = @SQL + ' ) SELECT distinct'
SET @SQL = @SQL + ' p.person_id'
SET @SQL = @SQL + ', p.userType_id'
SET @SQL = @SQL + ', p.fullName'
SET @SQL = @SQL + ', p.gender_nm'
SET @SQL = @SQL + ', (year(getdate()) - year(p.[dateOfBirth])) as [dateOfBirth] '
SET @SQL = @SQL + ', p.positionTitle'
SET @SQL = @SQL + ' FROM PageIndex p '
SET @SQL = @SQL + ' WHERE RowIndex > ' + convert(nvarchar, @PageLowerBound)

IF @PageSize > 0
BEGIN
SET @SQL = @SQL + ' AND RowIndex <= ' + convert(nvarchar, @PageUpperBound)
END

SET @SQL = @SQL + ' ORDER BY ' + @OrderBy
exec sp_executesql @SQL

I tested my stored procedure with these parameters:
exec [hr2b_searchperson_view_general_load]
'LEFT OUTER JOIN qualification
ON p.person_id = qualification.person_id
WHERE qualification.institutionName like N''%ABC%'''
,' p.person_id asc ', 25 , 1

This is the actual query produced:

WITH PageIndex AS
( SELECT distinct TOP 49 ROW_NUMBER() OVER
(ORDER BY p.person_id asc )
as RowIndex
, p.[person_id]
, p.[userType_id]
, p.[fullName]
, p.[gender_nm]
, p.[dateOfBirth]
, p.[positionTitle]
FROM person_view p
LEFT OUTER JOIN qualification
ON p.person_id = qualification.person_id
WHERE qualification.institutionName like N'%ABC%' )
SELECT distinct
p.person_id
, p.userType_id
, p.fullName
, p.gender_nm
, (year(getdate()) - year(p.[dateOfBirth])) as [dateOfBirth]
, p.positionTitle
FROM PageIndex p
WHERE RowIndex > 25 AND RowIndex <= 49 ORDER BY p.person_id asc

If I run this query without DISTINCT, it returns exactly the number of records I expect, but they contain duplicates.
When I add DISTINCT, the number of records returned is less than 25, because the LEFT OUTER JOIN produced duplicate records that then get collapsed. My real query will use even more LEFT OUTER JOINs than this one. Please help me get exactly 25 records.

These are my tables:
person_view(person_id, fullname, userType_id, gender_nm, dateOfBirth, positionTitle)

Qualification(qualification_id, qualification_nm,institutionName, person_id)
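One way to keep the page size stable (a sketch against the two tables above) is to de-duplicate before numbering, so ROW_NUMBER runs over distinct people rather than over the joined rows:

WITH Matches AS (
    SELECT DISTINCT p.person_id, p.userType_id, p.fullName,
                    p.gender_nm, p.dateOfBirth, p.positionTitle
    FROM dbo.person_view AS p
    LEFT OUTER JOIN dbo.qualification AS q
           ON p.person_id = q.person_id
    WHERE q.institutionName LIKE N'%ABC%'
),
PageIndex AS (
    SELECT ROW_NUMBER() OVER (ORDER BY person_id ASC) AS RowIndex, *
    FROM Matches
)
SELECT person_id, userType_id, fullName, gender_nm,
       (YEAR(GETDATE()) - YEAR(dateOfBirth)) AS dateOfBirth,
       positionTitle
FROM PageIndex
WHERE RowIndex > 25 AND RowIndex <= 50
ORDER BY person_id ASC;

An EXISTS predicate against qualification would avoid producing the duplicates in the first place, which matters once more LEFT OUTER JOINs are added.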

Thanks in advance.


Paging In Ssrs 2005

Feb 13, 2008

Hi

Can anyone tell me how to enable paging in SSRS 2005 reports? I am using the "table" control for the report. When we print the report everything comes out fine, but when we display it on the web it does not show it page by page.

please help me...


thank you.............


Sql 2005 Paging By Column In Procedure

Apr 23, 2008

Hi,

I've got a procedure which pages a select query; the example is below:



Code Snippet
CREATE PROC GetCustomersByPage

@PageSize int, @PageNumber int

AS

Declare @RowStart int
Declare @RowEnd int

if @PageNumber > 0
Begin

SET @PageNumber = @PageNumber -1

SET @RowStart = @PageSize * @PageNumber + 1;
SET @RowEnd = @RowStart + @PageSize - 1 ;

With Cust AS
( SELECT CustomerID, CompanyName, CompanyAddress,
ROW_NUMBER() OVER (order by CompanyName) as RowNumber
FROM Customers )

select *
from Cust
Where RowNumber >= @RowStart and RowNumber <= @RowEnd
End

How can I change this procedure in order to page the query OVER the column set as an argument?
In other words I would like to execute proc like:
- exec GetCustomersByPage 10, 1, 'CompanyName' which pages by CompanyName(...OVER (order by CompanyName)...)
- exec GetCustomersByPage 10, 1, 'CompanyAddress' which pages by CompanyAddress

Is it possible?
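It is possible without dynamic SQL. One sketch (the procedure name is a made-up variant of the one above) adds the column name as a third parameter and feeds ROW_NUMBER from one CASE expression per sortable column, so mixed data types are not a problem:

CREATE PROC GetCustomersByPageSorted
    @PageSize int, @PageNumber int, @SortColumn nvarchar(50)
AS
BEGIN
    DECLARE @RowStart int, @RowEnd int;
    SET @RowStart = @PageSize * (@PageNumber - 1) + 1;
    SET @RowEnd   = @RowStart + @PageSize - 1;

    WITH Cust AS (
        SELECT CustomerID, CompanyName, CompanyAddress,
               ROW_NUMBER() OVER (ORDER BY
                   CASE WHEN @SortColumn = 'CompanyName'    THEN CompanyName    END,
                   CASE WHEN @SortColumn = 'CompanyAddress' THEN CompanyAddress END,
                   CustomerID) AS RowNumber
        FROM Customers
    )
    SELECT *
    FROM Cust
    WHERE RowNumber BETWEEN @RowStart AND @RowEnd;
END

exec GetCustomersByPageSorted 10, 1, 'CompanyName' and exec GetCustomersByPageSorted 10, 1, 'CompanyAddress' would then page over the respective column.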


Questions On Use Of SQL Server 2005 Functionality In Gridview Paging

Jun 25, 2007

I have a webpage that displays 4000 or more records in a GridView control powered by a SqlDataSource. It's very slow. I'm reading the following article on custom paging: http://aspnet.4guysfromrolla.com/articles/031506-1.aspx. This article uses an ObjectDataSource and some functionality new to SQL Server 2005 to implement custom paging.

There is a stored procedure called GetEmployeesSubsetByDepartmentIDSorted that looks like this:

ALTER PROCEDURE dbo.GetEmployeesSubsetByDepartmentIDSorted
(
    @DepartmentID   int,
    @sortExpression nvarchar(50),
    @startRowIndex  int,
    @maximumRows    int
)
AS
    IF @DepartmentID IS NULL
        -- If @DepartmentID is null, then we want to get all employees
        EXEC dbo.GetEmployeesSubsetSorted @sortExpression, @startRowIndex, @maximumRows
    ELSE
      BEGIN
        -- Otherwise we want to get just those employees in the specified department
        IF LEN(@sortExpression) = 0
            SET @sortExpression = 'EmployeeID'

        -- Since @startRowIndex is zero-based in the data Web control, but one-based w/ROW_NUMBER(), increment
        SET @startRowIndex = @startRowIndex + 1

        -- Issue query
        DECLARE @sql nvarchar(4000)
        SET @sql = 'SELECT EmployeeID, LastName, FirstName, DepartmentID, Salary,
                    HireDate, DepartmentName
        FROM
            (SELECT EmployeeID, LastName, FirstName, e.DepartmentID, Salary,
                    HireDate, d.Name as DepartmentName,
                    ROW_NUMBER() OVER(ORDER BY ' + @sortExpression + ') as RowNum
             FROM Employees e
                INNER JOIN Departments d ON
                    e.DepartmentID = d.DepartmentID
             WHERE e.DepartmentID = ' + CONVERT(nvarchar(10), @DepartmentID) + '
            ) as EmpInfo
        WHERE RowNum BETWEEN ' + CONVERT(nvarchar(10), @startRowIndex) +
                        ' AND (' + CONVERT(nvarchar(10), @startRowIndex) + ' + '
                        + CONVERT(nvarchar(10), @maximumRows) + ') - 1'

        -- Execute the SQL query
        EXEC sp_executesql @sql
      END

The part that's bold is the part I don't understand. Can someone shed some light on this for me? What is this doing and why?

Diane


Paging: SQL Syntax For Access Versus SQL Server 2005?

Feb 7, 2008

Hi,
I'm using ComponentArt's Callback grids with Manual Paging.

The CA example grid uses Access:(http://www.componentart.com/webui/demos/demos_control-specific/grid/programming/manual_paging/WebForm1.aspx)

The SQL syntax it produces is invalid in SQL Server 2005.

Example:
"SELECT TOP " & Grid1.PageSize & " * FROM (SELECT TOP " & ((Grid1.CurrentPageIndex + 1) * Grid1.PageSize) & " * FROM Posts ORDER BY " & sSortColumn & " " & sSortOrderRev & ", " & sKeyColumn & " " & sSortOrderRev & ") ORDER BY " & sSortColumn & " " & sSortOrder & ", " & sKeyColumn & " " & sSortOrder

So... this is what I have (simplified), and it appears to return incorrect rows on the last few pages:
SELECT top 15 * FROM Posts where & sFilterString & " and Postid in (SELECT TOP " & ((Grid1.CurrentPageIndex + 1) * Grid1.PageSize) & " Postid FROM Posts where " & sFilterString & " ORDER BY " & sSortColumn & " " & sSortOrder & ") " & " ORDER BY " & sSortColumn & " " & sSortOrderRev


What other approaches has anyone used besides the "ID in (...)" one? The examples I have included show the available variables: sort asc and desc, current page, number of rows on a page, etc.
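For what it's worth, one reason the Access-style statement fails on SQL Server 2005 is that a derived table there must be given an alias; with that fixed, the nested-TOP pattern itself still works. A sketch with hard-coded values and hypothetical column names:

-- rows 31-45 of Posts ordered newest-first (i.e. "page 3" with a page size of 15):
-- the inner query takes the first 45 rows, the outer query keeps the last 15 of them
-- (they come back oldest-first; re-sort in code or in one more outer query if needed)
SELECT TOP 15 *
FROM (SELECT TOP 45 *
      FROM Posts
      ORDER BY PostDate DESC, PostID DESC) AS pg
ORDER BY PostDate ASC, PostID ASC;

On SQL Server 2005 specifically, ROW_NUMBER() OVER (ORDER BY ...) avoids the nesting and the "ID in (...)" filter altogether.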


Better Method To Count Records In Custom Paging For SQL Server 2005

Jul 24, 2006

Here's my problem: since I migrated to SQL Server 2005, I have been able to use the ROW_NUMBER() OVER method to make my custom paging stored procedure better. But there's one thing that is still bothering me, and it's the fact that I'm still using a plain old COUNT to find my total number of rows, which slows my stored procedure down a little. What I want to know is: is there a more efficient way to count the grand total of rows without using the COUNT instruction? Here's my stored procedure:
SELECT RowNum, morerecords, Ad_Id
FROM (SELECT ROW_NUMBER() OVER (ORDER BY Ad_Id) AS RowNum,
             morerecords = (SELECT COUNT(Ad_Id) FROM Ads),
             Ad_Id
      FROM Ads) AS test
WHERE RowNum BETWEEN 11 AND 20
The highlighted part is the problem: the morerecords field is the one I'm using to count all my records, but it's a waste of performance to use that in a custom paging method (since it has to check every record; normally there are a ton of conditions with a lot of inner joins, but I simplified things in my example). I hope I was clear enough in my explanation and that someone will be able to help me. Thanks for your time.
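One alternative on SQL Server 2005 (a sketch against the same Ads table) is the windowed aggregate COUNT(*) OVER (), which tags every row of the page with the grand total without a separate correlated subquery:

SELECT RowNum, TotalRecords, Ad_Id
FROM (SELECT ROW_NUMBER() OVER (ORDER BY Ad_Id) AS RowNum,
             COUNT(*)     OVER ()               AS TotalRecords,
             Ad_Id
      FROM Ads) AS test
WHERE RowNum BETWEEN 11 AND 20;

The count still has to consider every qualifying row, but it is produced from the same pass over the data as ROW_NUMBER instead of from a second query.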
  


Does It Store All The Results To Tempdb Database When I Query Against A Large Table Which Joins Another Table?

Jun 25, 2007

Hi, all experts here,



I am wondering whether tempdb temporarily stores all the results whenever I query a large fact table with over 4 million records that joins another dimension table. Each time I run the query, tempdb grows to nearly 1 GB, which nearly uses up all the space on my local system drive, and as a result performance drops completely. Is there any way to fix this problem? Thanks a lot in advance; I look forward to hearing your kind advice.



With best regards,



Yours sincerely,




Large Number Of Databases On 2005

Feb 23, 2007

For anyone with a larger number of databases (500+): how many do you have in a single instance? If you are using multiple instances on a single server, how many databases per instance? This is why I'm asking:

We are experiencing 701 "out of system memory" errors and temporary (usually) system freezes when the error occurs. We have 32-bit 2005 version 9.00.2153.00, 32 GB of memory, AWE enabled, on a quad dual-core 3 GHz hyperthreaded server. Neither the buffer pool nor VAS shows any pressure when the "out of system memory" error occurs. Since this error usually indicates a VAS problem, we tried increasing VAS to 1 GB with the -g flag. It made no difference.

PSS has been working on the case for 3 weeks. They don't seem to be finding any evidence of memory pressure either. When I last spoke to the escalation engineer yesterday, it seemed that they are going to recommend reducing the number of databases on the server. I asked for clarification as to whether we are hitting a 32-bit barrier, an instance limitation, or both, and I am awaiting the answer.

How many databases do you have on your server? We had between 1700 and 1900 (the number varies) at times when the error occurred. We are now at 1500, and have not had the error in the 2 days since reducing the number of databases...


Solutions To Large Access In SQL 2005

Dec 17, 2007

I am on a project to develop a route-finding system that searches for the optimal route for users of the system. The current version, which I have built and run successfully, uses normal database access in MS SQL 2005: I store node information in the database and the application queries it with plain SELECT statements, returning a DataTable object to the application. The result is rather slow, caused by the many round trips to the database server; the application takes 8 seconds to find a short route, without even considering the large amount of traffic-information calculation I will add later. Any comments on the architecture, or on how to move my algorithm into T-SQL?


Storing Large Files In The SQL Server 2005

Feb 9, 2006

I have a table that I'm inserting a file into and using the Image data type to store the binary object.  Now the code below works fine for files around 1.5 MB, but anything larger and it's like the code won't even execute and I get a Page Not found error.
I'm in the process of running some traces to find out what's going on in the backend, but I'm assuming there's something amiss with my code.  The Image data type should handle files that size with no problem but for some reason it isn't.
Does anyone see anything wrong?
Thanks
Dim iLength As Integer = CType(File1.PostedFile.InputStream.Length, Integer)
If iLength = 0 Then Exit Sub 'not a valid file
Dim sContentType As String = File1.PostedFile.ContentType
Dim sFileName As String, i As Integer
Dim bytContent As Byte()
ReDim bytContent(iLength) 'byte array, set to file size

'strip the path off the filename
i = InStrRev(File1.PostedFile.FileName.Trim, "\")
If i = 0 Then
sFileName = File1.PostedFile.FileName.Trim
Else
sFileName = Right(File1.PostedFile.FileName.Trim, Len(File1.PostedFile.FileName.Trim) - i)
End If
conn = New SqlConnection(eco)
conn.Open()
cmd = New SqlCommand("INSERT INTO ECO_Attachments (ECOID, FromType, DocName,OldRev,NewRev,NtLogin,DisplayName, FileName, FileSize, FileData, ContentType) VALUES (@ECOID, @FromType,@DocName,@OldRev,@NewRev,@NtLogin,@DisplayName, @FileName, @FileSize, @FileData, @ContentType) ")
cmd.Connection = conn
Try
File1.PostedFile.InputStream.Read(bytContent, 0, iLength)
With cmd
.Parameters.Add("@ECOID", SqlDbType.Int)
.Parameters.Add("@FromType", SqlDbType.NVarChar, 50)
.Parameters.Add("@DocName", SqlDbType.NVarChar, 250)
.Parameters.Add("@OldRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NewRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NTLogin", SqlDbType.NVarChar, 100)
.Parameters.Add("@DisplayName", SqlDbType.NVarChar, 200)
.Parameters.Add("@FileName", SqlDbType.NVarChar, 255)
.Parameters.Add("@FileSize", SqlDbType.Real)
.Parameters.Add("@FileData", SqlDbType.Image)
.Parameters.Add("@ContentType", SqlDbType.NVarChar, 50)
.Parameters("@ECOID").Value = ECOID
.Parameters("@FromType").Value = From
.Parameters("@DocName").Value = DocName
.Parameters("@OldRev").Value = OldRev
.Parameters("@NewRev").Value = NewRev
.Parameters("@NTLogin").Value = NTLogon
.Parameters("@DisplayName").Value = DisplayName
.Parameters("@FileName").Value = sFileName
.Parameters("@FileSize").Value = iLength
.Parameters("@FileData").Value = bytContent
.Parameters("@ContentType").Value = sContentType
.ExecuteNonQuery()
'.ExecuteScalar()
End With
Catch ex As Exception
Response.Write(ex)
'Handle your database error here
conn.Close()
End Try


Restoring An SQL 2005 Server Express DB With A LOG That Is Too Large

Jul 31, 2006

Hi,

I'm trying to restore a SQL Server DB Backup from a SQL Server DB Server on to my Laptop (SQL 2005 Express)

When I execute a RESTORE FILELISTONLY command on the backup file, it seems that the database data file is 1 GB, but the log file is 91 GB in size, which exceeds my disk space.

I can restore the data on its own without the log file, but then the database stays in "restoring" mode. I've tried to switch the restore flag off (update sys.databases set state = 0 where name = 'G001'), but I can't seem to be able to do it, even if I try to allow updates via:

sp_configure 'allow updates', 1
GO
RECONFIGURE WITH OVERRIDE
GO



Any ideas how I can restore the database without restoring the enormous logfile?
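RESTORE always recreates the log file at the size recorded in the backup, so there is no supported way to skip it on the restore side. If the source server is reachable, one common approach (a sketch; the logical log file name G001_log is an assumption - check it with sp_helpfile) is to shrink the log there first and take a fresh, much smaller backup:

-- on the source server
USE G001;
GO
BACKUP LOG G001 TO DISK = 'D:\Backup\G001_log.trn';   -- empty the log (or switch the DB to SIMPLE recovery)
DBCC SHRINKFILE (G001_log, 1024);                     -- shrink the log file to roughly 1 GB
BACKUP DATABASE G001 TO DISK = 'D:\Backup\G001_small.bak';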

Thanks in advance...




Reading Large Text Files With 2005 CE?

Dec 19, 2007

Hi...
During my web search looking for a solution I ran across SQL CE 3.5 articles. My questions about SQL CE 3.5 are:
1) Can SQL CE 3.5 handle a 4-6 GB file
- Read
- Parse (SQL)
2) Can SQL CE 3.5 act as a standalone client that a user can view a large (4-6 GB) text file?
- Will I need a .NET (small) client to read the large (4-6 GB) text file?
More info:
The text file will reside on the machine where the SQL CE 3.5 is installed. There is no pull to get the data.

Thank you (in advance)...

SQL CE 3.5


Upgrading SQL 2000 To SQL 2005 Very Large Database

May 29, 2007

My question is two fold:



We have a database 65 GB in size that has grown over 12 years.



1) How can I upgrade to 2005 without downtime?

2) Our upgrades on SQL 2000 can now take upwards of 10 hours just to add a column and rebuild indexes.



Any way we can speed this up without detaching the database and going offline?
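On the 10-hour column additions, one thing worth checking (a sketch; the table and column names are made up): adding a NULLable column with no default is a metadata-only change in SQL Server 2000 and 2005, while adding a NOT NULL column with a default has to touch every existing row, which is usually where the hours go:

-- near-instant: metadata-only change
ALTER TABLE dbo.BigTable ADD NewCol int NULL;

-- slow on a 65 GB table: every existing row is rewritten to hold the default value
ALTER TABLE dbo.BigTable ADD NewCol2 int NOT NULL DEFAULT 0;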



thanks,

Larry Sitka


SQL Server 2005 Large IO And Logical Reads

May 22, 2008

A table in one of my databases is running very slowly. The IO is very high and below is a printout from the SET STATISTICS IO ON command run on a common query used on the table:


(4162 row(s) affected)

Table 'WebProxyLog'. Scan count 3, logical reads 873660, physical reads 3493, read-ahead reads 505939, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

I have a clustered unique index and a nonclustered index on the table. I have run SQL Profiler and opened the trace in Database Tuning Advisor; DTA displays 0% improvement suggestions. I have a number of statistics on the table and index, all up to date, and fragmentation is less than 1%. I've tried a number of variations on indexes to improve performance, but to no avail. There is only one query which runs on the table, and the nonclustered index created on the table did significantly improve performance; however, the query still runs at around 23 seconds. The query does bring back a large amount of data, but I'm sure there is a way to bring down the IO and logical reads to improve performance.

The table and index scripts are below:




Code Snippet
-- =================== Table and Clustered index ===========================
CREATE TABLE [dbo].[WebProxyLog](
[ClientIP] [bigint] NULL,
[ClientUserName] [nvarchar](514) NULL,
[ClientAgent] [varchar](128) NULL,
[ClientAuthenticate] [smallint] NULL,
[logTime] [datetime] NULL,
[servername] [nvarchar](32) NULL,
[DestHost] [varchar](255) NULL,
[DestHostIP] [bigint] NULL,
[DestHostPort] [int] NULL,
[bytesrecvd] [bigint] NULL,
[bytessent] [bigint] NULL,
[protocol] [varchar](12) NULL,
[transport] [varchar](8) NULL,
[operation] [varchar](24) NULL,
[uri] [varchar](2048) NULL,
[mimetype] [varchar](32) NULL,
[objectsource] [smallint] NULL,
[rule] [nvarchar](128) NULL,
[SrcNetwork] [nvarchar](128) NULL,
[DstNetwork] [nvarchar](128) NULL,
[Action] [smallint] NULL,
[WebProxyLogid] [int] IDENTITY(1,1) NOT NULL,
CONSTRAINT [pk_webproxylog_webproxylogid] PRIMARY KEY CLUSTERED
(
[WebProxyLogid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

-- =================== Nonclustered Index ===========================

CREATE NONCLUSTERED INDEX [dta_ix_WebProxyLog_Kaction_clientusername_logtime_uri_mimetype_webproxylogid] ON [dbo].[WebProxyLog]
(
[Action] ASC
)
INCLUDE ( [ClientUserName],
[logTime],
[uri],
[mimetype],
[WebProxyLogid]) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

-- =================== Query which is called regularly on the table ===========================

SELECT [User] = CASE
WHEN LEFT(clientusername,3) = 'domain' THEN RIGHT(clientusername,LEN(clientusername) - 3)
ELSE clientusername
END,
logtime AS [Date],
desthost AS [Site],
uri AS [Actual Site]
FROM webproxylog
WHERE CONVERT(Datetime,CONVERT(VarChar(25),logtime,106),106) BETWEEN '20 apr 2008' AND '14 may 2008'
AND(RIGHT(uri,4) NOT IN('.css','.jpg','.gif','.png','.bmp','.vbs'))
AND (RIGHT(uri,3) NOT IN('.js'))
AND LEFT(mimetype,6) = 'text/h'
AND (uri NOT LIKE '%sometext.local%')
AND (uri NOT LIKE '%sometext.co.uk%')
AND [action] = 9
AND (clientusername IN ('USERNAME'))
ORDER BY logtime ASC;
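One thing that stands out (a sketch of an alternative predicate only): wrapping logtime in CONVERT on both sides makes the date filter non-sargable, so no index on logTime can be used for a seek. Comparing the column directly to the day boundaries keeps the same rows but leaves the predicate index-friendly:

-- same date range as the CONVERT version: 20 Apr 2008 00:00 up to, but not including, 15 May 2008 00:00
SELECT COUNT(*)
FROM webproxylog
WHERE logtime >= '20080420'
  AND logtime <  '20080515';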





PS There are 60,078,605 rows in the table

Please help!

Many Thanks


Data Access To Large Tables In Sql 2005

Mar 1, 2007

hi all,

I have a large table in SQL Server 2005 (it has about 6 columns and 10 million records).

I need to work in a linear way on all the records (I know it sounds dumb, but I need to work on all of them).

Now, obviously, when trying to work on this table SQL Server gets stuck with a timeout or something like that...

I've noticed that a simple query like "select top 100 * from ExportTable" still works.

Is there any way to have SQL send me the data as it accesses it, so that I can still process it at the same time? I basically work using a DataSet, so fixing the timeout won't be helpful, since Windows probably won't allow me to load this amount of data into memory.
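One way to stay linear without a timeout or one giant DataSet (a sketch; it assumes the table has a unique, indexed integer key, here called RowID) is to pull the rows in key-ordered chunks and remember where the previous chunk ended:

DECLARE @LastID int;
SET @LastID = 0;   -- the client keeps this value between calls

-- repeat this query, passing the highest RowID of the previous batch back in as @LastID
SELECT TOP 10000 RowID, Col1, Col2, Col3, Col4, Col5   -- placeholder column names
FROM dbo.ExportTable
WHERE RowID > @LastID
ORDER BY RowID;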



Can anyone help?

Z


Read Large Binary Data From Sql Server 2005

Jul 14, 2007

Hi. I've followed a tutorial on how to write and read varbinary(max) data to and from a database. But when I try to read the data I get an error that the data would be truncated, but only when the varbinary(max) is greater than 8 KB. I've used a system stored procedure (sp_tableoption) to set the table that holds the data to store data outside rows.

To select the data I'm using a stored procedure:

SELECT imageData, MIMEType FROM Pictures WHERE (imageTitle = @imageTitle)

And then using an .aspx page to Response.Write the data:

Using conn As New sql.SqlConnection
    conn.ConnectionString = ConfigurationManager.ConnectionStrings("myConnectionString").ToString
    Dim getLogoCommand As New sql.SqlCommand
    getLogoCommand.CommandType = Data.CommandType.StoredProcedure
    getLogoCommand.CommandText = "GetPicture"
    getLogoCommand.Connection = conn
    Dim imageTitleParameter As New sql.SqlParameter("@imageTitle", Data.SqlDbType.NVarChar, 200)
    imageTitleParameter.Value = Request("imageTitle")
    imageTitleParameter.Direction = Data.ParameterDirection.Input
    getLogoCommand.Parameters.Add(imageTitleParameter)
    conn.Open()
    Using logoReader As sql.SqlDataReader = getLogoCommand.ExecuteReader
        logoReader.Read()
        If logoReader.HasRows = True Then
            Response.Clear()
            Response.ContentType = logoReader("MIMEtype").ToString()
            Response.BinaryWrite(logoReader("imageData"))
        End If
    End Using
    conn.Close()
End Using

Can anyone please help me with this?!


Access One Billion Large Image Records In Ms-SQL 2005

Jan 31, 2008

hi,
I want to display the image records for each employee, with 10 images per employee; the number of records is more than one billion.
Please provide an optimized query that I can use in a stored procedure and display in a GridView.


Large SQL Update - Effect On SQL 2005 Transactional Replication

Feb 1, 2007

I'm a newbie to Replication and recently setup the following.

Publisher and Distributor are on the same SQL 2005 server, and I've got 7 subscribers (SQL 2000 servers) using push subscriptions. I'm replicating 5 SQL tables which don't have too many changes, and the subscriptions are scheduled to run every 3 hours. In a few days a large one-off SQL update will add an additional 10,000 rows to one of the replicated tables. I was wondering what impact this would have on the above setup, i.e. whether there are any sort of limitations here. I'm assuming not, but thought I would check. I'm thinking it will just cause additional overhead on the server, but the update is being applied when no users will be using the database.

Any feedback greatly appreciated.

Thanks


SQL Server 2005 Slows Down After A Large Number Of Queries

Jul 4, 2007

Hi,

We are running SQL Server 2005 Enterprise Edition with SP2 on a Windows 2003 Enterprise Server SP2 box with an Intel E6600 dual-core CPU and 4 GB of RAM. We have a C# application which performs a large number of calculations in a loop. The application first loads the transactions that need to be updated, then goes to each row, queries another table to get some values, and updates the transaction.

I have set a limit of 2 GB of RAM for SQL Server, and when I run the application it performs 5 record updates (the process described above) per second. After roughly 10,000 records, the application slows down to about 1 record per second. I have tried to examine the Activity Monitor, however I can't find anything that might indicate what's causing this.

I have read that there are some known issues with hyper-threaded CPUs; however, since my CPU is dual-core, I do not know if the issue applies to those CPUs too, and I have no way to disable one core in the BIOS.

The only thing that I have noticed is that if I change the Max Degree of Parallelism when the server slows down (i.e. from 0 to 1 and then back to 0), the server speeds up for another 10,000 record updates and then slows down again. Does anyone have an idea of what's causing it? What does the property change do that makes the server speed up again?

If there is no solution for this problem, does anyone know if there is a stored procedure or anything else that can be used programmatically to speed up the server when it slows down? (This is not the optimal solution, however I will use it as a workaround.)
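For the workaround part: the setting being toggled is the instance-wide 'max degree of parallelism' option, and the same change can be scripted (a sketch; it affects the whole instance, not just this workload):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'max degree of parallelism', 1;   -- what the manual change to 1 does
RECONFIGURE;

EXEC sp_configure 'max degree of parallelism', 0;   -- back to the default (use all CPUs)
RECONFIGURE;

Reconfiguring this option also flushes the plan cache, which may well be the real reason the workload speeds up again afterwards.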

Any advice will be greatly appreciated.

Thanks,
Joe


Best Way To Insert Large Amounts Of Data From A Webform To SQL Server 2005

Oct 21, 2007

Hi,

I have a VB.net web page which generates a DataTable of values (3 columns and, on average, about 1000-3000 rows). What is the best way to get this data table into SQL Server? I can create a table on SQL Server no problem, but I've found that simply looping through the DataTable and doing 1000-3000 insert statements is slow (a few seconds). I'd like to make this as streamlined as possible, so I was wondering if there is a native way to insert all the records in a batch via ADO.net or something.

Any ideas?

Thanks,
Ed


SQL SERVER 2005 Database Mirroring For Large Number Of Databases

May 30, 2006

I am trying to enable database mirroring for 100 databases. It goes error-free until 59 databases (sometimes 60) have the status (principal, synchronized) on the principal; on the 60th or 61st database it gives the status (principal, disconnected). The mirror also starts acting abnormally: connections to the mirror start to give connection timeouts, and it will not enable database mirroring on any more databases. I have SQL Server 2005 Enterprise with SP1 on the servers; a witness is not included yet.

These are my test servers... I have more than 500 databases on my production servers.

principal and mirror both are using port 5022 for ENDPOINT communication.


SQL SERVER 2005 DATABASE MIRRORING For Large Number Of Databases

Jun 1, 2006

I am trying to enable database mirroring for 100 databases. It goes error-free until 59 databases (sometimes 60) have the status (principal, synchronized) on the principal; on the 60th or 61st database it gives the status (principal, disconnected). The mirror also starts acting abnormally: connections to the mirror start to give connection timeouts, and it will not enable database mirroring on any more databases. I have SQL Server 2005 Enterprise with SP1 on the servers; a witness is not included yet.

These are my test servers... I have more than 500 databases on my production servers.

principal and mirror both are using port 5022 for ENDPOINT communication.

All of the databases are critical and all must be included in the database mirroring, so after that I tried to implement database mirroring again...

The system has 3 GB of RAM and SQL Server (the mirror) is using 85 MB of RAM, but it still gives this error while trying to enable database mirroring for the 37th database:

"There is insufficient system memory to run this query"

WHY?


SQL 2005 Process Growing Very Large Working With Visual Basic 6

Sep 25, 2006

We recently installed SQL server 2005 on a couple of our servers.  I use Visual Basic 6.0 at the moment and use ADO to connect to our various SQL servers.

I recently discovered on one of the new servers that every time my program runs (every 4 minutes, 12 hours a day), the SQL process shown in Task Manager grows by 1-10 MB.

The SQL process was at 776,912K when I rebooted this afternoon.  It started back up at 106,120K.

I am not doing anything differently than I did when my programs were talking to SQL 2000, and I have never seen this memory leak issue.  Is there something extra I need to do in SQL 2005 to finish/clear these SQL queries and not bog down SQL's memory?







An example of how I would connect and do a SQL transaction:

Dim cn As ADODB.Connection
Dim rs As ADODB.Recordset

Set cn = New ADODB.Connection
Set rs = New ADODB.Recordset

cn.Open strConnect

select1 = "select firstName, lastName from clients"
rs.Open select1, cn, adOpenKeyset, adLockOptimistic

If rs.EOF = False Then
    rs.AddNew
End If

rs!firstName = Trim(Text1(0))
rs!lastName = Trim(Text1(1))

rs.Update

rs.Close
cn.Close

At the end of the program's run I would:

Set cn = Nothing

Set rs = Nothing
