How To Transfer Large Numbers Of Records Programmatically?

Nov 26, 2007

Hello,

I have a VB.NET Windows Application with a SQL 2K5 back end.

The application creates a view which produces approximately 5 million records. I am attempting to transfer these records to a table programmatically, but every attempt causes a timeout. The name of the source server and view will change depending on the client's installation.

Using the standard SQL connection/command objects with an "INSERT INTO Table SELECT * FROM View" statement doesn't work at all.

What is the best way to transfer these records programmatically?

Thanks,
-Torrwin
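Two things usually fix this. First, the .NET SqlCommand defaults to a 30-second CommandTimeout; setting it to 0 removes the limit. Second, copying in batches keeps any single statement short. A minimal T-SQL sketch, assuming hypothetical names (dbo.TargetTable, dbo.SourceView) and that the view exposes a unique ID column:

    DECLARE @BatchSize int;
    SET @BatchSize = 50000;

    WHILE 1 = 1
    BEGIN
        -- Copy the next batch of rows the target does not have yet.
        INSERT INTO dbo.TargetTable (ID, Col1, Col2)
        SELECT TOP (@BatchSize) v.ID, v.Col1, v.Col2
        FROM dbo.SourceView AS v
        WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.ID = v.ID);

        IF @@ROWCOUNT = 0 BREAK;   -- nothing left to copy
    END

Since the source server and view names differ per client installation, the statement can be composed as a string and run with sp_executesql.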

View 5 Replies



Working With Large Numbers

Nov 23, 2005

I need to compute factorials, but I hit a limit around 170!. Is there any way to handle numbers larger than the float data type can handle? Thanks.
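One standard workaround within T-SQL is to carry log10(n!) instead of n! itself, summing logarithms, and report the result as a mantissa/exponent pair. A sketch:

    -- Compute n! as mantissa * 10^exponent via a running log10 sum.
    DECLARE @n int, @i int, @log10fact float;
    SET @n = 1000;
    SET @i = 2;
    SET @log10fact = 0;

    WHILE @i <= @n
    BEGIN
        SET @log10fact = @log10fact + LOG10(@i);
        SET @i = @i + 1;
    END

    SELECT POWER(10e0, @log10fact - FLOOR(@log10fact)) AS mantissa,
           FLOOR(@log10fact) AS exponent10;   -- 1000! is about 4.0239 * 10^2567

If exact big-integer results are required, no built-in numeric type will hold them past 170!; that calls for an arbitrary-precision routine outside T-SQL (for example a SQLCLR function on SQL Server 2005).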

View 3 Replies View Related

Documenting Large Numbers Of SPs

Jul 27, 2007

Hi All,
I have a large number of SPs that I would like to be able to document and provide this documentation to prospective clients. That is, provide them enough information without giving them the source code for the procedures.

I have found that all the parameters are in the sys.parameters catalog view.

But I was wondering. Are the fields that are sent back out of an SP captured and recorded somewhere in the SQL Server catalog?

Is there an easy way to find out what fields are coming out of an SP?

Thanks in Advance

Peter
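SQL Server's catalog does not record a procedure's output columns. One way on SQL Server 2005 to capture the result-set shape without running the data logic is SET FMTONLY, which makes the procedure return only column metadata. A sketch with a hypothetical procedure name and parameter:

    SET FMTONLY ON;
    EXEC dbo.usp_GetCustomerOrders @CustomerId = NULL;   -- returns columns, no rows
    SET FMTONLY OFF;

The column names and types that come back can then be documented alongside the parameter list pulled from sys.parameters.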

View 5 Replies View Related

Dealing With Large Numbers Of Parameters

Sep 11, 2007

Hi there,

I am getting a headache trying to research what to do when you have a large number of parameters to include in a query. For example, if I have a large number of checkboxes for the user to pick criteria for a report and they select several, I'm assuming it would be bad practise to say:

WHERE Field = "a" OR Field = "b" OR Field = "c" OR Field = "d" OR Field = "e" OR.....etc etc etc

Is there a good solution for this, given that the number of parameters may vary dramatically depending on what the user selects to include in a report?!

I'm running SQL Server 2000 with an ASP front end.

Any help would be greatly appreciated!

Thanks in advance!

Matt
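One common answer on SQL Server 2000 is to pass the selections as a single delimited string and split it server-side with a table-valued function, then use IN against the result. A sketch with hypothetical names (Field is from the example above):

    CREATE FUNCTION dbo.SplitList (@List varchar(8000))
    RETURNS @Items TABLE (Item varchar(100))
    AS
    BEGIN
        -- Walk the comma-delimited string, emitting one row per item.
        DECLARE @Pos int;
        WHILE LEN(@List) > 0
        BEGIN
            SET @Pos = CHARINDEX(',', @List);
            IF @Pos = 0
            BEGIN
                INSERT @Items VALUES (@List);
                SET @List = '';
            END
            ELSE
            BEGIN
                INSERT @Items VALUES (LEFT(@List, @Pos - 1));
                SET @List = SUBSTRING(@List, @Pos + 1, 8000);
            END
        END
        RETURN;
    END

The ASP page then concatenates the checked values into one parameter:

    SELECT *
    FROM MyTable
    WHERE Field IN (SELECT Item FROM dbo.SplitList('a,b,c'));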

View 9 Replies View Related

Large Numbers Of Orphaned/expired Requests

Mar 30, 2007

We are experiencing a situation where the SRS (2000 SP2) report server will no longer render reports. In the log file, there are many instances of

w3wp!runningjobs!434!3/23/2007-10:12:57:: i INFO: Adding: 8 running jobs to the database
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsExpired; found expired request

What could be causing this? The reports are making queries through an OLE DB provider. There are no scheduled jobs, and the load doesn't seem that heavy.

View 1 Replies View Related

Efficiently Creating Random Numbers In Very Large Table

Jan 19, 2007

Hello,

I need to sample data in a very large table in SQL Server 2000 (a gazillion rows of Performance Monitor statistics).

I'd like to take the top 5%, for instance, based upon a column containing random numbers.

Can anyone suggest a highly efficient method of populating a column with random numbers?

Thanks in advance.

Rod
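One catch on SQL Server 2000: RAND() is evaluated once per statement, so an UPDATE with it gives every row the same value. The usual workaround is to hash NEWID(), which is generated per row. A sketch with hypothetical table and column names:

    -- Populate the column with a per-row pseudo-random value.
    UPDATE dbo.PerfMonStats
    SET RandomVal = ABS(CHECKSUM(NEWID())) % 1000000;

    -- Then take the sample.
    SELECT TOP 5 PERCENT *
    FROM dbo.PerfMonStats
    ORDER BY RandomVal;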

View 10 Replies View Related

I Want To Transfer ONLY New Records AND Update Any Modified Records From Oracle Into SQL Server Using DTS

Jul 23, 2005

I need a little help here. I want to transfer ONLY new records AND update any modified records from Oracle into SQL Server using DTS. How should I go about it?

a) How do I use a global variable to get the max date? Where and what DTS task should I use to complete the job - a Data Driven Query or a Transform Data task? How? Can you give me samples, or perhaps email me the demo package?

b) So far, what I did was:
- I have a datemodified field in my Oracle table, so that I can compare it with the datelastrun of my DTS package to get new records
- records in Oracle having datemodified > max(datelastrun) get transferred to the SQL Server table

Now I am stuck as to where I should proceed - how can I transfer these records? Hope you can give me some light. Thank you in advance.
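Whatever the task choice, the underlying pattern is a watermark: persist the last-run date, pull only rows newer than it, and advance it on success. A T-SQL sketch of the control-table side, with hypothetical names (a Data Driven Query task can then route each incoming row to an INSERT or an UPDATE depending on whether its key already exists):

    -- Read the high-water mark for this package.
    DECLARE @LastRun datetime;
    SELECT @LastRun = LastRunDate
    FROM dbo.PackageControl
    WHERE PackageName = 'OracleIncrementalLoad';

    -- The package's source query then filters on it, e.g.:
    --   SELECT ... FROM OracleTable WHERE datemodified > @LastRun

    -- On success, advance the watermark. Using MAX(datemodified) from the
    -- loaded rows is safer than GETDATE() if the two servers' clocks differ.
    UPDATE dbo.PackageControl
    SET LastRunDate = GETDATE()
    WHERE PackageName = 'OracleIncrementalLoad';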

View 2 Replies View Related

SQL Server 2014 :: Index Dates To Numbers With A Large Data Set?

Jun 16, 2015

I am trying to index dates to numbers with a large data set.

The first column is Index, the next is FactorS, the next is Value, the next is Date, and the last is Lag.

Would it be difficult to write code that determines the lag values? The lag value is based on the date value (see the sketch after the sample data).

Index FactorS Value Date Lag
1 XYZ 2.3 12/31/2014 1
2 XYZ 1.4 12/30/2014 2
3 XYZ 3.3 12/29/2014 3
4 ABC 1.8 12/31/2014 1
5 ABC 2.2 12/30/2014 2
6 CBA 1.7 12/31/2014 1
7 CBA 1.8 12/30/2014 2
8 CBA 1.9 12/29/2014 3
9 CBA 2.1 12/28/2014 4
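A sketch that reproduces the Lag column shown above: number the rows per factor, newest date first. It assumes a table named dbo.Factors with these columns; if calendar gaps between dates must widen the lag, a DATEDIFF from each factor's maximum date would replace the ROW_NUMBER.

    SELECT [Index], FactorS, Value, [Date],
           ROW_NUMBER() OVER (PARTITION BY FactorS
                              ORDER BY [Date] DESC) AS Lag
    FROM dbo.Factors
    ORDER BY [Index];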

View 9 Replies View Related

Viewing Large Numbers Of Stored Procs In SQL Server Management Studio

Jul 13, 2007

I work with a large and complex reporting system with several hundred reports: the Programmability > Stored Procedures node of the Object Explorer has become very difficult to navigate.



Is there any metadata that can be embedded in the stored procs that would create subfolders like the existing System Stored Procedures node in this node of the object explorer?



I suspect that the correct answers are:

Rename all your queries with a rational naming convention;
Cull the deadwood;
Assign them to categories then export them to separate databases.

Unfortunately, one of these has already been done, and the other two will break several hundred dependent processes - the recoding and retesting is neither economical nor desirable.



Still, all advice is welcome. Pitch your answers at a banking geek who does intermediate to advanced stored procedures and triggers, but isn't allowed to play with sharp things (like sys objects) - but I can probably get help from a grown-up on the sysadmin team: I have discovered that the rumours about human sacrifice are baseless, and they will perform favours in return for beer.



This is also a good time to ask: just how many stored procs and functions are you allowed in SQL Server 2005?



Nile.

View 5 Replies View Related

Modifying Records In SQL Database Programmatically

Jan 23, 2008

VWD 2005 Express.  Visual Basic.  SQL Server 2005.
I know how to set up SqlDataSources and their insert, delete, update, and select commands.  I also have code for querying a Sql table and populating a dataset and scanning the dataset for values.  However, I do not know how to modify records in a Sql table programmatically.  Here is what I need to do:
I need to open a Sql table.
I need to process record by record to check a particular field for a particular value.
If the field has the particular value, I need to change it and write the record back to the table with the modification.
The table is named "SystemUser" and the field is called "SystemUserTypeId."
Can anyone provide me with sample code that would accomplish this?  Thanks for the help.
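Unless the business rule truly needs per-row logic, the record-by-record scan isn't necessary: a single set-based UPDATE performs the check and the change in one statement. A sketch where 1 and 2 stand in for the actual old and new type IDs:

    UPDATE dbo.SystemUser
    SET SystemUserTypeId = 2      -- the new value
    WHERE SystemUserTypeId = 1;   -- the value being searched for

From VB.NET this is one ExecuteNonQuery call on a SqlCommand, with no DataSet round trip.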

View 5 Replies View Related

Inserting Records Using The Details View Or Programmatically

Aug 26, 2006

I'm a new user to VWD. If I use a DetailsView control on my page, I have noticed that the "New" link is not visible unless there is at least one record in the table. Is there any way of making it visible when there aren't any records?

My web pages are currently hosted at vwdhosting. I have uploaded my database with the record structure onto the web site and I am using a remote connection string to access it. I have had users updating data in another table on the remote database. If I add records to my new table locally and upload the database to the remote site, all the data that my users have been adding will be lost. So, if I can't add my first record using a control on my web page when there are no records in the table, should I be doing it programmatically? If so, how?

Thanks,
Julie

View 2 Replies View Related

Find Rows With Records Containing More Than Two Numbers After Decimal

Jan 22, 2014

How do I find the rows that have more than two decimal numbers after the decimal point, for example 2.787686?

I am trying :

select amount, LEN(AMOUNT) - CHARINDEX('.', amount) as DecimalCount from GL_REPORT

but it is not working.
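The LEN/CHARINDEX approach breaks down because converting a numeric amount to a string drops trailing zeros, and whole numbers have no '.' at all, making CHARINDEX return 0. A more direct test: a value has more than two decimal places exactly when rounding it to two changes it. A sketch, assuming amount is a decimal or money column:

    SELECT amount
    FROM GL_REPORT
    WHERE amount <> ROUND(amount, 2);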

View 14 Replies View Related

T-SQL (SS2K8) :: Select Query With Records And Sequential Numbers

Apr 5, 2014

I have a problem. In my database I have the following numbers available:

101
104
105
110
111
112
113
114

What I need is a select query that returns each record with the count of sequential numbers following it, like:

101 0
104 1 (the number 105)
105 0
110 4 (the numbers 111,112,113,114)
111 3 (the numbers 112,113,114)
112 2 (the numbers 113,114)
113 1 (the numbers 114)
114 0

How can I do it?
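One way on SS2K8 is gaps-and-islands: within a run of consecutive numbers, the number minus its row number is constant, so grouping on that difference identifies each run, and each member can count how many others follow it. A sketch assuming a table dbo.Numbers(Num int):

    ;WITH Islands AS (
        SELECT Num,
               Num - ROW_NUMBER() OVER (ORDER BY Num) AS Grp
        FROM dbo.Numbers
    )
    SELECT Num,
           COUNT(*) OVER (PARTITION BY Grp)
               - ROW_NUMBER() OVER (PARTITION BY Grp ORDER BY Num) AS Followers
    FROM Islands
    ORDER BY Num;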

View 2 Replies View Related

Is There A Hack That Would Allow You To Reuse Identity Numbers That Were Orphaned By Deleted Records?

Aug 24, 2006

I guess this is a fairly common topic but couldn't find the right words to find anything in a search.

What I'm getting at, is there any tsql functions or combination of commands for the following.

You have identity columns in your tables with a seed and auto-increment set. Say I enter rows 1-10 and then delete 4, 6, 7, and 8.

My next new record uses 11. Is there any logic that allows you to check for and reuse 4, 6, 7, and 8 as described above? I'm not looking for something that requires creating an extra ID table for each table and managing the next available number every time an insert or delete happens.

Thanks.
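There is no built-in reuse; identity gaps are permanent by design. A gap can be found with a self-anti-join and filled explicitly under SET IDENTITY_INSERT, though this is not concurrency-safe without serializable locking. A sketch with hypothetical names:

    -- Lowest unused ID: the smallest ID whose successor is missing.
    -- (Returns NULL on an empty table and misses a gap below the minimum.)
    DECLARE @FreeID int;
    SELECT @FreeID = MIN(t.ID) + 1
    FROM dbo.MyTable AS t
    WHERE NOT EXISTS (SELECT 1 FROM dbo.MyTable AS t2 WHERE t2.ID = t.ID + 1);

    SET IDENTITY_INSERT dbo.MyTable ON;
    INSERT INTO dbo.MyTable (ID, Col1) VALUES (@FreeID, 'value');
    SET IDENTITY_INSERT dbo.MyTable OFF;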

View 4 Replies View Related

Need Help With Transferring Many Records Between Tables

Jan 27, 2008

Hi!
I have 2 tables (both have the same structure):
ID -> bigint (identity, not for replication, primary key)
Url -> nvarchar(1000)
MainUrl -> nvarchar(1000)

Tbl1 contains about 0.5 million records, and tbl2 about 1 million.
What I need is to copy records from tbl2 to tbl1. But records in tbl1 are unique, and that can't change (only "Url" must be unique; ID is as well, but it's automatic). How can I do this in a fast way? Right now I'm using a SELECT for each record in tbl2 to see if it exists in tbl1, but it's a bit slow... Is there any faster method? (One thing: I'm a beginner with databases, so I wrote a VB application to transfer the records. How can I do it using only Microsoft SQL Server?)
--------------
I forgot to write: I'm using MS SQL 2005.
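The per-record SELECT from VB is what makes it slow; the whole copy can run server-side as one set-based statement. A sketch using the posted table and column names (ID is the identity, so it is omitted from the column list):

    INSERT INTO tbl1 (Url, MainUrl)
    SELECT t2.Url, t2.MainUrl
    FROM tbl2 AS t2
    WHERE NOT EXISTS (SELECT 1 FROM tbl1 AS t1 WHERE t1.Url = t2.Url);

An index on Url would make the existence check cheap, though nvarchar(1000) exceeds SQL Server's 900-byte index key limit; the usual workaround is to index a computed CHECKSUM(Url) column and compare both the checksum and the Url.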

View 7 Replies View Related

How To Transfer Access Data Records To SQL?

May 8, 2008

Hi,

I am facing a big problem with data transfer from Access to SQL. The problem is that I want to transfer about 5 million (50 lakh) records from Access to SQL. On my machine, where SQL Server is fully installed, there is no problem and it transfers all the data in 5 to 7 minutes. The problem is at the client side: we do not install the full SQL Server, we install the SQL runtime that is available with InstallShield. Whenever I start to transfer data, another exe called DLLHost.exe runs and eats about 70% to 75% of the system memory. I use a SQL OPENROWSET statement, and as soon as that statement fires on a machine where only the runtime is installed, it starts to eat system memory, so instead of transferring all the data in 5 to 10 minutes it takes more than 6 hours because DLLHost.exe eats the memory. Is there any better way to transfer these records faster?


Please help me resolve this problem.

Thanks for the help.
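For reference, the Jet form of OPENROWSET the post describes looks roughly like this (path and names hypothetical):

    INSERT INTO dbo.TargetTable (Col1, Col2)
    SELECT Col1, Col2
    FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                    'C:\Data\Client.mdb'; 'Admin'; '',
                    'SELECT Col1, Col2 FROM AccessTable');

DLLHost.exe is the COM surrogate process, so the memory is most likely going to the out-of-process Jet provider on the runtime-only machines. Exporting the Access data to a flat file and loading it with BULK INSERT or bcp avoids the Jet provider on the server entirely.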

View 1 Replies View Related

Inserting Large Number Of Records

Jun 20, 2006

Hi!

The DB I am working with has about 10 tables and some of the tables have 200,000 to 500,000 records. All tables have a clustered index on the primary key.

I think performance during INSERT could be better - I add thousands of records at a time from many connections.

Is there a way to defer the update of indexes, so that I can update the tables and then let the indexes regenerate?

thanks,

Jas.
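Index maintenance can't literally be deferred, but on SQL Server 2005 the nonclustered indexes can be disabled for the load and rebuilt afterwards, which has the same effect. A sketch with hypothetical index and table names; the clustered index must stay enabled, or the table becomes inaccessible:

    ALTER INDEX IX_Orders_CustomerId ON dbo.Orders DISABLE;

    -- ... run the bulk inserts from all connections here ...

    ALTER INDEX IX_Orders_CustomerId ON dbo.Orders REBUILD;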

View 4 Replies View Related

Formatting Numbers In A Mixed Column (Numbers In Some Cells, Strings In Other Cells) In Excel As Numbers

Feb 1, 2007

I have a report with a column which contains either a string such as "N/A" or a number such as 12. A user exports the report to Excel. In Excel the numbers are formatted as text.

I already tried converting the value with CDbl, which returns an error for the cells containing a string.

The requirement is to export the column to Excel with the numbers formatted as numbers and the strings such as "N/A" in the same column as strings.

Any suggestions?



View 1 Replies View Related

It Takes 2 Seconds To Transfer 1000 Records Using DTS.

Mar 27, 2001

I am trying to transfer 90 million records/250 bytes row length from oracle 8i to sqlserver 2000
using DTS, and it is taking 2 seconds to transfer 1000 records. Is there any way I can transfer 90 million records fast at all? This will take more than 10 hours.

Thanks,
Ranjan
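Per-row DTS transforms are the usual bottleneck at this volume. One common alternative is to export the Oracle side to a flat file and load it with BULK INSERT, which can take a minimally logged path. A sketch with hypothetical file and table names:

    BULK INSERT dbo.TargetTable
    FROM 'D:\loads\oracle_export.dat'
    WITH (
        FIELDTERMINATOR = '|',
        ROWTERMINATOR = '\n',
        BATCHSIZE = 50000,   -- commit every 50,000 rows
        TABLOCK              -- needed for minimal logging
    );

Within DTS itself, making sure the destination uses the fast-load option and a larger insert batch size is the first thing to check.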

View 7 Replies View Related

Efficient Way To Transfer Huge Amount Of Records

Sep 28, 2006

Hi All,

I used a data flow task, and when trying to transfer data from an OLE DB source (about 7.5 million records) to an OLE DB destination, SSIS fails partway through with an error saying the transaction log is full and to try again after clearing it.

My question is: what is the most efficient way to transfer, say, more than 5 million records while ensuring the process doesn't fail in the middle?

Thanks in Advance,

Mithun.

View 5 Replies View Related

Indexes - Inserting Large Amounts Of Records

Feb 25, 2007

Is there any way to create indexes after the insertion of a batch of records? (I don't want the index updated after every inserted record, but, for example, after every 1000 records.)



I heard it should be possible with "bulk insert" or with transactions. Is that right? I need to do this with MS SQL Server 2005 (Workgroup Edition).



 Thank you for your ideas!

Jan

View 5 Replies View Related

Error While Handling Large Number Of Records

Jan 29, 2008

Hi. I have created a stored procedure which handles global updates. I am using a cursor to fetch one row at a time for updating (required to implement the business logic). Now when I execute this stored procedure it gives me a deadlock error, and I don't know why (approx. 150,000 rows). If I remove unnecessary records from the table (approx. 50,000) it works fine. Is there any way to handle this? I am calling this stored procedure from my Windows service. Please give me a good solution if possible.
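If the business logic can be expressed in set terms, replacing the cursor with a batched UPDATE usually removes both the slowness and most of the deadlock exposure, since each batch holds its locks only briefly. A sketch with hypothetical table and column names (UPDATE TOP requires SQL Server 2005 or later; on 2000, SET ROWCOUNT serves the same purpose):

    WHILE 1 = 1
    BEGIN
        UPDATE TOP (10000) dbo.Accounts
        SET Status = 'Processed'
        WHERE Status = 'Pending';

        IF @@ROWCOUNT = 0 BREAK;   -- all qualifying rows handled
    END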

View 3 Replies View Related

Count Distinct In Large Table With 30M Records?

Jul 7, 2015

I have a task to count distinct records in a big table with roughly 30M records; performance is an important factor. The query is to be written to calculate weekly stats, and the weekly record count could be as high as 1M.

The actual result is like:
ID          Policy
350235744   Credit Cards
350235744   PCI
350235744   PCI Audit

So the final number for this particular Policy is 3

I can write the query like:

select count(distinct Incident_id) policy_name
from Reporting_DailyDlpDetail
Where (year(INSERT_DETECT_TS)=2015) and (month(INSERT_DETECT_TS) =6) and (day(INSERT_DETECT_TS) between 2 and 9)
This returns 526254 and costs 11 seconds to complete

or a query like:
Select distinct Incident_id, policy_name
from Reporting_DailyDlpDetail
Where (year(INSERT_DETECT_TS)=2015) and (month(INSERT_DETECT_TS) =6) and (day(INSERT_DETECT_TS) between 2 and 9)
This returns 749687 and costs roughly 1 minute to complete.

The two queries return different results; I believe the latter gives the correct number. How can I count distinct values based on a combination of columns? And considering the size of the data, what is the best and most efficient way to run the stats calculation against over 30 different scenarios (different policies and alert types) without timing out?
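Counting distinct combinations can be done in one pass with a derived table. Rewriting the date filter as a range also lets an index on INSERT_DETECT_TS be used, which wrapping the column in year()/month()/day() prevents. A sketch using the posted names:

    SELECT COUNT(*) AS combo_count
    FROM (
        SELECT DISTINCT Incident_id, policy_name
        FROM Reporting_DailyDlpDetail
        WHERE INSERT_DETECT_TS >= '20150602'
          AND INSERT_DETECT_TS <  '20150610'   -- June 2 through June 9, 2015
    ) AS d;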

View 9 Replies View Related

Unable To Load Report With Large No. Of Records

Nov 15, 2007

Hi,
We need to generate a report using SQL Server 2005 Reporting Services which can have 1 million plus records. So we have created a report with a simple select query. This query when run against our database returned 1110018 records.
We deployed the report onto our server and tried accessing it through its URL. What we observed was that the report ran for about 15 minutes and then prompted for user ID and password (Windows authentication). On giving the user ID and password it continued running for another 15 minutes, and at the end of it the browser again prompted for user ID and password. This cycle continued twice, and at the end of it we got the message "The page cannot be displayed" (i.e., the usual message displayed by the browser when it cannot load a web page). We were not able to load the report at all.
Are there any specific considerations when loading reports with such a high number of records? Any suggestions or solutions are welcome.

Thanks,
CodeKracker.

View 4 Replies View Related

Passing Large Amount Of Records As Single Parameter?

Jan 18, 2008

Hello all - I currently have a project that has a gridview on the front end. The user can select multiple items from this grid, hit a button, and all of the selected records should be updated in the process. However, this result set can have a large amount of data coming back, and I'm stumped on how to pass all of the IDs to the SP. I'd rather not call the SP for each record selected, as there could be 1,000 items selected, and well, I'd rather not call the SP 1000 times :p. I thought of generating a comma-delimited list as I loop through the grid and using dynamic SQL, but the IDs are about 6-7 digits long and, including commas, would take up almost all of the max space in a varchar.
 Are there any good solutions to this problem? Passing the items as an array? Generating a data table in .NET and passing that?
 Any help would be appreciated.
-Jaime
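On SQL Server 2005 two common options are a varchar(max) parameter (which removes the 8,000-character ceiling on the delimited list) or an xml parameter, which avoids string parsing and dynamic SQL altogether. A sketch of the xml route with hypothetical object names:

    CREATE PROCEDURE dbo.usp_UpdateSelectedItems
        @IDs xml
    AS
    BEGIN
        -- Shred '<ids><id>123456</id><id>234567</id></ids>' into rows
        -- and update the matching records in one statement.
        UPDATE t
        SET t.Processed = 1
        FROM dbo.Items AS t
        JOIN @IDs.nodes('/ids/id') AS x(n)
            ON t.ItemID = x.n.value('.', 'int');
    END

The page then builds one small XML string while looping through the grid and passes it as a single parameter, so the procedure is called once no matter how many rows are selected.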

View 4 Replies View Related

Access One Billion Large Image Records In Ms-SQL 2005

Jan 31, 2008

hi,
I want to display the image records of each employee, with 10 images per employee; the number of records is more than one billion.
Please provide an optimized query that I can use in an SP and display in a GridView.
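At that scale the only workable pattern is to page on the server, so the grid never requests more than one screenful of rows. A SQL 2005-style ROW_NUMBER sketch with hypothetical table and column names (@StartRow/@EndRow would be procedure parameters fed by the GridView's paging):

    DECLARE @EmployeeId int, @StartRow int, @EndRow int;
    SELECT @EmployeeId = 42, @StartRow = 1, @EndRow = 10;

    WITH Paged AS (
        SELECT ImageId, ImageData,
               ROW_NUMBER() OVER (ORDER BY ImageId) AS rn
        FROM dbo.EmployeeImages
        WHERE EmployeeId = @EmployeeId
    )
    SELECT ImageId, ImageData
    FROM Paged
    WHERE rn BETWEEN @StartRow AND @EndRow;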

View 7 Replies View Related

Error Occurs While Deleting Large Volumes Of Records

Jan 6, 2004

I wrote an ActiveX service to delete records from several tables, where there are more than 100,000 records. When I test it by deleting 1,000 records it is OK, but once the volume increases to 100,000 it gives me an error message saying the operation timed out.

How can I overcome this problem? Please!
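A timeout on a large delete usually means one giant logged transaction (plus a client-side command timeout). On SQL Server 2000, SET ROWCOUNT caps each DELETE so the work commits in small pieces; a sketch with hypothetical names:

    SET ROWCOUNT 10000;              -- cap each delete at 10,000 rows

    WHILE 1 = 1
    BEGIN
        DELETE FROM dbo.AuditLog
        WHERE CreatedDate < '20030101';

        IF @@ROWCOUNT = 0 BREAK;     -- nothing left to delete
    END

    SET ROWCOUNT 0;                  -- restore normal behavior

Raising the ADO connection's CommandTimeout is the other half of the fix if any single batch still runs long.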

View 8 Replies View Related

SQL Server 2012 :: Delete Large Number Of Records?

Sep 8, 2015

I have the following scenario:

SQL database on SQL 2012

A large production table with 15 million records

The table has 3 years of data

New monthly data is being added every month.

New monthly data is loaded, checked, and finally approved after 6 or 7 iterations. Because of these iterations the monthly data set is added, then deleted, then added again several times. Because the table is big, this process takes time; any thoughts on how to make the delete/insert process faster? Keep in mind I cannot change much, because it is a production table and is being accessed by other users for other analysis.

Delete is done based on trx_date which is a year/month combo, like 201508.

The table has monthly sales by customer aggregated.

The table structure is:

CREATE TABLE [dbo].[Sales](
[batch_key] [int] NOT NULL,
[Company_key] [int] NOT NULL,
[customer_key] [char](22) NOT NULL,
[Trx_Date] [int] NOT NULL,
[account] [nvarchar](35) NOT NULL,

[code].....
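Since the monthly set is identified by Trx_Date, one low-impact option is a batched delete, which keeps each transaction (and therefore log growth and blocking) small. A sketch against the posted table; the batch size is a tuning knob:

    WHILE 1 = 1
    BEGIN
        DELETE TOP (50000)
        FROM [dbo].[Sales]
        WHERE Trx_Date = 201508;

        IF @@ROWCOUNT = 0 BREAK;   -- month fully removed
    END

Partitioning the table on Trx_Date would go further still: a month could then be removed or swapped in as a metadata-only partition SWITCH instead of row-by-row deletes.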

View 9 Replies View Related

Complexity Problem With Large Amount Of Records In A Link Table

Feb 1, 2008

A friend reminded me of a problem we tried to solve a few years ago and were unsuccessful. Below is a copy of the email he sent me. We would very much appreciate any ideas from the community. Thanks!

Let's start with a simple schema where you have 4 tables:

View 3 Replies View Related

SQL Server 2005 Full Text Performance With Large Number Of Records

Dec 12, 2007

Hi
We are using the SQL Server 2005 Full Text Service. The data is not huge, but each record is small and there are a large number of records: 35 million now, with 11 GB of data and about 1.6 GB of FT catalog on the table. This is expected to grow to at least 10 times this size. The issue is that FTS takes a very long time to return results when the number of hits (rows) coming back is large for a search. With the same data and catalog, full-text queries for less common words return in a timely manner. The nature of the problem doesn't allow us to return only the top results - we need all the results. So it's not about the size of the data but the number of results returned from FT (as the catalog is inverted). The machine is a dual processor with 4 GB of RAM.
 
I am considering splitting the table, and hence the catalog, and using multiple servers to do full-text searches in smaller catalogs. Is there any other way this issue can be solved?
 
If splitting is the only way, can you give me an idea of the statistical/standard limit on the number of search results/catalog size at which FTS still gives good results?
 
Thanks in advance

View 1 Replies View Related

SQL Server 2012 :: How To Quickly Update / Insert 3M Records In Large Table

Mar 28, 2015

Our system runs a SQL Server 2012 DB with a table (table_a) which has over 10M records. Our system has to receive a data file daily from the previous system, containing approximately 3M updated or new records for table_a. My job is to update table_a with the new data.

The initial solution is:

1 Create a table (table_b) whose structure is the same as table_a's

2 Use BCP to import updated records into table_b

3 Remove outdated data in table_a:
delete table_a from table_a inner join table_b on table_a.key_fields = table_b.key_fields

4 Append updated or new data into table_a:
insert into table_a select * from table_b

In testing, this solution turned out to be very inefficient; step 3 alone costs several hours. How can I improve it?
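On 2012, steps 3 and 4 can be collapsed into a single MERGE, and either way the join will crawl without an index on the key. A sketch using the post's table names; key_fields and some_value are placeholders for the real key and data columns:

    -- An index on the join key is what makes step 3 (or this MERGE) fast:
    -- CREATE INDEX IX_table_b_key ON table_b (key_fields);

    MERGE table_a AS a
    USING table_b AS b
        ON a.key_fields = b.key_fields
    WHEN MATCHED THEN
        UPDATE SET a.some_value = b.some_value
    WHEN NOT MATCHED THEN
        INSERT (key_fields, some_value)
        VALUES (b.key_fields, b.some_value);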

View 9 Replies View Related

SQL Server 2008 :: Making Use Of A Large Transaction File To Delete Records?

Jun 5, 2015

Currently we have a database about 300 GB in size. Because our backup system failed some time in the past, we were left with a transaction log file which grew to about 160 GB. However, our backups are working again and everything is fine. My understanding is that the transaction log file is now practically empty, but its capacity remains 160 GB.

When you delete records, the delete transactions get logged to the transaction log file. My understanding is that when a backup is done, those transactions are discarded from the log.

Could I make use of this relatively large transaction log file and start deleting records without actually adding to the log file's size?

The plan is to delete records from logging tables that are not referenced by any other table, without this increasing the transaction log file. For example, over a period of a few weeks we can delete a chunk of records from a table; then, after a backup has completed, we can delete another chunk, until we have got the table down to the records that we now need. Will this work?

View 2 Replies View Related

Group / Union Statement - Pull Unique Records From A Large Table

Sep 22, 2014

I am trying to use SQL to pull unique records from a large table. The table consists of people with in and out dates. Some people have duplicate entries with the same IN and OUT dates, others have duplicate IN dates but sometimes are missing an OUT date, and some don’t have an IN date but have an OUT date.

What I need to do is pull a report of all Unique Names with Unique IN and OUT dates (and not pull duplicate IN and OUT dates based on the Name).

I have tried 2 statements:

#1:
SELECT DISTINCT tblTable1.Name, tblTable1.INDate
FROM tblTable1
WHERE (((tblTable1.Priority)="high") AND ((tblTable1.ReportDate)>#12/27/2013#))
GROUP BY tblTable1.Name, tblTable1.INDate
ORDER BY tblTable1.Name;

#2:
SELECT DISTINCT tblTable1.Name, tblTable1.INDate
FROM tblTable1
WHERE (((tblTable1.Priority)="high") AND ((tblTable1.ReportDate)>#12/27/2013#))
UNION SELECT DISTINCT tblTable1.Name, tblTable1.INDate
FROM tblTable1
WHERE (((tblTable1.Priority)="high") AND ((tblTable1.ReportDate)>#12/27/2013#));

Both of these work great... until I add the OUT date. Once it starts to pull the OUT date, it also pulls all those who have a duplicate IN date where the OUT date is missing.

Example:

Name         IN         OUT
John Smith   1/1/2014   1/2/2014
John Smith   1/1/2014   (blank)

I am very new to SQL and I am pretty sure I am missing something very simple... Is there a statement that can filter to ensure no duplicates appear in the query?
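The posted queries use Access-style syntax (# date literals and double-quoted strings); in T-SQL those become single-quoted literals. As for the duplicates: DISTINCT cannot prefer one version of a row over another, but ROW_NUMBER can keep exactly one row per Name and IN date, preferring the row that has an OUT date. A sketch, assuming the OUT column is named OUTDate:

    SELECT Name, INDate, OUTDate
    FROM (
        SELECT Name, INDate, OUTDate,
               ROW_NUMBER() OVER (
                   PARTITION BY Name, INDate
                   ORDER BY CASE WHEN OUTDate IS NULL THEN 1 ELSE 0 END
               ) AS rn
        FROM tblTable1
        WHERE Priority = 'high'
          AND ReportDate > '20131227'
    ) AS x
    WHERE rn = 1
    ORDER BY Name;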

View 1 Replies View Related






