Are Data Sets Memory Efficient?

Feb 8, 2008



Hi,

I have designed a contact manager with Data Grid Control bound to a Data Set.

When the application closes, the data from the Data Set is written to an XML file, and when the application opens, the data from the XML file is loaded into the Data Set and shown in the Data Grid control.

Contacts in my application can exceed 1,000.
So, is a Data Set capable of handling that much data efficiently in memory?




Please advise




Most Efficient Way Of Returning Data

Feb 22, 2007

I have a number of tables, and I need to create a summary of the data. Now I have two choices. I could create a stored procedure that creates a temp table with the filters applied. This would require 12 selects for each year (we have 3 years of data so far), which means 36 selects so far to fill the temp table. The second choice is to create a table with the required columns and rows. As I am essentially cross-joining 4 tables, the table would have about 10 x 10 x 12 x A rows, where A is a variable that will grow quickly. I could then keep this table up to date using triggers on each insert, and all I would need to do is use an aggregate function with the relevant filter. My question is: which is more efficient, the stored procedure that creates the table dynamically, or a table with thousands of rows and an aggregate function over it? PS: I am using SQL Server 2000.

Jag
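A third option worth weighing, hedged because it depends on schema and edition: SQL Server 2000 can maintain the aggregate for you through an indexed view, replacing hand-written triggers. The sketch below uses hypothetical names (dbo.SalesDetail with non-nullable SaleYear, Region, and Amount columns); note that only Enterprise Edition uses the materialized view automatically, while other editions need the NOEXPAND hint when querying it.

CREATE VIEW dbo.vSalesSummary
WITH SCHEMABINDING
AS
SELECT SaleYear,
       Region,
       SUM(Amount)  AS TotalAmount,   -- the summed column must be NOT NULL
       COUNT_BIG(*) AS RowCnt         -- COUNT_BIG(*) is required with GROUP BY
FROM dbo.SalesDetail
GROUP BY SaleYear, Region
GO

-- The unique clustered index materializes the aggregate; SQL Server then
-- keeps it current on every insert, update, and delete.
CREATE UNIQUE CLUSTERED INDEX IX_vSalesSummary
    ON dbo.vSalesSummary (SaleYear, Region)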


Efficient Way Of Backing Up Data

Jan 24, 2008

We are writing an in-house backup tool. It's written in C#, and the program allows me to select which records we want to archive and remove from the existing database. Functionally, we can achieve what we want to do, but I feel that we did not do it right. Can someone show me the right way to go? Here is what we are doing in the backup process.

1) Create an archive database (with the file name, say, archive_101.mdf).
2) Move the data from the original database to the archive database using the following SQL statement:
INSERT INTO [archive_database].dbo.records SELECT a, b, c, date FROM [original_database].dbo.records WHERE date < CONVERT(DATETIME, '2007-10-10', 120)
3) Delete the moved data from the original database.
4) Of course, steps 2 and 3 are done in a transaction.
5) Then we detach the archive database and store the file "archive_101.mdf" somewhere.

I feel it's wrong because the performance is very bad. If I run just the SELECT statement, it takes 1 minute to dump all the data to the console. If I run the same SELECT statement together with the INSERT INTO, as in step 2, it takes more than 1 hour to write to the other database. I checked tempdb, and its size grows a lot while step 2 runs. I know the data I am selecting is about 500 MB, but still, it should not take that long. Can somebody give me any hints?

Thanks in advance.
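The single INSERT...SELECT puts the whole 500 MB into one transaction, which is usually what drives both the runtime and the tempdb/log growth. A common fix is to move the rows in small keyed batches; here is a hedged sketch that assumes a hypothetical integer key column id on dbo.records - adjust to the real schema.

DECLARE @minId int, @maxId int, @batch int
SET @batch = 10000

SELECT @minId = MIN(id), @maxId = MAX(id)
FROM [original_database].dbo.records
WHERE [date] < CONVERT(DATETIME, '2007-10-10', 120)

WHILE @minId <= @maxId
BEGIN
    BEGIN TRAN

    -- copy one keyed slice to the archive...
    INSERT INTO [archive_database].dbo.records (a, b, c, [date])
    SELECT a, b, c, [date]
    FROM [original_database].dbo.records
    WHERE id >= @minId AND id < @minId + @batch
      AND [date] < CONVERT(DATETIME, '2007-10-10', 120)

    -- ...then remove exactly that slice from the source
    DELETE FROM [original_database].dbo.records
    WHERE id >= @minId AND id < @minId + @batch
      AND [date] < CONVERT(DATETIME, '2007-10-10', 120)

    COMMIT
    SET @minId = @minId + @batch
END

Each batch commits on its own, so the log can truncate between batches and tempdb pressure stays flat.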


Efficient Way To Store Game Data: Help?

Jun 22, 2007

Hello everyone!



So I am trying to wrap my head around the most efficient way to store some game data in SQL Server 2005.

I won't list all of my tables or columns, as there would just be too many, but I will try to be as exact as I can.



Basically, let's say I have 3,000 monsters in my game and over 8,000 items, some of which are monster loot.

Each monster could have 1 item drop, or 200 different item drops, all depending on the monster.



So I am just trying to find the best method of creating a loot table.



So I have the following in mind.



[MonsterTable]

monsterID;

monsterName;

etc...



[ItemTable]

itemID;

itemName;

etc..



[LootTable]

lootTableID;

refMonsterID;

refItemID;



So I'm not worried about how I would get all of the data in there; I'm just wondering if this simple method of using reference IDs between tables would provide the best method for extracting the data when needed.



Any ideas or thoughts are more than welcome; any questions, just ask. Thanks for any guidance or information you can provide on a more efficient approach.
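For reference, a hedged sketch of that junction table as DDL, assuming monsterID and itemID are the primary keys of their tables. The foreign keys keep orphan loot rows out, and the unique constraint doubles as the index that makes "all drops for monster X" a single seek.

CREATE TABLE LootTable (
    lootTableID  int IDENTITY(1, 1) PRIMARY KEY,
    refMonsterID int NOT NULL REFERENCES MonsterTable (monsterID),
    refItemID    int NOT NULL REFERENCES ItemTable (itemID),
    -- one row per monster/item pair
    CONSTRAINT UQ_Loot_MonsterItem UNIQUE (refMonsterID, refItemID)
)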


Power Pivot :: External Access To Data Sets In The Data Catalog?

Apr 23, 2015

I'm currently working on a BI architecture for a customer, and am considering proposing the Power BI data catalog as a data distribution layer. The customer will use Power BI, but also has other BI tools.

Are data sets in the data catalog available to other clients than Power Query alone? E.g. are there OData feed endpoints available? If not, what would be the best way to give other tools access to the data?


How To Split The Data Into Training And Validation Sets When Doing Data Mining?

Jun 15, 2007

Could I ask how to split the data into training and validation sets when doing data mining?



Thanks
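If the split has to happen on the relational side (SSAS 2005 has no built-in holdout; that arrived in 2008), a deterministic hash of the key gives a repeatable 70/30 split. A minimal sketch, assuming a hypothetical source table dbo.Cases with key column CaseID:

-- rows hash into buckets 0-99; below 70 goes to training, the rest to validation
SELECT *
INTO dbo.CasesTraining
FROM dbo.Cases
WHERE ABS(CHECKSUM(CaseID)) % 100 < 70

SELECT *
INTO dbo.CasesValidation
FROM dbo.Cases
WHERE ABS(CHECKSUM(CaseID)) % 100 >= 70

Because CHECKSUM of the key is deterministic, the same rows land in the same set on every run.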


Join Three Data Sets From Different Data Flows Into One Txt File

Mar 9, 2008

Hi, I was wondering how it is possible to join three data sets from different data flows into one txt file.
Let's explain a little more:


I have 3 data flows. Each of them connects to SQL Server and, via a SQL command, brings data into SSIS.

Each SQL command differs from the others, so each data set has different columns (they don't share the same format). The number of columns also differs between them.

What I need is to join the three data sets into one txt file. How can I do this? Is it possible to join them into a txt file when the data set formats differ?

Is this the best way to join different data? Is it better to use as many OLE DB Sources as needed instead of different data flows?
Thanks for your help!


Doing A Data Import Using DTS Wizard In SQL Server 2005 - Being Efficient With 5 Flat Files

Apr 13, 2006

Hi,

I'm a new user of SQL Server 2005. I have the full version installed. I also have SQL Server Business Integration Dev Studio installed. My OS is Windows XP.

I'm importing a series of 5 flat files into a database on one of the SQL Servers we have. My goal is to get 5 different tables (though perhaps I should do one and add an extra field to distinguish each import) into the database for further analysis.

I tried doing an import via the DTS Wizard. There are no column names in the flat file, so I defined them during the import process (all 58 of them). When I got to the end, I had an option to save the import process as an SSIS (SQL Server Integration Services) Package on:

SQL SERVER (I don't have permission for this)

or

FILE SYSTEM (did this one)

I saved the Package locally in hopes of being able to go back in, change the source file and destination table of the package and quickly get the other 4 flat files imported.

My problems are:

1) I couldn't find how to run the *.DTSX Package file in SQL Server Management Studio (basically, reusing the Package with minor changes and saving me from having to redefine the same 58 columns on each flat file import).

2) I tried but didn't understand how to run it in SQL Server Business Intelligence Dev Studio (i.e., understanding the mapping and getting the data types right so it wouldn't error out).

3) I don't know how to make the necessary changes so that the Package handles the next source file and puts it in a new destination table (do I need to do 5 CREATE TABLEs so this Package has a place to run to?).

4) Does the Package need to be part of a Project to run (I haven't found how to take an existing Package and make it part of a Project/Solution)?

5) Is there a good book or online resource for just getting the basics of using SQL Server 2005 and SQL Server Business Intelligence Development Studio?

I'm really at a loss after spending a day fruitlessly scouring the help files and forums and experimenting.

Hope somebody can point me in the right direction.

Regards,

Patrick Briggs,
Pasadena, CA



Doing A Data Import Using DTS Wizard In SQL Server 2005 - Being Efficient With 5 Flat Files

Apr 18, 2006

I just spent some time working out how to do a seemingly simple task. I'm sharing the steps I took to do this in hopes it saves other SQL Server 2005 users (especially newbies like myself) time.

My original question posed on several SQL newsgroups was based on this goal:


I'm importing a series of 5 flat files (all with the same file layout) into a database on one of the SQL Servers we have, using SQL Server 2005 (SQL Server Management Studio). My goal is to get 5 different tables. I want to do this without having to redo all the layout criteria 4 additional times.

Below are the steps I followed to get a solution (all done in Microsoft SQL Server Management Studio):

Create the Package (data import)

1) Use the SQL Server Import Export Wizard (equivalent to the SQL Server 2000 Data Transfer Wizard) to import your first flat file. At the CHOOSE DATA SOURCE window, browse for your file.
2) Under the Advanced tab, you can set your Column attributes ("output column width" or "data type", to name a few). I highlighted all the columns and selected "string [DT_STR]" for the data type. To avoid truncation errors, I selected 255 for the output column width. You can name the columns whose data you are most concerned with (I did import all the available fields).
3) After choosing a server destination, a "SELECT SOURCE TABLES AND VIEWS" window will pop up. Under the "Mapping" column you can choose to tweak your mapping further by editing the SQL (see the Edit SQL button). I didn't.
4) The "SAVE AND EXECUTE PACKAGE" window will pop up. The "Execute Immediately" box should be checked, and you should check "Save SSIS Package" (SQL Server Integration Services). When you do, select "File System" for where to save this import-file-package to.
5) Click OK for the Package Protection Level and the "SAVE SSIS PACKAGE" window will appear. Browse for a path on your local computer to save to.

Modify Package (data import) for Next Use

6) In SQL Server Management Studio, browse for the Package and open it.

Preparation for SQL Task - box

7) You should see a screen that shows two boxes ("Preparation for SQL Task" and "Data Flow Task").
8) Right click on the former and select "Edit".
9) On the "SQL Statement" row, click into the right column and select the "..." box.
10) Change the destination table (the table you will create with this package) to a meaningful name and click OK.
11) Click OK for the "SQL Task Editor".

Data Flow Task - box

12) Right click on the "Data Flow Task" box and select "Edit".
13) Three boxes will appear: "SourceConnectionFlatFile", "Data Conversion 1", and "Destination - <whatever table name your original data import went to>". Below them is a section that displays "Connection Managers".

SourceConnectionFlatFile - editing

14) The first thing you will want to do is change the import source to a new flat file. You do this by going below the boxes under the "Connection Managers" window, right clicking on "SourceConnectionFlatFile", and then selecting "Edit".
15) Browse for the new "File Name" and select it.
16) A "Microsoft SQL Server Management Studio" window will pop up asking you if you want to "keep or reset the existing metadata". The metadata is just your column definitions, and choosing "YES" to keep it makes sense if you are doing data imports on files with the same file layout.
17) Still in the "Flat File Connection Manager Editor" window, change the "Connection Manager Name" to something meaningful (I add <_> at the end and then the name of the table the flat file is going to) and click OK.

SourceConnectionFlatFile - box (editing)

18) Right click on the "SourceConnectionFlatFile" box and select "Edit".
19) Your newly named "Flat File Connection Manager" should appear in the select box.
20) Click OK, right click again on the "SourceConnectionFlatFile" box and select "Show Advanced Editor".
21) Under the "Connections Manager" tab, your newly named "Flat File Connection" should appear (the prior step is necessary for the advanced editor to recognize your change).
22) Under the "Component Properties" tab, on the "Name" row, click into the right column and rename to something meaningful (notice the "Identification String" row description changes too once you click out of the "Name" row).
23) Under the "Column Mappings" tab, just confirm you are mapping your flat file fields ("Available External Columns") to a destination table's fields ("Available Output Columns").
24) Under the "Input and Output Properties" tab you can check in "Flat File Source Output" to make modifications to either your "External Columns" or your "Output Columns" - you shouldn't need to for a simple import.
(NOTE: any changes you make here would likely need to be consistent with the column properties found under the "Connection Manager Window" for the "SourceConnectionFlatFile" as well as the "Data Conversion 1" box under the "Data Flow Tasks" window, so exercise caution.)
25) NOTE: This process has worked for me by making my source columns all "string [DT_STR]" data type and the output columns all "Unicode String [DT_WSTR]" data type.

Data Conversion 1 - box (editing)

26) There is nothing you need to do here. By right clicking on the "Data Conversion 1" box and selecting "Edit", you can see and change the data type of the output columns (the ones in the table you're importing the flat file to). There are probably more edits one can do, but they're beyond what I've learned.

Destination - <whatever table name your original data import went to> - box (editing)

27) Right click on the "Destination - <whatever table name your original data import went to>" box and select "Show Advanced Editor".
28) Select the "Component Properties" tab.
29) Select the right column at the "Name" row and change the name to something meaningful (i.e. related to the source file name or the table name you're importing to).
30) Select the right column at the "Identification String" row and it will update to this change.
31) Select the right column at "OpenRowSet" and change it to the name of the table you are importing your flat file to (this should be consistent with the table name under step 10).
32) Click OK.
33) Select FILE, select "Save As...", and then give your package a new name that's meaningful (this will be helpful if you have to rerun the import of the flat file later).

Run (execute) the Revised Package (data import)

34) Go back to SQL Server Management Studio and open the Object Explorer
35) Connect to an "Integration Services" component. This should essentially be a local instance (not sure where it is on the local computer or in SQL Server Management Studio on the local computer).
36) In "Object Explorer" go down to your "Integration Services" object and expand it.
37) Expand "Stored Packages".
38) Right click on "File System" and select "Import Package", and an "IMPORT PACKAGE" window will appear.
39) For "Package Location" choose "File System" and then browse for the "Package Path".
40) Click into the "Package Name" and it defaults to your Package's file name.
41) Click OK and the Package is imported.
42) Right click on the newly imported Package and select "Run Package".
43) An "Execute Package Utility" window appears.
44) Select "Execute" and the package runs.


Large Data Sets

Mar 20, 2008

Hi,
I'm currently trying to retrieve results from a large dataset; there are over 45,000 records, and I need to use them all to perform counts etc. I have set up views, but my page is still being returned slowly. Is there anything I can do to speed this up?
 Thanks
 Gemma
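One thing that usually helps: push the counting into SQL Server so only the totals travel to the page. A hedged sketch with hypothetical names (dbo.Records with a Category column):

SELECT Category, COUNT(*) AS RecordCount
FROM dbo.Records
GROUP BY Category

With a supporting index on the grouped column, the server returns a handful of totals instead of 45,000 rows for the page to count.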


Query With 2 Sets Of Data

May 7, 2008

I am trying to query one table and get two different time periods of data: I am summing monthly totals to provide a running year total, but I also need last month's total in a separate column. This is what I have so far, but the subquery makes me group it, which produces duplicate groupings.

DECLARE @LASTPD AS INT
SET @LASTPD = (SELECT MAX(LASTPERIOD) FROM TABLE)

SELECT NAME,
    POST_PD AS [MONTH],
    SUM(CHARGE_AMOUNT) AS MONTHLY_$,
    LASTMONTH.LAST_MONTH,
    (SELECT SUM(CHARGE_AMOUNT) AS LAST_MONTH
     FROM TABLE INNER JOIN TABLE2
         ON TABLE2.NAME = TABLE.NAME
     WHERE POST_PD = @LASTPD
       AND TABLE2.NUM = 539
     GROUP BY NAME) AS LASTMONTH
INTO #TEMP_SA
FROM TABLE
INNER JOIN TABLE2
    ON TABLE2.NAME = TABLE.NAME,
    (SELECT SUM(CHARGE_AMOUNT) AS LAST_MONTH
     FROM TABLE
     WHERE TABLE2.NUM = 539
GROUP BY NAME, POST_PD
ORDER BY NAME, POST_PD

SELECT NAME,
    LAST_MONTH,
    CAST(SUM(MONTHLY_$) AS DECIMAL(20,2)) AS YEARLY_$
FROM #TEMP_SA
GROUP BY NAME
ORDER BY NAME
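For what it's worth, the temp table and the subquery can both be avoided with conditional aggregation. A hedged sketch reusing the post's placeholder names (bracketed, since TABLE is a reserved word) and assuming the year total should also be restricted to NUM = 539:

DECLARE @LASTPD AS INT
SELECT @LASTPD = MAX(LASTPERIOD) FROM [TABLE]

SELECT [TABLE].NAME,
       SUM(CASE WHEN POST_PD = @LASTPD
                THEN CHARGE_AMOUNT ELSE 0 END) AS LAST_MONTH,   -- last period only
       CAST(SUM(CHARGE_AMOUNT) AS DECIMAL(20, 2)) AS [YEARLY_$] -- all periods
FROM [TABLE]
INNER JOIN TABLE2
    ON TABLE2.NAME = [TABLE].NAME
WHERE TABLE2.NUM = 539
GROUP BY [TABLE].NAME
ORDER BY [TABLE].NAME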


Matching Two Sets Of Data

Mar 12, 2008

Hi All,

I would like to match two sets of data. I have set up a view that contains a group of customers and their details. I want to view this data, but also find these customers in another table by matching their surname and date of birth, then retrieve the information stored on these customers from the second table.

Does anyone have any suggestions on how I would go about doing this?

Thanks in advance
Humate

quote:Originally posted by Michael Valentine Jones

It takes real skill to produce something good out of a giant mess.
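A minimal sketch of the surname-plus-date-of-birth join, with assumed names (dbo.vCustomerGroup for the existing view, dbo.CustomerDetails for the second table) - adjust to the real schema:

SELECT v.*,
       d.*   -- the details held in the second table
FROM dbo.vCustomerGroup AS v
INNER JOIN dbo.CustomerDetails AS d
    ON  d.Surname     = v.Surname
    AND d.DateOfBirth = v.DateOfBirth

Be aware that surname plus date of birth is not guaranteed unique, so the join can fan out where two customers share both.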


Comparing Two Sets Of Data

Jul 23, 2005

I have the following situation. One set of data has 274 rows (set2) and another has 264 (set1). Both data sets are similar in structure, and the values in both were extracted from the same parent table. I hope this info can substitute for DDL. I need to find the "gap" rows between these two sets. I attempted to run a query like

select count(*)
from set2
where not exists (select * from set1)

but it did not yield what I desired. What else should I try? TIA.
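The subquery above is uncorrelated, so NOT EXISTS is false whenever set1 contains any rows at all, which is why the count came back empty. A correlated version, assuming a shared key column id:

SELECT *
FROM set2
WHERE NOT EXISTS (SELECT 1
                  FROM set1
                  WHERE set1.id = set2.id)

On SQL Server 2005 or later, SELECT * FROM set2 EXCEPT SELECT * FROM set1 gives the gap rows in one statement.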


Two Data Sets In A Matrix

Mar 13, 2007

Hi

I have a matrix whose column group is filled from Dataset1; now I want to add another column group, but using Dataset2.

Can I use two different datasets for a matrix, one for each column group?

Please help me in this regard.

thanks


T-SQL (SS2K8) :: Matching Two Data Sets

Apr 28, 2015

I have two tables - one with sales and another with payments against those sales. A payment may not match the exact amount of a sale, and I have to use the FIFO method to apply payments. The payment month must be >= the sales month.

How can i write a query to do this? Examples are as below.

Table 1
SaleNo  SaleDate  SaleAmt
1       Jun-14    1200
2       Oct-14    2400
3       Dec-14    600
4       Feb-15    12000

Table 2
PayMonth  PayYear  PayAmount
5         2014     300
6         2014     1000
10        2014     500
11        2014     2000
12        2014     300
1         2015     900

create table tbl1
(
saleNo int
,saleDate date
,saleAmt float
)
insert into tbl1 (saleNo, saleDate, saleAmt)

[Code] ....
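A hedged sketch of the FIFO allocation by comparing running totals; SQL Server 2008 has no windowed running SUM, so a correlated subquery computes the cumulative sales figure. It assumes tbl2(payMonth, payYear, payAmount) for the payments table and, for brevity, ignores the pay-month >= sale-month rule:

WITH s AS (
    SELECT saleNo, saleDate, saleAmt,
           (SELECT SUM(saleAmt) FROM tbl1 AS b
            WHERE b.saleNo <= a.saleNo) AS cumSaleAmt   -- running total of sales
    FROM tbl1 AS a
),
p AS (
    SELECT SUM(payAmount) AS totalPaid FROM tbl2        -- total cash received
)
SELECT s.saleNo,
       s.saleAmt,
       CASE
           WHEN p.totalPaid >= s.cumSaleAmt                -- fully covered
                THEN s.saleAmt
           WHEN p.totalPaid > s.cumSaleAmt - s.saleAmt     -- partially covered
                THEN p.totalPaid - (s.cumSaleAmt - s.saleAmt)
           ELSE 0                                          -- nothing left
       END AS appliedAmt
FROM s CROSS JOIN p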


Train And Test Data Sets

Feb 8, 2007

I've seen that sometimes it is better to split the table into a test dataset and a training dataset, and I'd appreciate it if anyone can explain why this is...

thanks

Santiago Aceñolaza
Argentina


Multiple Data Sets In One List

Mar 24, 2008

Is there a way to put more than one data set in a list?
I have a report that has three data sets with three tables. Now I want to show each report by Region, per page, so you can view the same information for each region separately instead of all together. Is there a way to do this where I don't have to go back into my code and find a way to link everything together so it's in one data set?


Dynamic Data Sets In Reports

Sep 3, 2007



Hi,

I'm using a matrix report in which I want to use two datasets in the same report. How can I make the dataset dynamic for a single report?

Regards


Calculated Field Using Two Data Sets

May 12, 2008

Hi,

I'm trying to create a report.
The final report looks like this:







Total Loans/Lines (#)      13,283
Total Commitments ($ MM)   $1,703
Total Outstandings ($ MM)  $1,175

              A    B    C    D        F
Bankruptcy    0    $0   $0   0.00%    0.00%
Charge Off    0    $0   $0   0.00%    0.00%


The source table looks like this:

Bankruptcy                  0
Charge Off                  0
CLTV                        131
DSR                         102
Exc Total                   265
FICO                        7
Foreclosure/Repossession
Grand Total                 13283
Loan Amount                 32


Column D = A's Bankruptcy (0) / Total Loans/Lines # (13,283)

But it does not let me use a report expression, as the fields are not in the same scope.

Can anyone tell me how to do this calculation? I was trying to use a report expression, but it does not seem to work.

Thanks


Problems With Data Sets In SQL Server 2005

May 25, 2006

Hello,
I am using existing code, which I am trying to convert from using MS Access to SQL Server 2005...
The data set works fine with MS Access database, however when executing with SQL Server 2005 as data source, it generates the following error:
"..The data types ntext and nvarchar are incompatible in the equal to operator..."
in this line:
count = adapter.Update(dataset);
Not sure what I should look for, since data sets are new to me... Where should I check to fix this problem? I have noticed that the table has two columns with nvarchar...
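That error typically comes from comparing an ntext column (a common mapping for Access memo fields) with an nvarchar parameter; ntext has no equality operator. If the schema can change, converting the legacy type removes the error - the table and column names here are assumptions:

ALTER TABLE dbo.MyTable ALTER COLUMN Notes nvarchar(max)

If the schema is fixed, casting inside the query (for example, WHERE CAST(Notes AS nvarchar(4000)) = @value) is the usual workaround.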


Calculating Admits Per 1000 From Two Data Sets

Aug 27, 2013

I have two queries that generate two different datasets: one is a count of members, and the other is a count of admits. I need to generate a calculated field from the two data sets called admits per 1000, which is essentially (count of admits / count of members) * 12000. I was able to calculate admits per 1000 easily in Excel; however, I need some insight on how to do it in SQL.

Below are my queries from the two datasets.

MemberMonths dataset:
Select
factMembership.BusinessUnitCode,
EffectiveCCYYMM,
ISNULL(count(Distinct MemberId),0) As MemberCount
From factMembership

[Code] ....

Admits dataset:

SELECT
Factadmissions.BusinessUnitCode,
factAdmissions.AdmitCCYYMM,
ISNULL(Count(AdmitNum),0)As [Count of Admits]
FROM factAdmissions

[Code] ...
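A hedged sketch that joins the two aggregates in one statement; the column names follow the post, and the join keys (business unit plus month) are an assumption:

WITH m AS (
    SELECT BusinessUnitCode,
           EffectiveCCYYMM AS CCYYMM,
           COUNT(DISTINCT MemberId) AS MemberCount
    FROM factMembership
    GROUP BY BusinessUnitCode, EffectiveCCYYMM
),
a AS (
    SELECT BusinessUnitCode,
           AdmitCCYYMM AS CCYYMM,
           COUNT(AdmitNum) AS AdmitCount
    FROM factAdmissions
    GROUP BY BusinessUnitCode, AdmitCCYYMM
)
SELECT m.BusinessUnitCode,
       m.CCYYMM,
       ISNULL(a.AdmitCount, 0) * 12000.0
           / NULLIF(m.MemberCount, 0) AS AdmitsPer1000   -- admits/members * 12000
FROM m
LEFT JOIN a
    ON  a.BusinessUnitCode = m.BusinessUnitCode
    AND a.CCYYMM = m.CCYYMM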


Analysis :: Partitioning Sets Of Data Dynamically

Jul 17, 2015

I have a situation where I have a transactional fact table which consists of date, row type, order number, and value. A simple example is below.

Date, RowType, OrderNo, Value
01-May, New, A1, 100
01-Jun, Change, A1, -10
01-Jul, Invoiced, A1, -90

What I need to be able to do is somehow select, based on a day, the total value of open orders. I have tried to do this in the database, but it becomes fixed and quite cumbersome (this is a simplified example; in reality I have line information and line component information). I am not hugely skilled with MDX and SSAS, but I know there are some semi-additive functions; I want somebody to be able to pick a day and get the total value of only the open orders.
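Before reaching for semi-additive MDX, note that with this sign convention a relational answer is just a dated sum: New adds value, while Change and Invoiced rows subtract it, so everything posted on or before the chosen day nets to the open amount. A minimal sketch with an assumed fact table name:

DECLARE @AsOf date
SET @AsOf = '2014-06-15'   -- the day being inspected

SELECT SUM(Value) AS OpenOrderValue
FROM dbo.FactOrderTransactions
WHERE [Date] <= @AsOf

In the worked example this returns 100 for a day in May, 90 for a day in June, and 0 from July onward.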


SQL Tools :: System Data Collection Sets

Jun 19, 2015

I created a Data Collection in the wrong DB; how can I change the DB or return to the default (as it came with a clean version of SQL Server)?


Making A Chart Run With Multiple Data Sets.

Jul 26, 2007

Hello and thanks in advance.

I was wondering if anyone has ever written a chart with multiple datasets.

I need to be able to show sales dollars inflow by order date on one line, and on the other, sales dollars delivered by delivery date. So all sections in the chart (Values, Category Groups, and Series Groups) will be from 2 different datasets.

I have tried, but it will not allow aggregates in the series groups.

Any Ideas would be greatly appreciated.

Thanks, Leo


Copying Large Sets Of Data In Same Database

Mar 9, 2007

I need to copy data from TableA to TableB (>5 million rows). The two are in the same database.

What is the best way of doing this?

I was thinking about using a simple INSERT INTO ... SELECT statement. Is there a faster way to do it with SSIS?

Thanks
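If TableB does not exist yet, SELECT...INTO is usually the fastest pure T-SQL route, because it is minimally logged when the database is not in FULL recovery; the names follow the post.

SELECT *
INTO dbo.TableB
FROM dbo.TableA

If TableB already exists, an INSERT INTO ... SELECT done in keyed batches keeps the transaction log manageable; SSIS mainly pays off when the copy crosses servers or needs transformations.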


Adding A Calculated Field By Using Two Fields From Different Data Sets

Oct 1, 2007



Can I make a calculated field by using two fields from different data sets? (I'm talking about SSRS data sets.)

I tried to do that, but I got an error message.




"Report item expressions can only refer to other report items within the same grouping scope or a containing grouping scope."


Can someone please help me out?


Data Binding To A Stored Procedure That Returns Two Result Sets

Mar 6, 2007

Hi there everyone. I have a stored procedure called "PagingTable" that I use for performing searches and for specifying how many results to show per 'page' and which page I want to see. This allows me to do my paging on the server side (the database tier), so only the results that actually get shown on the webpage fly across from my database server to my web server. The code might look something like this:
 
strSQL = "EXECUTE PagingTable " & _
"@ItemsPerPage = 10, " & _
"@CurrentPage = " & CStr(intCurrentPage) & ", " & _
"@TableName = 'Products', " & _
"@UniqueColumn = 'ItemNumber', " & _
"@Columns = 'ItemNumber, Description, ListPrice, QtyOnHand', " & _
"@WhereClause = '" & strSQLWhere & "'"
 
The problem is that the stored procedure actually returns two result sets. The first result set contains information regarding the total number of results found, the number of pages, and the current page. The second result set contains the data to be shown (the columns specified). In 'classic' ASP I did this like this:
 
'Open the recordset
rsItems.Open strSQL, conn, 0, 1
 
'Get the values required for drawing the paging table
intCurrentPage = rsItems.Fields("CurrentPage").Value
intTotalPages = rsItems.Fields("TotalPages").Value
intTotalRows = rsItems.Fields("TotalRows").Value
 
'Advance to the next recordset
Set rsItems = rsItems.NextRecordset
 
I am trying to do this now in ASP.NET 2.0 using the datasource control and the repeater control.  Any idea how I can accomplish two things:
 
A) Bind the repeater control to the second resultset
B) Build a "pager" of some sort using the values from the first resultset


SQL Server 2012 :: DB2 Store Procedure Returning Two Data Sets

Oct 13, 2014

A DB2 stored procedure returns two data sets when executed from SSMS using a linked server. Do we have any simple way to save the two data sets into two different tables?
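One server-side option, hedged because it needs RPC Out enabled on the linked server, is pass-through execution feeding an INSERT. The catch is that INSERT ... EXEC appends every result set the call returns into the single target table, so two differently shaped result sets cannot be separated this way - the procedure would have to be split, or the sets separated in SSIS or client code. A sketch with assumed names:

-- works only if every returned result set matches the target's column layout
INSERT INTO dbo.ResultTable (col1, col2)
EXEC ('CALL MYSCHEMA.MYPROC()') AT DB2LINKEDSRV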


SSAS Crashes - Mining Predictions For Large Data Sets

Sep 7, 2006

Hi all,

I am using SSAS 2005. The mining model works fine, but it crashes when I run 'Mining Model Predictions' against large data sets.

I ran it against 5,000,000 records and it went fine.

But exactly the same model failed for 5,100,000 records and beyond.

The message is 'Query Execution Failed' and then Visual Studio crashes.

Please let me know if anybody has had the same experience or knows the solution.

Thanks,

Vikas


BULK COPY - In Memory Data

Jul 11, 2007

Hi,

I have a set of records in application memory separated by a record terminator ''. I can write the memory stream to a local disk file and call bcp API functions to load the file into SQL Server. But how do I transfer the in-memory data directly to SQL Server, without writing to a data file, using ODBC? I am not using any .NET Framework classes in my code. The SQL Server and the application server (which generates the data records) are on two different physical servers connected through the network. I am trying to figure out the fastest and most efficient way to load the data to SQL Server from a remote application server. Thanks for your help.

Srini


SQL In-Memory :: Table Memory Optimization Advisor Validation Passed But Cannot Migrate

Jul 13, 2015

I am looking to test this feature - and the "Transaction Performance Collector" has recommended a table to port to In-Memory OLTP.

I have now tried the "Table Memory Optimization Advisor" tool.

After a couple of tweaks to the table design, the tool is now passing validation, but it is not allowing me to progress to the next step:

Could it be down to not having enough memory? But would this not show in the advisor?


Most Efficient DataSource?

Aug 3, 2007

Hi there,
I'm using a Repeater at the moment which is bound to a SqlDataSource. I expect a lot of load on that website; should I choose another DataSource? Which other DataSource is better when it comes to performance?
I read some stuff about the SqlDataAdapter and a DataSet... is that better in performance? Why is it better?
What about LINQ?
Thanks a lot for any clarification.


More Efficient Code

Dec 13, 2007

Hi all, I have the code listed below and feel that it could run much more efficiently. I run this same code for Attrib2, Attrib3, Description, etc., for a total of 21, so on each postback I am running a total of 21 different connections; I have listed only 3 of them here for the general idea. I also run this same code for update and for insert, so that's 21 times for each of them as well. In fact, if someone is adding a customer, after they hit the new customer button it first runs 21 inserts of blanks for each field, then runs 21 updates on the same records for anything they put in the fields. This is running too slow... any ideas on how I can combine these? We have 21 different entries for EVERY customer. The Pf_property does not change; it is 21 set entries. The only one that changes is the Pf_Value.
Try
    Dim queryString As String = "select Pf_Value from CustomerPOFlexField where [Pf_property] = 'Attrib1' and [Pf_CustomerNo] = @CustomerNo"
    Dim connection As New SqlClient.SqlConnection("connectionstring")
    Dim command As SqlClient.SqlCommand = New SqlClient.SqlCommand(queryString, connection)
    command.Parameters.AddWithValue("@CustomerNo", DropDownlist1.SelectedValue)
    Dim reader As SqlClient.SqlDataReader
    command.Connection.Open()
    reader = command.ExecuteReader
    reader.Read()
    TextBox2.Text = Convert.ToString(reader("Pf_Value"))
    command.Connection.Close()
Catch ex As SystemException
    Response.Write(ex.ToString)
End Try

Try
    Dim queryString As String = "select Pf_Value from CustomerPOFlexField where [Pf_property] = 'Attrib1Regex' and [Pf_CustomerNo] = @CustomerNo"
    Dim connection As New SqlClient.SqlConnection("connectionstring")
    Dim command As SqlClient.SqlCommand = New SqlClient.SqlCommand(queryString, connection)
    command.Parameters.AddWithValue("@CustomerNo", DropDownlist1.SelectedValue)
    Dim reader As SqlClient.SqlDataReader
    command.Connection.Open()
    reader = command.ExecuteReader
    reader.Read()
    TextBox5.Text = Convert.ToString(reader("Pf_Value"))
    command.Connection.Close()
Catch ex As SystemException
    Response.Write(ex.ToString)
End Try

Try
    Dim queryString As String = "select Pf_Value from CustomerPOFlexField where [Pf_property] = 'Attrib1ValMessage' and [Pf_CustomerNo] = @CustomerNo"
    Dim connection As New SqlClient.SqlConnection("connectionstring")
    Dim command As SqlClient.SqlCommand = New SqlClient.SqlCommand(queryString, connection)
    command.Parameters.AddWithValue("@CustomerNo", DropDownlist1.SelectedValue)
    Dim reader As SqlClient.SqlDataReader
    command.Connection.Open()
    reader = command.ExecuteReader
    reader.Read()
    TextBox6.Text = Convert.ToString(reader("Pf_Value"))
    command.Connection.Close()
Catch ex As SystemException
    Response.Write(ex.ToString)
End Try
 Thanks,
Randy
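One round trip can replace all 21: fetch every property/value pair for the customer at once, then fill each textbox from the single result set (one connection, one reader loop keyed on Pf_property). The query below reuses the post's table and column names; the IN list is abbreviated.

SELECT [Pf_property], Pf_Value
FROM CustomerPOFlexField
WHERE [Pf_CustomerNo] = @CustomerNo
  AND [Pf_property] IN ('Attrib1', 'Attrib1Regex', 'Attrib1ValMessage' /* ...the other 18 */)

The insert and update sides can be batched the same way, sending one multi-row statement per customer instead of 21 separate commands.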







