Best Practice For This SP Scenario !

Oct 10, 2005

Hello all,
this is the scenario I'm dealing with:
-- I'm a beginner, so please bear with the way I'm putting it... sorry!

* I have a database with these tables:
- Company: CompanyID, CompanyName
- Person: PersonID, PersonName, CompanyID (FK)
- Supplier: SupplierID, SupplierCode, SupplierName, CompanyID (FK)

In the associated stored procedures (insertCompany, insertPerson, insertSupplier), I want to check whether the supplier already exists; the SupplierID should be the output parameter.

There are different ways to do it, for example:
1) In the supplier stored procedure I can read the ID (SELECT) and:

if it exists, save the existing SupplierID to return it at the end;
if it doesn't, insert the Company, the Person, and the Supplier, and save the new SupplierID to return it at the end.
------------------------------------
2) Another way is to use multiple stored procedures:
. one SP that checks,
. another SP that does the inserts,
. and a main SP that calls the check SP, gets the values, and branches on the result (if/else).

3) It could perhaps also be done using functions in SQL Server.

There must be reasons to prefer one of these methods over the others.
I want to know the best practice for this scenario in terms of performance and other issues -- consider a similar but much larger scenario. (A sketch of option 1 is below.)

I'd appreciate your help.
Thanks in advance!
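A minimal sketch of option 1, assuming the existence check is done on SupplierCode and that the ID columns are IDENTITY columns (both assumptions, since the post doesn't say):

CREATE PROCEDURE dbo.insertSupplier
    @SupplierCode varchar(20),
    @SupplierName varchar(100),
    @CompanyName  varchar(100),
    @PersonName   varchar(100),
    @SupplierID   int OUTPUT
AS
BEGIN
    SET NOCOUNT ON;

    -- does the supplier already exist?
    SELECT @SupplierID = SupplierID
    FROM Supplier
    WHERE SupplierCode = @SupplierCode;

    IF @SupplierID IS NULL
    BEGIN
        BEGIN TRANSACTION;

        DECLARE @CompanyID int;

        INSERT INTO Company (CompanyName) VALUES (@CompanyName);
        SET @CompanyID = SCOPE_IDENTITY();

        INSERT INTO Person (PersonName, CompanyID) VALUES (@PersonName, @CompanyID);

        INSERT INTO Supplier (SupplierCode, SupplierName, CompanyID)
        VALUES (@SupplierCode, @SupplierName, @CompanyID);
        SET @SupplierID = SCOPE_IDENTITY();

        COMMIT TRANSACTION;
    END
END

Keeping it in one procedure avoids a race between a separate "check" SP and "insert" SP, and the caller always gets a SupplierID back either way.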


Best Practice For Database Design Scenario

Sep 26, 2007

Hi,

I'm currently building a database that is going to have at least 6 different types of users accessing it via a web application.

Different information will be collected about each type of user, though a few of the user types will have the same or similar information collected about them.

I will be using role based authentication.

Now my query is this;

Scenario 1:
Would it be a good idea to keep all the users' common information in one table, i.e. their username, email, password, and name, and then create a tblDetails table for each type of user containing the type-specific data? That way I can check a single table to verify their login credentials.

OR
Scenario 2:
Would it be best to create a separate table for each type of user and then log them in based on the credentials stored in each user type's respective table?

Hope this is clear. I'm leaning towards scenario 1, although I've used scenario 2 before. Just wondering which would be preferred. (A sketch of scenario 1 is below.)
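A minimal sketch of scenario 1 (column names and sizes are assumptions, not from the post): one shared login table plus one detail table per user type, keyed by the same UserID:

CREATE TABLE dbo.tblUser (
    UserID   int IDENTITY(1,1) PRIMARY KEY,
    Username nvarchar(50)  NOT NULL UNIQUE,
    Email    nvarchar(100) NOT NULL,
    Password nvarchar(128) NOT NULL,  -- store a hash, not the plain password
    Name     nvarchar(100) NOT NULL,
    RoleID   int NOT NULL             -- which of the 6 user types this row is
);

CREATE TABLE dbo.tblCustomerDetails (
    UserID          int NOT NULL PRIMARY KEY REFERENCES dbo.tblUser (UserID),
    DeliveryAddress nvarchar(200) NULL  -- columns specific to this user type
);

Login checks then only ever touch tblUser, and each detail table holds just the columns its user type actually needs.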

Cheers
RobC


Service Broker In Data Push Scenario: Best Practice

May 7, 2007

Hello, I need some hints because I would like to set up my Service Broker solution in the proper way.

First: I'm going to set up Service Broker between two databases on the same SQL Server instance.

My goal: I insert data into table1 in DB1. On table1 there is a trigger that begins a conversation and sends the message to the service on DB2.
On the receiving queue there is an activation procedure that takes the XML message, shreds it, and saves the content into table2 on DB2.
This activation SP is actually my main concern, because the shredding and insert also involve several checks, so it could take "a while".

Volume of data: there seem to be 100-200 daily inserts into table1 on DB1, and it is possible that some of them arrive at the "same time".

Very quickly, on DB1 the trigger
(note that I always reuse the same conversation, and the send queue has RETENTION = OFF):


begin transaction;

declare @dialog_handle uniqueidentifier;

-- reuse the existing conversation to the target service if one is already open
set @dialog_handle = (select conversation_handle
                      from sys.conversation_endpoints
                      where far_service = 'ReceiveService');

if @dialog_handle is null
begin
    begin dialog conversation @dialog_handle
        from service [SendService]
        to service 'ReceiveService'
        on contract [MainContract]
        with encryption = off;
end

send on conversation @dialog_handle
    message type [Message] ('uyiuy');

commit;


On DB2, ReceiveQueue is defined as:

CREATE QUEUE [dbo].[ReceiveQueue]
    WITH STATUS = ON,
         RETENTION = OFF,
         ACTIVATION (
             STATUS = ON,
             PROCEDURE_NAME = [dbo].[OnReceivedTrade],
             MAX_QUEUE_READERS = 100,
             EXECUTE AS N'dbo'
         )
    ON [PRIMARY];



Most important, though, is the activation SP; this is the main code:

declare @message_type int,
        @message_body varbinary(max),
        @dialog       uniqueidentifier,
        @rc           int,
        @ErrorSave    int,
        @ErrorDesc    nvarchar(100);

while (1 = 1)
begin
    begin transaction;

    -- Receive the next available message from the queue
    waitfor (
        receive top (1)
            @message_type = message_type_id,
            @message_body = message_body,
            @dialog       = conversation_handle
        from ReceiveQueue
    ), timeout 1000;

    -- capture both values before any other statement resets them
    select @ErrorSave = @@error, @rc = @@rowcount;

    if (@rc = 0)
    begin
        rollback transaction;
        break;
    end

    if (@ErrorSave <> 0)
    begin
        rollback transaction;

        set @ErrorDesc = N'An error has occurred.';

        end conversation @dialog
            with error = @ErrorSave description = @ErrorDesc;

        insert into [dbo].[tblLog] values (@ErrorDesc, NULL);
    end
    else
    begin
        -- this is the SP that does most of the work
        exec [dbo].[sp_ShredXMLMessageToRelationalData] @message_body;

        commit transaction;
    end
end



I have to say that everything works fine, but I don't think it is completely performant.
I've read several blogs/forums and now I'm confused about the best way to implement Service Broker messaging.
These are my questions:

1) I read that if the DBs are on the same instance, performance can be improved by sending the message directly to Queue2. Could that be useful in my case? How would I implement it?
2) In my case, is it necessary to send a message back to the initiator?
3) In my case the conversation never ends. I don't think that is correct, but in a data-push scenario who should END the conversation: the target, the initiator, or nobody, to avoid overhead (I read that BEGIN and END DIALOG cause significant overhead)? (See the sketch after these questions.)

4) Where (initiator or target) and how should I handle Service Broker errors? My applications are SPs, and I need to inform the developer or DBA that something went wrong during processing (in the conversation or in the shredding stored procedure).

5) In my case it would probably also be a good idea to think about how to handle "poison messages".
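Regarding question 3, a minimal sketch of the usual pattern: the target ends the conversation after processing, and the initiator drains the EndDialog/Error messages from its own queue. The initiator queue name (SendQueue) and the timeout are assumptions:

-- On the target, inside the activation SP after the payload has been processed:
END CONVERSATION @dialog;

-- On the initiator, a small activated procedure on the initiator's queue
-- closes its side when the EndDialog (or Error) message arrives:
DECLARE @h uniqueidentifier, @mt sysname;

WAITFOR (
    RECEIVE TOP (1)
        @h  = conversation_handle,
        @mt = message_type_name
    FROM dbo.SendQueue
), TIMEOUT 1000;

IF @mt IN (N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog',
           N'http://schemas.microsoft.com/SQL/ServiceBroker/Error')
    END CONVERSATION @h;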



I'm sure I will come up with other questions...
Any hint or link is appreciated!

Thank you very much!

Marina B.


Scenario

Apr 23, 2004

Consider the following scenario:

I've got 2 web servers (Windows 2003) using NLB.
I want some kind of common data store (SQL Server).
I need this data store to be fault tolerant.
I don't have any more machines.
I don't have an Enterprise licence for Windows.

How do I solve this?


Is This Scenario Possible?

May 4, 2007

I have a company with 5 offices connected with a dedicated line (500+ employees).

They have five databases, one in each office.

Now they want a new application which will replace the old (VB6 on MySQL will now be VB.NET 2.0 with MS SQL Server 2005).

They want all the data available in all the offices. We will still have 5 databases in 5 offices, but all with the same data.

Sometimes the dedicated line is lost.

Now I have a few questions:

Will the database still be running while the line is down?
Will the database still be running if the replication is of type merge (instead of continuous)?
Is it good practice to use unique identifiers rather than numeric keys? (See the sketch after these questions.)
Is it good practice for the application to use the "master" database to save global things like "Suppliers" and the local database to add and change Orders?
Do you have some recommended reads?
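On the key question: merge replication adds a uniqueidentifier ROWGUIDCOL to every published table anyway, so using a GUID as the key avoids managing identity ranges per office. A minimal sketch (the table and columns are hypothetical):

CREATE TABLE dbo.Orders (
    OrderID   uniqueidentifier ROWGUIDCOL NOT NULL
        CONSTRAINT DF_Orders_OrderID DEFAULT NEWSEQUENTIALID()
        CONSTRAINT PK_Orders PRIMARY KEY,
    OrderDate datetime NOT NULL,
    OfficeID  int NOT NULL
);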

Thx for your time!




Henri
~~~~
There's no place like 127.0.0.1


DR Scenario

Aug 13, 2007

For High Protection mode (2 servers, no witness, FULL safety), in the event of a catastrophic failure of the principal, can a remote (but well connected) mirror be forced to assume the principal role? The MS manual "Database Mirroring in SQL Server 2005" on p. 2 says the failover type for HP mode is manual only (as opposed to forced). Elsewhere it seems to suggest otherwise, but there is no clear and unambiguous discussion describing the exact procedure.
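For reference, the forced-service command itself is this (the database name is a placeholder), run on the mirror once the principal is unreachable:

-- accepts possible data loss; only valid while the principal is unavailable
ALTER DATABASE ProductionDb SET PARTNER FORCE_SERVICE_ALLOW_DATA_LOSS;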

Can anyone clarify this matter? Thanks,


Bud


SSB Scenario, Can It Be Done?

Oct 29, 2006

I currently have a workflow-like application which uses MSMQ to pass messages between each step transactionally. I'd like to take advantage of something like SSB to make the app scale. Here's how it works today:

Start Workflow WebService puts message on queue.
Windows Service has threaded listeners on each queue for the workflow.
The listeners do a transactional Receive off the queue. By doing this in a transaction, we can enforce that only one listener will pick up and process the message. Once they have a message, they call a .NET assembly to perform some work. Once the work is performed successfully, the listeners put the work context on the next queue using the same transaction as the receive and commit the transaction, effectively removing the message from the receive queue permanently. If a fatal error occurred during processing, the work context is moved to an error queue transactionally. If the process blew up for other reasons, the transaction is rolled back and the message reappears on the original queue so that another instance of the service can pick it up. Currently all this runs in one instance of the Windows service and cannot span more than one host machine, since MSMQ did not support transactional reads off public queues without using DTC.

I saw the external activator sample and it looks great, but I have a few questions. First, I'd have to modify it to call assemblies instead of external processes. No biggie there. Another requirement is that one and only one receiver should process the message, and that when the next message is sent to the next queue, it's done within the same transaction as the receive. I hope this can be done without any deadlocks. I also saw mentioned somewhere that this messaging could take advantage of the local SQL Express instances, so that even if the main SQL Server instance is down, the messages can still go out. The last requirement is that our message is a blob (a serialized .NET object), so how do we define that kind of message? (See the sketch below.)
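On the last question, a minimal sketch: a message type created with VALIDATION = NONE carries an arbitrary binary payload, so the serialized .NET object can be sent as varbinary(max). The names below are assumptions:

CREATE MESSAGE TYPE [//MyApp/WorkContext]
    VALIDATION = NONE;   -- no XML validation, the payload is opaque bytes

-- @payload holds the serialized .NET object passed in from the application,
-- and @handle is an already-open conversation handle
SEND ON CONVERSATION @handle
    MESSAGE TYPE [//MyApp/WorkContext] (@payload);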

Thanks

Costas


Newbie Scenario

Dec 9, 2005

Table1: WriterID, WriterName, WriterSurname
Table2: BookID, WriterID, BookName, Category
Table3: CategoryID, CategoryName
There will be one form. Which way should I use to enter data from one form into multiple tables with the same WriterID?
1.) Should I use the SqlCommand class and a stored procedure?
( Like this: http://www.aspnet101.com/aspnet101/tutorials.aspx?id=13 )
2.) Is there another way without a stored procedure? I can't imagine how to insert the same form into multiple tables. In the real scenario there are lots of tables, and they all contain a column that holds the same value, such as WriterID. (A sketch of the stored-procedure approach is below.)
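A minimal sketch of option 1, assuming WriterID is an IDENTITY column and that Table2.Category stores the category id (both assumptions):

CREATE PROCEDURE dbo.InsertWriterAndBook
    @WriterName    nvarchar(100),
    @WriterSurname nvarchar(100),
    @BookName      nvarchar(200),
    @CategoryID    int
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;

    INSERT INTO Table1 (WriterName, WriterSurname)
    VALUES (@WriterName, @WriterSurname);

    -- the new WriterID generated by the first insert
    DECLARE @WriterID int;
    SET @WriterID = SCOPE_IDENTITY();

    INSERT INTO Table2 (WriterID, BookName, Category)
    VALUES (@WriterID, @BookName, @CategoryID);

    COMMIT TRANSACTION;
END

From ASP.NET this is a single SqlCommand call with CommandType.StoredProcedure; without a stored procedure you would have to run the same statements inside one ADO.NET transaction yourself.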


My Synchronization Scenario

Sep 22, 2005

Hi, I want to know a solution for my synchronization scenario.

I have several client databases which are SQL Server 2005 Express, and I have a master database which is SQL Server 2000 containing copies of all the individual client databases. All the individual client databases are kept separately at the master location. I need to synchronize each client database with its copy in the master database (something like merge replication). Both the client copy and the master copy could be publishers and subscribers.

Now the problem: because of security and firewall issues, only the client should have the ability to schedule and initiate the synchronization process with the master copy. Unfortunately, SQL Server 2005 Express has only a subscriber agent and not a publisher agent.

Any help on how to achieve this would be appreciated . Thank You


CASE Scenario

May 26, 2008

How can I use "NewReceivedTime" like a variable?
Defining a new column with a CASE expression means the alias cannot be referenced elsewhere in the same query; I just get an error such as "Invalid column name: NewReceivedTime".

(case
when <value>
else <value>
end) as NewReceivedTime

I'm asking because I want to compare the "NewReceivedTime" I've created with another time column, something like
NewReceivedTime = LogDateTime. (See the sketch below.)
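A minimal sketch of one workaround: define the CASE expression in a derived table (or a CTE on SQL 2005), and the outer query can then refer to the alias by name. The table and columns here are hypothetical:

SELECT t.LogDateTime, t.NewReceivedTime
FROM (
    SELECT LogDateTime,
           CASE WHEN ReceivedTime IS NULL THEN LogDateTime
                ELSE ReceivedTime
           END AS NewReceivedTime
    FROM dbo.MessageLog
) AS t
WHERE t.NewReceivedTime = t.LogDateTime;   -- the alias is now a real column name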

Thanks.


How To Implement This Scenario

May 11, 2006

Hi,

I have a text file in this format:
currency,exchangerate(INR),date
dollar,45,20/04/2006
dollar,46,22/04/2006
britishpound,65,20/04/06
dirham,12,20/04/06 ... etc.

Now, using this as the source, two tables should be created and filled with the appropriate data:
CurrencyMaster: CurrencyID (PK), CurrencyName
CurrencyDailyRate: ID, CurrencyID (FK), Rate, Date

How can I do it using SSIS? (A T-SQL sketch of the master/detail split is below.)
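One possible shape is a Flat File Source loading a staging table, followed by an Execute SQL Task that does the split. A sketch of that split, assuming a staging table dbo.stg_Rates(Currency, ExchangeRate, RateDate) and an IDENTITY CurrencyID (both assumptions):

-- add any currencies not yet in the master table
INSERT INTO dbo.CurrencyMaster (CurrencyName)
SELECT DISTINCT s.Currency
FROM dbo.stg_Rates s
WHERE NOT EXISTS (SELECT 1 FROM dbo.CurrencyMaster m
                  WHERE m.CurrencyName = s.Currency);

-- then load the daily rates against the master keys
INSERT INTO dbo.CurrencyDailyRate (CurrencyID, Rate, RateDate)
SELECT m.CurrencyID, s.ExchangeRate, s.RateDate
FROM dbo.stg_Rates s
JOIN dbo.CurrencyMaster m ON m.CurrencyName = s.Currency;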
thks


SQL Express Scenario.. Is This Possible?

May 16, 2007

What I'd like to do is use SQL Express as a back-end for an existing Access 2003 application. The application is currently a single-file Access solution. I'm just investigating options to separate the front-end from the back-end so that the users can access a single data store from multiple PCs. We're only talking about maybe 5 users total, and really no more than 2 accessing it simultaneously at any given time (although the same setup exists in several offices). Right now they are just opening the Access db from a fileshare. This has resulted in corruption on a few occasions, which is part of the reason for wanting to replace the current solution with something that will be a little more robust. I'm wondering if there is a way to deploy a SQL Express db on a fileshare so that it can be connected to by the Access front-end. While we can install things on their desktops, we can't install anything on the file server, we can just put files there. Is there any way that I could make that solution work, or should I just stick with separating the Access front-end and back-end?

Thanks,
Kris


Replication Scenario

Dec 17, 2006

Hi all,

I have a huge replication task to perform. The source table has over 250,000,000 records, and approximately 400,000 records are added to the system every day.

Currently I am running snapshot replication, and it takes 10 to 11 hours to complete (the internet connection between the production and the report server is slow). The reason I am using this method is that the source table does not have a timestamp column (so I cannot run an incremental replication), and I cannot modify that table because it belongs to third-party software. It does have a field which is linked to another table holding the timestamp.

Here is the source table and the other table's structure for reference:

DataLog

Name              Datatype   Size
DataLogStampID    int        4
Value             float      8
QuantityID        smallint   2

DataLogStamp

Name                                            Datatype   Size
ID (foreign key to DataLogStampID in DataLog)   int        4
SourceID                                        smallint   2
TimestampSourceLT                               datetime   8

What I would like to know is: what is the best practice for handling such a replication with MS SQL Server 2000?

I am thinking of developing a procedure which follows these steps, using a temp table:

The procedure will run (say) every night.

The temp table will contain a timestamp field.

One time only, I will run a snapshot replication so that the existing data gets into the destination db.

I will record this timestamp in a repl_status table which holds the timestamp of the last replication.

The procedure will read the record from repl_status, select all records from the source table which were added since that date into a new temp table (with the same structure as the source and destination tables), and then a snapshot replication will "add" only these new records to the destination table each night. This way I only transfer the records which have been added since the last replication (last night -- far fewer records). (A sketch of the incremental select is below.)
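A sketch of the nightly incremental pull described above, assuming repl_status has a single LastReplicatedAt column (an assumption):

DECLARE @last datetime;
SELECT @last = LastReplicatedAt FROM dbo.repl_status;

-- rows added since the previous run, resolved through the timestamp table
SELECT d.DataLogStampID, d.Value, d.QuantityID, s.TimestampSourceLT
INTO #NewRows
FROM dbo.DataLog d
JOIN dbo.DataLogStamp s ON s.ID = d.DataLogStampID
WHERE s.TimestampSourceLT > @last;

-- remember the high-water mark of what was just transferred
UPDATE dbo.repl_status
SET LastReplicatedAt = ISNULL((SELECT MAX(TimestampSourceLT) FROM #NewRows), @last);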

Any comments would be greatly appreciated.

Thanks for your time in advance,

Sinan Topuz


Unique Scenario

Jul 20, 2006

Hi,

I am faced with a scenario where I have to predict the time period that will be
required to solve a particular problem, depending on various factors like
problem type, severity level, resource availability, etc.



I would like to know which algorithm is best suited for the
above scenario, and the reasons why.


Help With This Scenario On DB Mirroring

Dec 24, 2007

Hi,

I'm about to implement DB mirroring on my production server, and I'm wondering what will happen in these cases:
1. If my primary server suddenly dies, how can I bring the mirror DB into service without losing data?
2. When the primary server comes back to life, how do I make it lead the mirroring again? (Do I need to recreate the whole DB mirroring setup?) (See the sketch after these questions.)
3. How can I be sure that the DB is indeed being mirrored on the DR server?
4. What about alerting -- can I be notified via email if the DB mirroring isn't in sync, or if the DR server is down, and about other issues that can occur?
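A sketch of the commands involved in questions 1 and 2 (the database name is a placeholder); the existing mirroring configuration does not need to be recreated:

-- 1) the principal is gone: force the mirror into the principal role
--    (run on the mirror; data not yet shipped may be lost)
ALTER DATABASE ProductionDb SET PARTNER FORCE_SERVICE_ALLOW_DATA_LOSS;

-- 2) when the old principal comes back it rejoins as the mirror with the
--    session suspended; resume it so it catches up
ALTER DATABASE ProductionDb SET PARTNER RESUME;

--    once the databases are synchronized (this requires SAFETY FULL),
--    fail back so the original server is the principal again
--    (run on the current principal)
ALTER DATABASE ProductionDb SET PARTNER FAILOVER;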

Thx


Association Scenario

Aug 10, 2006

Hi,
I have a product-basket scenario where I have to recommend contracts to a customer based on the product and the quantity he/she buys. Product quantity is an important factor that guides the user in the purchase of a particular contract.
I have the following tables: a customer-product transaction table and a customer-contract transaction table, but there is no direct relationship between contract and product in the database. The only way the two can be linked is through the customer.
If I create a mining structure with the customer-product information as one nested table and the customer-contract information as another nested table, with customer as the link between the two, the model shows some irrelevant contract recommendations.
What is the solution to this problem? Is it because there is no direct relationship between the product and the contract?
How can I overcome it?


DR Test Scenario

Apr 25, 2007

Hello All



We have asynchronous DB mirroring established for our production database, which runs on SQL Server 2005.

For the DR test we plan to do the following during a period of low activity:



1. Pause the mirroring on the principal server.

2. Break the mirror on the mirror server.

3. Take an offline backup of the mirror.

4. Bring the mirror database up.

5. Run the DR tests on the mirrored database.

6. Restore the offline backup taken in step 3.

7. Re-establish the mirror from the principal to the mirror.

8. Resume mirroring on the principal server.

9. Verify mirroring is happening on the mirror server.



Can you please let me know whether this plan is feasible and whether any modifications are required. Any other suggestions/input are appreciated. (A sketch of the commands involved is below.)
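A sketch of the commands behind steps 1, 2, 4, and 8 (the database name is a placeholder). Note that after step 2 the mirroring session is removed, so step 7 means re-initializing the mirror from a fresh backup rather than simply resuming:

-- step 1: pause the session (run on the principal)
ALTER DATABASE ProdDb SET PARTNER SUSPEND;

-- step 2: break the mirroring session (run on the mirror)
ALTER DATABASE ProdDb SET PARTNER OFF;

-- step 4: bring the former mirror online for the DR tests
RESTORE DATABASE ProdDb WITH RECOVERY;

-- step 8 only applies if the session was merely suspended rather than broken
ALTER DATABASE ProdDb SET PARTNER RESUME;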



TIA



Best,

Jay




Locking Scenario

Oct 3, 2007



We have a pretty vast T-SQL statement that queries our production db and inserts the results into a reporting table in the same db. The problem we're having is that during this process several kinds of locks are taken (key, table, schema, etc.). I thought that by moving this entire process to a separate (reporting) database I could bypass the locking problem; however, the same problem exists in the reporting database. Is there a way to disable all locking? This is just a reporting database, so I'm not worried about data integrity. Any suggestions to disable all locking? By the way, it's a SQL 2000 server.
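Besides the session-level setting shown in the example below, individual tables in the SELECT can take the NOLOCK hint, which reads without acquiring shared locks (dirty reads are possible, which is usually acceptable for reporting); a minimal sketch using two of the tables from the query:

SELECT p.hmy, tr.stotalamount
FROM property p WITH (NOLOCK)
INNER JOIN trans tr WITH (NOLOCK) ON tr.hprop = p.hmy;

The INSERT into the reporting table still takes locks on the target table itself; only the reads are affected.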

example:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
//Select cnttrtydayspast
insert into tampa_pmr ( hprop, sCategory, sSubCategory1, sSubCategory2, sSubCategory3, dtMonth, sMonth, sYear, iValue, iSort1, iSort2, iSort3, iSort4, iLevel)
(select d.hprop, d.sCategory, d.sSubCategory1, d.sSubCategory2, d.sSubCategory3, d.dtMonth, d.sMonth, d.sYear, d.iValue, d.iSort1, d.iSort2, d.iSort3, d.iSort4, d.iLevel
from
(select
p.hmy hProp,
'Avg Days For Outstanding Receivable' sCategory,
'Number of Households' SSUBCATEGORY1,
'z' SSUBCATEGORY2,
'z' SSUBCATEGORY3,
'#begMonth#' DTMONTH,
month('#begMonth#') SMONTH,
year('#begMonth#') SYEAR,
isnull(SUM(isnull(cnt.cntHmy,0)), 0) IVALUE,
9 ISORT1,
80 ISORT2,
0 ISORT3,
0 ISORT4,
2 ILEVEL
from property p
left join
(select amt.property_hmy phmy, count (distinct amt.thmy) cntHmy from (
select
p.hmy property_hmy
,t.hmyperson thmy
,tr.stotalamount - (sum(case sign(convert(INTEGER,DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begMonth#')+1, 0))-trr.upostdate))when 1 then d.samount when 0 then d.samount else 0 end))
remain_amount
from property p
inner join trans tr on(p.hmy = tr.hprop and tr.hAccrualAcct is not null)
inner join tenant t on(t.hmyperson = tr.hperson and tr.upostdate < DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begMonth#')+1, 0)))
left outer join detail d on (d.hchkorchg = tr.hmy )

left outer join trans trr on (trr.hmy = d.hinvorrec )
where
tr.itype = 7

and datediff("dd",tr.sdateoccurred,DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begmonth#')+1, 0))) <=30
#Conditions#
group by
p.hmy
,t.hmyperson
,tr.hmy-700000000
,tr.stotalamount
having tr.stotalamount <> sum(case sign(convert(INTEGER,DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begMonth#')+1, 0))-trr.upostdate)) when 1 then d.samount when 0 then d.samount else 0 end )
AND SIGN (tr.stotalamount - (sum(case sign(convert(INTEGER,DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begmonth#')+1, 0))-trr.upostdate))when 1 then d.samount when 0 then d.samount else 0 end)) - ISNULL(( SELECT /* available prepays */
sum(isnull(d.samount,0)) tran_amountpaid
from property p
inner join tenant t1 on t1.hproperty = p.hmy
inner join unit u on u.hmy = t1.hUnit
inner join trans tr on tr.hperson = t1.hmyperson and tr.hmy between 600000000 and 699999999
and tr.upostDate <= '#begmonth#'
inner join detail d on d.hinvorrec = tr.hmy
inner join acct a on d.hacct = a.hmy and d.hAcct IS NOT NULL
left outer join trans trc on trc.hmy = d.hchkorchg
Where 1=1
and exists (select detail.hmy from detail
left outer join trans trc on trc.hmy = hchkorchg
where hinvorrec = tr.hmy
and (hchkorchg is null or trc.upostdate > '#begmonth#'))
and (d.hchkorchg is null or trc.upostdate > '#begmonth#')
AND T1.HMYPERSON = T.HMYPERSON),0)) = 1
) amt
where amt.thmy not in
(
select
isnull(t.hmyperson,0) thmy
from property p
inner join trans tr on(p.hmy = tr.hprop and tr.hAccrualAcct is not null)
inner join tenant t on(t.hmyperson = tr.hperson and tr.upostdate < DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begMonth#')+1, 0)))
left outer join detail d on (d.hchkorchg = tr.hmy )

left outer join trans trr on (trr.hmy = d.hinvorrec )
where
tr.itype = 7

and datediff("dd",tr.sdateoccurred,DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begmonth#')+1, 0))) > 30
#Conditions#
group by
p.hmy
,t.hmyperson
,tr.hmy-700000000
,tr.stotalamount
having tr.stotalamount <> sum(case sign(convert(INTEGER,DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begMonth#')+1, 0))-trr.upostdate)) when 1 then d.samount when 0 then d.samount else 0 end )
AND SIGN (tr.stotalamount - (sum(case sign(convert(INTEGER,DATEAdd(day, -1, DATEAdd(month, DATEDiff(month, 0, '#begmonth#')+1, 0))-trr.upostdate))when 1 then d.samount when 0 then d.samount else 0 end)) - ISNULL(( SELECT /* available prepays */
sum(isnull(d.samount,0)) tran_amountpaid
from property p
inner join tenant t1 on t1.hproperty = p.hmy
inner join unit u on u.hmy = t1.hUnit
inner join trans tr on tr.hperson = t1.hmyperson and tr.hmy between 600000000 and 699999999
and tr.upostDate <= '#begmonth#'
inner join detail d on d.hinvorrec = tr.hmy
inner join acct a on d.hacct = a.hmy and d.hAcct IS NOT NULL
left outer join trans trc on trc.hmy = d.hchkorchg
Where 1=1
and exists (select detail.hmy from detail
left outer join trans trc on trc.hmy = hchkorchg
where hinvorrec = tr.hmy
and (hchkorchg is null or trc.upostdate > '#begmonth#'))
and (d.hchkorchg is null or trc.upostdate > '#begmonth#')
AND T1.HMYPERSON = T.HMYPERSON),0)) = 1
)
and amt.remain_amount > 0
group by amt.property_hmy
)cnt on cnt.pHmy = p.hmy
where 1=1
and p.itype = 3
#Conditions#
Group By p.hmy
) d
where 1=1)
//End cnttrtydayspast


Which Version For The Following Scenario?

Jun 5, 2007

Hi all,



We are about to start a project that will involve three websites, each of which will have its own public domain name and two of which will act as e-commerce applications. Both e-commerce applications will only contain relatively small amounts of data (few thousand products, up to 15000 customer records with basic order history) and will be synchronised with a central data server at another site that will be responsible for handling payments, order processing, invoicing, etc.



The sites will take approx 500 hits every 24 hours and we will need to manage the DNS on the hosting server.



My question is, would we be ok to host these sites on a Web Edition of Windows 2003 Server with SQL Express, or is there any clear reason that we should go with the Standard editions of Windows Server and/or SQL?



Regards,



Stephen.


Thanks Ray... 'Corrupt Log Scenario' Any Help Appreciated

Aug 26, 2002

Server: Msg 9004, Level 21, State 1, Line 1
The log for database ' ' is corrupt.

Connection Broken


How do I continue to perform a successful restore if one of the logs is corrupt? Should I run DBCC?


SQL Server Express Scenario

Jan 10, 2008

Hello !

Please, I am new on SQL Server,

1. Is the following scenario possible?

I want to implement an application with SQL Server Express installed on a PC acting as the server. Other PCs -- on the local network or over the internet -- would connect to this SSE database (my intention is to open a port on the server so it can be reached over the internet).

If yes, please give me an example of an adequate connection string to use in the application.
If no, how much of this scenario could be implemented? (And also, if you can, a sample connection string.)


2. Do you know what the limitations of using SQL Server Developer Edition (~$50) would be in this scenario?

Thank you very much !


How To Create An Index In This Scenario?

Mar 20, 2008

I have a table as follows:

PID (PK), int
Postcode, nchar
Address1, nvarchar
Address2, nvarchar
Address3, nvarchar
Address4, nvarchar
Address5, nvarchar

I have a view on the table. The output of the view is PID, Postcode, ADDRESS (which is Address1-Address5 concatenated into one string with spaces between the non-null values).

I use a sproc to search the view for an address based on a free-text search; the sproc is below. Can anyone advise how I should index the table? I have never created an index before and need to speed up the search, which at the moment is simply too slow. I have 1.8 million records in the table. (A possible indexing sketch follows the sproc.)

Here is the sproc:

SELECT PID, Upper(Postcode) as 'Postcode', ADDRESS
FROM vw_Postcode

WHERE
Address Like '%' + REPLACE(REPLACE(@address,',','%'),' ','%') + '%'
OR Postcode Like '%' + REPLACE(REPLACE(@address,',','%'),' ','%')+ '%'
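A leading-wildcard LIKE ('%term%') cannot seek on a normal index, so one option is a full-text index on the base address columns instead. A sketch, where the base table name, its primary-key index name, and the catalog name are all placeholders:

CREATE FULLTEXT CATALOG AddressCatalog AS DEFAULT;

CREATE FULLTEXT INDEX ON dbo.AddressTable
    (Postcode, Address1, Address2, Address3, Address4, Address5)
    KEY INDEX PK_AddressTable
    ON AddressCatalog;

-- the search string then uses CONTAINS syntax instead of LIKE '%...%'
DECLARE @search nvarchar(200);
SET @search = N'"high street*"';

SELECT PID, UPPER(Postcode) AS Postcode
FROM dbo.AddressTable
WHERE CONTAINS((Address1, Address2, Address3, Address4, Address5), @search);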

Regards MArco


How Does Replication Work In This Scenario... Please Help.

Dec 13, 2005

Hello all,
I am a newbie when it comes to replication. I have read BOL but was overwhelmed by the amount of information about replication.
We are planning to have a server with SQL Server 2005 and disconnected laptops with SQL Server Express. How would replication work in this scenario, assuming we are using merge replication?
Table AA [a replicated table on the server, with 9 columns]


IS IT A POSSIBLE SCENARIO WITH EXPRESS EDITION ??

Jul 10, 2006

Hi,

I'm new to the Express edition and not that expert about SQL Server in general, so I'd like the answer to this question in detail -- if possible -- and as simply as it can be put. Is the following scenario achievable using MS SQL Server XE:

one server running Windows XP + MS SQL Server Express Edition, and about 10 users with an application connecting to the db on that server and working with it.

Would this scenario be possible using XE?

thanks in advance for replying ,

moemen ahmed


Synchronous Application Scenario

Oct 1, 2007

I'm new to SSB, so please bear with me. Our application requirements are:
1) Web app gathers user input from a web UI.
2) Web app calls a stored procedure, passing in the user input gathered in step (1).
3) Procedure issues queries to multiple data sources (SQL Server 2005 db's) derived from the user input.
4) Procedure waits for replies from these multiple data sources, governed by a timeout in case a data source is unavailable or slow to respond.

5) Procedure aggregates the data returned in step (4) from multiple data sources into an XML document.
6) Procedure returns the XML document as an output parameter.

This is different from the usual asynchronous SSB application paradigm. In particular, I'm wondering:

how can I set up a synchronous dialog, where the procedure that issues the SEND waits for a reply from the target service? I don't want the initiator procedure to end after SENDing, but rather to wait for the replies to those messages so it can aggregate the data from the reply message bodies. (See the sketch below.)
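A minimal sketch of the synchronous pattern: SEND, then block on a RECEIVE filtered to that conversation with a TIMEOUT. The service, contract, message-type, and queue names are all placeholders:

DECLARE @h uniqueidentifier, @reply xml, @request xml;
SET @request = N'<query>user input here</query>';

BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [AggregatorService]
    TO SERVICE 'DataSourceService'
    ON CONTRACT [QueryContract]
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @h MESSAGE TYPE [QueryRequest] (@request);

-- wait up to 5 seconds for the reply on this specific conversation
WAITFOR (
    RECEIVE TOP (1) @reply = CAST(message_body AS xml)
    FROM dbo.AggregatorQueue
    WHERE conversation_handle = @h
), TIMEOUT 5000;

IF @reply IS NULL
    PRINT 'data source did not respond in time';

With multiple data sources, one dialog per source can be SENT up front and then received handle by handle, so the waits overlap instead of running serially.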

Thanks - Dana Reed


SELECT QUERY SCENARIO

May 24, 2006



SELECT InventoryID,InventorySubTypeID,Make,Model,SerialNumber,OriginalCost,NetValue,InventoryStatusID, isnull(null ,0) [Adjustment Amount],isnull(null ,0) [New Net Value] ,null [Change Status To],null [Comments] from Inventory_Profile where InventoryID in (inventoryIds)

In this scenario, inventoryIds is a collection of IDs, like below:

SELECT InventoryID,InventorySubTypeID,Make,Model,SerialNumber,OriginalCost,NetValue,InventoryStatusID, isnull(null ,0) [Adjustment Amount],isnull(null ,0) [New Net Value] ,null [Change Status To],null [Comments] from Inventory_Profile where InventoryID in (1,2,3,4,5)



In this scenario I want to pass the IDs at run time as a string from C#.

Also, how will I take the string in the DA class (C#) and execute it? (A T-SQL sketch of splitting the string is below.)
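One way to do it on SQL Server 2005 is to pass the comma-separated string as a single nvarchar parameter and split it inside the procedure via XML; a sketch:

DECLARE @ids nvarchar(max);
SET @ids = N'1,2,3,4,5';   -- the string built in C# and passed as a parameter

DECLARE @x xml;
SET @x = CAST(N'<i>' + REPLACE(@ids, N',', N'</i><i>') + N'</i>' AS xml);

SELECT InventoryID, InventorySubTypeID, Make, Model, SerialNumber,
       OriginalCost, NetValue, InventoryStatusID
FROM Inventory_Profile
WHERE InventoryID IN (SELECT n.v.value('.', 'int')
                      FROM @x.nodes('/i') AS n(v));

In the DA class this is then an ordinary SqlCommand with CommandType.StoredProcedure and one string parameter; no dynamic SQL is needed.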



guide me please


Question Regarding Scenario On DataMining

Jan 22, 2007

Hi to all

I want to know how to achieve the following scenario; I would be grateful if anyone who knows the solution could lay it out in steps. The scenario: I have a table named "CreditCardTable" with the following schema. The table is pretty much what one would expect from a simple credit card application: a form is presented to the user, who fills in the credit card application, and the data goes into the table. Later an analyst reviews the data and accepts or rejects the credit card, along with some other fields as seen in the schema.

-----Schema --------
CustomerName Bigint
Age int
DOB DateTime
LimitAssigned Bigint
Analystname varchar
Salary varchar
Gender bit
Qualification varchar
Country varchar
JobTitle varchar
MartialStatus varchar
IsApproved bit
IsDeclined bit

Now what I want is that, using the Microsoft data mining features, after being given the data the system should propose the decision to the analyst, i.e. reject or approve the credit card application. I gather there are many algorithms in SQL Server 2005 Analysis Services that could be used (Decision Trees, Association Rules). I researched the two I just mentioned and want to use Decision Trees for this purpose, but there is a bottleneck mentioned in the book "Data Mining with SQL Server 2005" (Wiley): it says that the rules the decision tree builds and uses are not available to the end user, in contrast to the Association Rules algorithm, which does provide the rules. I essentially need the rules by which the decision has been made. I really need help on these issues. To summarize, I need the following:

1) With the table "CreditCardTable" I need to run the Decision Trees and Association Rules algorithms; please provide the steps to do so. Please also mention how to write the query (DMX, I think) to get the final result, i.e. approve or reject, based on the data.

2) The rules on the basis of which the algorithm makes its decision; please provide insight on how to obtain these from both the Decision Trees and Association Rules algorithms.

I really need an urgent reply on this; you can also email me at razi_rais@yahoo.com. Thanks for your feedback.


MSDE Replication Scenario

Aug 23, 2005

Greetings to fellow fleetfingers:


Feasibility Of Filtering In This Scenario...

Jan 16, 2007

Background:
I'm a young developer working on a project that involves merge replication between SQL Server 2005 Mobile Edition and SQL Server 2005 and I've been having a hard time wrapping my head around exactly how to implement the best filters for the subscription(s) and publication(s).

Users of our application are auction-goers who collect data about AuctionItems in a mobile db and sync that information with a central db once they are finished. The central db is, obviously, pre-populated with most of the information about any/all AuctionItems. Central db information is available to view/change via web access.

Multiple users can change the same Auction data so there will be overlapping partitions.

What I would like to do is:
Present the user with a list of all available Auctions for the next 2 weeks (select * from Auctions where ...). This is a simple publication to create. Now I'd like to be able to allow the user to select AuctionID 4, AuctionID 3, and AuctionID 12 and then, via a separate AuctionListing publication, only download the AuctionItems that apply to those IDs.

Obviously, the publication has to be created to include the AuctionItem table and all necessary related tables/columns. But how do I create a parameterized filter based on those AuctionIDs?

Can a subscription be "dynamically created" once those AuctionIDs are known? Obviously, I don't want to bog my mobile devices down with tens of thousands of listings for auctions the user is not planning to attend.

Basically, my question is: am I wasting my time, or is there some clever manipulation of HOST_NAME(), SUSER_NAME(), both, or some other method I've missed, that can fetch only those Auctions whose ID matches one selected by the user at runtime? (See the sketch below.)
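One pattern that may fit: keep a small mapping table of device-to-auction selections on the publisher, and use it in a parameterized row filter so HOST_NAME() resolves per subscriber. A sketch (the mapping table and publication names are hypothetical, and the application must insert the user's picks before the sync runs):

CREATE TABLE dbo.DeviceAuctionSelection (
    DeviceName sysname NOT NULL,
    AuctionID  int     NOT NULL,
    CONSTRAINT PK_DeviceAuctionSelection PRIMARY KEY (DeviceName, AuctionID)
);

-- parameterized row filter on the AuctionItem article
EXEC sp_addmergearticle
    @publication         = N'AuctionListing',
    @article             = N'AuctionItem',
    @source_object       = N'AuctionItem',
    @subset_filterclause = N'AuctionID IN (SELECT AuctionID
                                           FROM dbo.DeviceAuctionSelection
                                           WHERE DeviceName = HOST_NAME())';

Related tables then come in via join filters on AuctionItem, and the mobile subscriber can supply its own HostName value at sync time if the device name itself isn't suitable.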


Unusual Custom Datatable Scenario

Jul 20, 2007

For my website, I need to grab information from sources located on different servers. This gives me various datasets that I need to combine into one main dataset to bind to my grid. Each dataset has the following fields:
Name || Blocked
I need the main dataset that is to be used on the grid to have the following structure:
Name || Blocked 1 || Blocked 2 || Blocked 3 || etc.
The Name field I can take from one of the datasets, as the list of names will be the same. However, I need to take the data from each of the Blocked fields in the various returned datasets and combine them into the Blocked 1, etc. columns you can see above.
I'm making this post to ask for advice on the best way to do this. Any info would be much appreciated.
Thanks,
James


Table Upate/insert Scenario

Oct 24, 2007

Hi, I am using SQL 2005 and this is what I am trying to do. I have multiple tables, say tbl1, tbl2, tbl3, etc., and each of them has columns, say col1, col2, col3, col4, col5, etc.; there is also a table_total table. Each of these tables contains the same foreign key, say col2, and they get populated at different times. In table_total I need to store the sum of particular columns (like the sum of col3 from tbl1, and the sums of col3 and col4 from tbl2) for a given foreign-key value, under the respective column in table_total. Something like: table_total has columns sum_col3tbl1, sum_col4tbl2, sum_col4tbl3, sum_col5tbl2, and so on. table_total should be updated whenever tbl1, tbl2, tbl3, etc. are updated, so that it always has the most up-to-date values. Can someone suggest how to achieve this? Thanks. (A trigger sketch is below.)
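One way to keep table_total current is a trigger on each source table that recomputes just the affected rows; a sketch for tbl1/col3, following the column names from the post (the trigger name is made up):

CREATE TRIGGER trg_tbl1_Totals ON dbo.tbl1
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- recompute the sum only for the foreign-key values that were touched
    UPDATE t
    SET sum_col3tbl1 = ISNULL(agg.total, 0)
    FROM dbo.table_total t
    JOIN (SELECT col2 FROM inserted
          UNION
          SELECT col2 FROM deleted) k ON k.col2 = t.col2
    LEFT JOIN (SELECT col2, SUM(col3) AS total
               FROM dbo.tbl1
               GROUP BY col2) agg ON agg.col2 = t.col2;
END

When the aggregates are simple sums or counts per key, an indexed view per source table is another option worth considering.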


Can Anyone Help Me To Write A Stored Procedure For The Following Scenario...

Feb 7, 2008

Hi friends,
The project I'm working on is an ASP.NET project with SQL Server 2000 as the database.
I have a listbox (multiple selection) on my form whose selected values are concatenated and sent to the server as comma-separated numbers (for example: 34,67,23,60). The column in the table at the back end also contains comma-separated numbers (for example: 1,3,7,34,23).
What I need to do is:
1. Select the rows in the table where the stored list contains at least one value from the submitted list (in the above example: 34 and 23).
2. Look up the text values of these numbers in another (master) table and populate them, comma-separated, in a grid on the front end.
I've written a procedure for this, but it takes more than 2 minutes to return the result. The table has over 20,000 rows and performance has to be kept in mind.
Suggestions on the front end (ASP.NET 2.0) would also be a great help.
Anybody's thoughts would be greatly appreciated...
Thanks a lot...


CURSOR - SQL Server - Good Example And Which Scenario We Will Use

Feb 26, 2008

Hi,
do you have a good example of a CURSOR in SQL Server, and in which scenarios should we use one?
A live example would be appreciated (a sketch is below), and also how to call a cursor from a .NET web application.
Please help me learn.
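A minimal cursor sketch against a hypothetical table; cursors fit when each row needs its own procedural action (calling a proc per row, building per-row dynamic SQL), while plain set-based SELECT/UPDATE statements are usually faster for everything else:

DECLARE @CustomerID int, @CustomerName nvarchar(100);

DECLARE cust_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT CustomerID, CustomerName
    FROM dbo.Customers;

OPEN cust_cur;
FETCH NEXT FROM cust_cur INTO @CustomerID, @CustomerName;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- per-row work goes here
    PRINT CAST(@CustomerID AS varchar(10)) + N' - ' + @CustomerName;

    FETCH NEXT FROM cust_cur INTO @CustomerID, @CustomerName;
END

CLOSE cust_cur;
DEALLOCATE cust_cur;

From a .NET web application you would normally not call the cursor directly; wrap it in a stored procedure and execute that with SqlCommand like any other procedure.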
 
 







