Thank you in advance for any advice provided by the Dev Shed users.
I'm on a development team that has been having an ongoing discussion/argument about the best way to handle our users' needs while they are in Europe.
We are developing a Purchase Order application in VB.net using MSSQL Server 2000. About 5 of our 10 users take a six-week trip to Europe. While in Europe the users will still need to use the application; however, in some of the cities they visit the network connection will be slim to none.
The ongoing argument is as follows:
-Should we create a server running SQL Server 2000 for them to take to Europe and sync up the two databases when they return?
-Should we create a version of the application running MS Access?
-Should we create a version of the application running MySQL?
-Should we do something completely different that we haven't thought of?
-Also, I'm not sure if the following is even a feasible architecture; from what I've found online, I haven't seen one that runs both SQL Server and Access (hope this makes sense):
Build the VB.net application running a shell of the database in Access (locally) that temporarily houses the data and tracks the database transactions made while using the system. Then, upon closing the application, a "module" would execute that would perform the transactions made by the user on SQL Server. We could then dump the SQL Server database to an Access database before they go to Europe, and run their changes on the SQL Server database when they return. Users at home will not be making changes to the same data as those in Europe.
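To make the idea concrete, the transaction tracking could be as simple as an append-only log table in the local shell database that the sync module replays in order when they return. A rough sketch, written in T-SQL for clarity (the Access DDL would differ slightly, and all names are made up):

-- append-only change log kept in the local shell database while offline
CREATE TABLE PendingChange (
    ChangeID    INT IDENTITY(1,1) PRIMARY KEY,  -- replay order
    TableName   VARCHAR(128)  NOT NULL,         -- which table was touched
    Operation   CHAR(1)       NOT NULL,         -- 'I'nsert, 'U'pdate, 'D'elete
    KeyValue    VARCHAR(50)   NOT NULL,         -- primary key of the affected row
    ColumnData  VARCHAR(4000) NULL,             -- serialized column values
    ChangedAt   DATETIME      NOT NULL DEFAULT GETDATE()
);

-- on return, the sync module reads the log in order and applies each
-- change against the production SQL Server database
SELECT ChangeID, TableName, Operation, KeyValue, ColumnData
FROM PendingChange
ORDER BY ChangeID;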
We have a Silverlight-based application which currently supports only one production version. The idea is to support three concurrent versions of the same application; users will switch to a newer version based on their interest, or they can continue with the older version.
We still have to use the existing database for all three versions.
What is the best way to architect this so that we can differentiate the code between the versions, keep the data in sync, and run all the versions in parallel?
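One pattern that might fit, sketched under the assumption that the versions differ only in their read/write contract against the shared tables: keep the tables themselves unversioned, and give each application version its own schema of views and procedures over them. All names below are hypothetical:

-- the shared table stays unversioned; each application version talks
-- only to its own schema, never to the table directly
CREATE SCHEMA v1 AUTHORIZATION dbo;
GO
CREATE VIEW v1.Customers AS
    SELECT CustomerID, Name FROM dbo.Customers;
GO
CREATE SCHEMA v2 AUTHORIZATION dbo;
GO
-- v2 exposes a column that v1 clients never see
CREATE VIEW v2.Customers AS
    SELECT CustomerID, Name, Email FROM dbo.Customers;
GO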
I have a Web application in SharePoint 2013 which is HTTPS. I have PowerPivot installed, and it works fine when I try to access the PowerPivot Gallery with the port number, e.g. URL... Everything works fine.
But when I try to access the same URL... without the port number, I get the error "something went wrong. Could not load".
Hi all, first of all I should mention that I have posted a similar question in the Oracle forum and received plenty of Oracle-related feedback and solutions, here (http://www.dbforums.com/showthread.php?t=1609238).
Now I want to know: are LIKE searches just as awful (performance-wise) in SQL Server too, and do you recommend any solution to enhance them with some indexing tweak? I want to avoid using the MS Indexing Service (if I spell it correctly) as far as I can ;)
Note: We have a lot of LIKE '%abc%' queries in this system.
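For what it's worth, a leading-wildcard LIKE '%abc%' can't seek on a normal index, so one option I'm weighing is SQL Server's built-in Full-Text Search (which is distinct from the Windows Indexing Service). A rough sketch with made-up names; note that CONTAINS matches words and word prefixes, not arbitrary substrings:

-- one-time setup: full-text catalog and index on the searched column
CREATE FULLTEXT CATALOG SearchCatalog;
CREATE FULLTEXT INDEX ON dbo.Products (Description)
    KEY INDEX PK_Products    -- a unique key index is required by full-text
    ON SearchCatalog;

-- word/prefix search instead of LIKE '%abc%'
SELECT ProductID, Description
FROM dbo.Products
WHERE CONTAINS(Description, '"abc*"');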
I was wondering if someone could help explain how SQL Server handles incoming connections. A friend and I started a discussion about whether or not to use the connection pool in ADO.NET in a specific case.
Usually when creating something like an ASP.NET page, it's recommended to use the connection pool because it optimizes performance by reusing already-created connections, so we don't have to recreate a connection every time. I have always thought that each connection in the connection pool held an open connection on the server, so 5 connections in the connection pool would be 5 open connections at the server. But after having the discussion with my friend I am not so sure anymore.
Say I create a client application (.NET using ADO.NET) that connects to the database and works with the data. If I then have 1,000 clients and each client has a connection pool with 5 connections in it (I think that's the default for the connection pool), there would be something like 5,000 open connections on the SQL Server, where most of the connections never actually do anything more than hang around and wait... And the connection pool is not 1 per client but 1 per connection string.
So if my client scenario accessed data from 2 different databases, there would actually be 10,000 open connections at the SQL Server. So now I think there must be a server-side connection pool or something to handle the connections from the clients, so that there would only be 10-50 open connections at the server for the 1,000 connected clients.
How does it work? Is there one open connection on the server for every connection in the connection pool? If that's the case, it would be better for the SQL Server if I didn't use the connection pool in the client, but instead opened the connection when I need it and then closed it, taking that little performance hit every time, to help performance on the SQL Server.
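One way to settle the argument would be to watch the server side directly while the clients run. On SQL Server 2000/2005, something like this (user spids start above 50) should show what the server actually keeps open:

-- count the connections the server is really holding, per login and host
SELECT loginame, hostname, COUNT(*) AS ConnectionCount
FROM master.dbo.sysprocesses
WHERE spid > 50          -- skip system spids
GROUP BY loginame, hostname
ORDER BY ConnectionCount DESC;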
I am just starting with SQL Server Reporting Services, but I can't get interactive sort to work for columns with aggregate fields.
I am using the following query for the DataSet:
SELECT Store.ID AS StoreID, Store.Name AS StoreName,
       COUNT(*) AS NumReservations,
       SUM(Appointment.TotalBeforeTaxes) AS Revenue
FROM Store
LEFT JOIN Appointment ON Store.ID = Appointment.StoreID
GROUP BY ALL Store.ID, Store.Name
ORDER BY Store.Name ASC
For the report, I am using the tabular data view. Interactive sorting works great for StoreID and StoreName, but doesn't work for the NumReservations and Revenue fields. I turned it on for all 4 columns.
Hi, I have a few views in SQL Server 2005. In Design View, the results of the view are OK. With the OPEN VIEW option, records are not sorted correctly; the ORDER BY is ignored. What could be the reason for this? Thanks a lot in advance!
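From what I've read, a view is an unordered rowset, and SQL Server 2005 does not guarantee an ORDER BY inside the view definition (even the old TOP 100 PERCENT trick); the sort is only guaranteed when applied to the query over the view, e.g. (hypothetical names):

-- sorting is only guaranteed on the outer query, not inside the view
SELECT *
FROM dbo.MyView          -- hypothetical view name
ORDER BY SomeColumn;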
Interested in feedback from the SQL grand wizards (and would-be wizards) that haunt these forums.
Let's say you need to constantly stream data into an OLTP system. We are talking multiple level hierarchies totaling upwards of 300 MB a day, spread out not unlike a typical human sleep cycle (less data during off-peak hours, but still 24/7 requirements). All data originates from virtual machines running proprietary algorithms. The VM/data capture infrastructure needs to be massively scalable, meaning that incoming data is going to become more and more frequent and involve many different flat record formats.
The data has tremendous value when viewed both historically as well as in real-time (95% of real-time access will be read-only). The database infrastructure is in its infancy now and I'm trying to develop a growth plan that can meet the needs of the business as the data requirements grow. I have no doubt that the system will need to work with multiple terabytes of data within a year.
Current database environment is a single server composed of a Dell PowerEdge 2950 (Intel Quad Core 5355, 16 GB RAM, 2 x 73 GB 15K RPM SAS ) with an attached Dell PowerVault MD1000 (15 x 300 GB 10K RPM SAS in RAID 5+0 [2x7] w/hot spare) running Win 2k3 64-bit and SQL Server 2005 x64 Standard, 1-CPU.
I am interested in answering the following questions:
-Based on the scaling requirements of the data capture and subsequent ETL, what transmission method would you find most favorable? For instance, we are weighing direct database writes via stored procedures from all VM systems against establishing processes to collect, aggregate, and stream CSVs into a specialized ETL environment running SSIS packages that load the data and then call SQL stored procedures to scrub it and prepare it for production import (a rough sketch of that staging/scrub pattern follows below). The data will require scrub routines that need access to current production data, so distributing the core data structures to multiple ETL processing systems would be expensive and undesirable. Cost is very important to the overall solution design.
-In terms of database infrastructure, how would you maximize business value while keeping cost as low as possible? For instance, do you think there is more value in an ACTIVE/ACTIVE cluster (2 x CPU licenses) where one system acts as ETL and the other as OLTP, or would you favor replication of production data from ETL to OLTP (or vice versa)? With the second scenario, am I mistaken in thinking we could get away with a Server/CAL licensing model for the ETL server?
-Are there any third-party tools that I should research that would greatly aid me here?
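For concreteness, the CSV-to-staging-to-scrub pattern mentioned above might look something like this; all names are hypothetical, and SSIS would bulk-load the staging table before calling the procedure:

-- hypothetical scrub step: SSIS bulk-loads raw rows into etl.StagingRecords,
-- then this procedure validates them against current production data
CREATE PROCEDURE etl.ScrubIncoming
AS
BEGIN
    SET NOCOUNT ON;

    -- flag rows whose keys don't exist in production reference data
    UPDATE s
    SET s.IsValid = 0
    FROM etl.StagingRecords s
    LEFT JOIN prod.ReferenceData r ON r.RecordKey = s.RecordKey
    WHERE r.RecordKey IS NULL;

    -- promote the clean rows into the production table
    INSERT INTO prod.Records (RecordKey, Payload, LoadedAt)
    SELECT RecordKey, Payload, GETDATE()
    FROM etl.StagingRecords
    WHERE IsValid = 1;
END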
I appreciate all feedback, criticism, and thoughts.
I am in the process of designing a database infrastructure layout that can virtually scale to a very large number of servers in an effort to improve performance. Scale-out architecture vs. grid computing (something like Oracle RAC) seems to be the way to go. It may take a lot more work up front, but it seems very flexible in the long run.
One of the issues I am trying to tackle is how I should grow this thing. Right now, I have a single 4-way server running SQL Server 2005 Enterprise Edition. We are planning on getting a second server as well as an enterprise-level SAN solution.
With my 2 goals in mind (scale-out architecture and high availability), should I bring this second server online as a passive cluster node, or should I partition the data across both nodes? Will clustering even be part of my fault tolerance plan, or should I use replication?
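For what it's worth, the kind of partitioning across nodes I have in mind is along the lines of SQL Server's distributed partitioned views, roughly like this (hypothetical servers and tables, each member table holding one range of the data):

-- each node holds one horizontal slice of the data; CHECK constraints on the
-- partitioning column define the ranges, and linked servers connect the nodes
CREATE VIEW dbo.AllOrders AS
    SELECT * FROM Node1.Sales.dbo.Orders_2006
    UNION ALL
    SELECT * FROM Node2.Sales.dbo.Orders_2007;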
It's hard to find a good answer as to the *best* way to make this happen.
Here is my situation: I am creating a database-driven ASP.Net web application that will be used over the internet. My ASP.Net application connects to my SQL Server 2005 database/server by using a SQL Server login. I am using DPAPI to encrypt my connection strings with a hidden entropy value for extra security. I am using the SQL login for obvious reasons, as my users will not have a Windows login.
What I am trying to do: I want to limit this SQL login account to be able to just run/execute stored procedures and NOT access the tables or views directly. In my ASP.Net application I am using the MS Data Access Application Block, and I am using stored procedures for every single database access action. There is no inline SQL being executed from my web application.
What I have tried so far:
I created a new schema and made the above SQL login account the owner of this schema. I then granted EXECUTE permission to the SQL login and denied all other permissions.
I created a database role with EXECUTE-only permission and denied all other permissions.
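Concretely, the second attempt looked roughly like this (names changed):

-- execute-only role: allow stored procedures, deny direct table/view access
CREATE ROLE ExecOnlyRole;
GRANT EXECUTE TO ExecOnlyRole;                                    -- database-wide EXECUTE
DENY SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo TO ExecOnlyRole;
EXEC sp_addrolemember 'ExecOnlyRole', 'WebAppUser';               -- hypothetical user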
What Happened: In BOTH of the above scenarios, I tested a direct SQL statement against one of my tables from my ASP.Net application, and I was able to retrieve data back. NOT GOOD; exactly what I am trying to STOP.
If someone could give me a (step-by-step) guide on how to set up the situation I am looking for, I would be very grateful!
I have a problem reading binary data in MSSQL using Server Management Studio. All it shows in the column is "<Binary data>". Is there a way to view this data, or at least its SIZE?
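If it helps, the size at least should be retrievable with DATALENGTH (table and column names made up):

-- DATALENGTH returns the number of bytes stored in the column
SELECT DATALENGTH(BinaryColumn) AS SizeInBytes
FROM dbo.MyTable;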
First off, I appreciate the time that those of you reading and responding to this request are offering. My question is a theoretical and hopefully simple one, and yet I have been unable to find an answer to it in other searches or sources.
Here's the situation. I am working with SQL Server 2005 on a Windows Server 2003 machine. I have a series of databases, all of which are in Full recovery mode, using a backup device for the full database backups and a separate device for the log backups. The full backups are run every four days during non-business hours. The log backups are run every half hour.
Last week, one of my coworkers found that some rarely-used data was unavailable, and wanted to restore a database to a point in time where the data was available. He told me that point in time was some time back in November.
To accomplish this, I restored the database (to a separate database, so as not to overwrite my production database) using the Point in Time Recovery option. I selected November in the "To a point in time" window (I should note that this window is always grey, never white like most active windows, it seems), and the full database backup and the subsequent logs all became available in the "Select the backup sets to restore" window.
I then tried a bevy of different options from the "Options" screen. However, every restore succeeds (i.e. it doesn't error out) but seems to bring the database back to the current point in time. It never actually goes back to the point in time I specify.
My questions are as follows:
a) Is it possible to do a point-in-time recovery to a point in time BEFORE the last full database backup?
b) If so, what options would you recommend I use? (e.g. "Overwrite the existing database", restore with recovery, etc.) A T-SQL sketch of what I believe I'm attempting follows below.
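For reference, my understanding of the T-SQL equivalent: restore a full backup taken BEFORE the target date with NORECOVERY, then roll the log backups forward and stop at the desired time. Paths, names, and dates here are made up:

-- 1) restore the last full backup taken before the target point in time
RESTORE DATABASE MyDb_Copy
FROM DISK = 'D:\Backups\MyDb_Full.bak'
WITH NORECOVERY,
     MOVE 'MyDb_Data' TO 'D:\Data\MyDb_Copy.mdf',
     MOVE 'MyDb_Log'  TO 'D:\Data\MyDb_Copy.ldf';

-- 2) apply each subsequent log backup in sequence; in the one spanning
--    the target time, stop there and bring the database online
RESTORE LOG MyDb_Copy
FROM DISK = 'D:\Backups\MyDb_Log.trn'
WITH STOPAT = '2007-11-15 10:00:00', RECOVERY;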
I again appreciate any and all advice I receive, and I look forward to hearing from anyone and everyone on this topic. Thank you.
I would like some help creating a view that will display the latest status for each application. The latest status should be based on CreateDt. For example:
Table structure:
============
Application: ApplicationID, Name, Address, City, State, Zip, etc.
ApplicationAction: ApplicationActionID, ApplicationID, Status (e.g. new, reviewed, approved, closed), CreateDt
View should display:
==============
ApplicationID, ApplicationActionID, Status, CreateDt
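Something along these lines is what I'm after, assuming the latest action per application is simply the row with the MAX(CreateDt); ties on CreateDt would return more than one row per application:

CREATE VIEW dbo.LatestApplicationStatus AS
SELECT aa.ApplicationID, aa.ApplicationActionID, aa.Status, aa.CreateDt
FROM ApplicationAction aa
WHERE aa.CreateDt = (SELECT MAX(aa2.CreateDt)
                     FROM ApplicationAction aa2
                     WHERE aa2.ApplicationID = aa.ApplicationID);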
I have created a simple package that uses a SQL command to pull data from an Oracle database and inserts the data into a SQL Server 2005 table. Some of the data fields that I am pulling contain two digits after the decimal point; however, this data is lost when it gets into SQL Server. I have even tried putting the data into a flat file, and still the data is lost.
In the package I have an OLE DB source connection, which is the Oracle database, and when I do the preview I see all the data I need. I am very confused and have tried a number of things to get the data into SQL Server, but none work. Any ideas would be very helpful.
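One thing I haven't ruled out is forcing an explicit precision and scale in the Oracle source query itself, so the pipeline cannot infer the metadata wrongly. The table and column names here are made up:

-- cast inside the OLE DB source's SQL command so the scale is explicit
SELECT ORDER_ID,
       CAST(AMOUNT AS NUMBER(18,2)) AS AMOUNT
FROM ORDERS;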
Hi everybody. I created an application role in a database (DB1) and gave it all the rights on a view in DB1 which refers to a table located in another database (DB2). I also gave the app role rights on a table in DB1. I tried to use this application role through sp_setapprole, launched by a user (server principal?) which is a SQL Server administrator (and a local administrator on Win 2003 Server). With the following query:
SELECT USER_NAME()
I can see that the app role is being used. Then, if I query the table in DB1, everything works, but if I query the view referring to the table in DB2, I get the following error:
The server principal "NameOfServerPrincipal" is not able to access the database "DB2" under the current security context. What should I do to make it work?
The table in DB2 has the same schema as the view in DB1 which refers to it. I set DB1 to TRUSTWORTHY, and both databases have the db_chaining option enabled.
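For reference, these are the options I set:

ALTER DATABASE DB1 SET TRUSTWORTHY ON;
ALTER DATABASE DB1 SET DB_CHAINING ON;
ALTER DATABASE DB2 SET DB_CHAINING ON;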
Any idea on how to solve the problem would be greatly appreciated. Thank you very much. Vania
Hello, I have a problem regarding stored procedures and VIEW SERVER STATE.
I have an application with a lot of stored procedures; one of them checks data about the connected users. In SQL Server 2000 I had no problem getting this information, but in SQL Server 2005 I do.
My stored procedure looks like this:
ALTER PROCEDURE [dba].[applsp_GetConnectionInfo]
(
@DBName varchar(100)
)
WITH EXECUTE AS OWNER AS
BEGIN
SET NOCOUNT ON
DECLARE @sCollationMaster VARCHAR(128);
DECLARE @sSqlString VARCHAR(900);
-- Determine collation from master database because collation from master and ultimo database may differ
SELECT @sCollationMaster = CAST(databasepropertyex('master', 'Collation') AS VARCHAR);
SET @sSqlString =
'SELECT max(status) AS Status, max(isnull(SCISUSENAME, ''ULTIMOLOGIN'')) AS Login
, MAX(Rtrim(Rtrim(convert(varchar(255), nt_domain)) + nt_username)) AS NTUser
, max(Rtrim(hostname)) AS Host, MAX(Rtrim(program_name)) AS Program
FROM master.dbo.sysprocesses JOIN dba.SCONNECTIONINFO on SCISPID = CAST(spid AS VARCHAR)
AND ( SCISUSENAME = ISNULL(loginame, '''') COLLATE ' + @sCollationMaster + ' OR ISNULL(loginame, '''') = ''ULTIMOLOGIN'')
WHERE ...... AND DB_NAME(dbid) = ''' + @DBName + '''
GROUP BY hostprocess
ORDER BY Login
';
EXEC(@sSqlString);
END
I've granted the VIEW SERVER STATE permission to my user 'dba', which is the db_owner. When I execute the query from the stored procedure separately as dba, I get all the info I need, but when I execute the stored procedure I don't see anything.
I seem to have the same problem with sp_who2: executing it directly gives me information about everyone, but when I wrap it in a stored procedure like the one above, I get nothing.
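From what I've read, the EXECUTE AS OWNER impersonation is scoped to the database, so a server-level permission such as VIEW SERVER STATE doesn't carry through it. One workaround I've run across is signing the procedure with a certificate and granting the permission to a login created from that certificate; a rough, untested sketch with made-up names:

-- in the user database: create a certificate and sign the procedure with it
CREATE CERTIFICATE ConnInfoCert
    ENCRYPTION BY PASSWORD = 'StrongP@ssw0rd!'
    WITH SUBJECT = 'Sign applsp_GetConnectionInfo';
ADD SIGNATURE TO dba.applsp_GetConnectionInfo
    BY CERTIFICATE ConnInfoCert WITH PASSWORD = 'StrongP@ssw0rd!';

-- copy the same certificate into master (BACKUP CERTIFICATE, then
-- CREATE CERTIFICATE ... FROM FILE there), create a login from it,
-- and grant the server-level permission to that login
USE master;
CREATE LOGIN ConnInfoCertLogin FROM CERTIFICATE ConnInfoCert;
GRANT VIEW SERVER STATE TO ConnInfoCertLogin;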
Is the MCDBA worth the time and money one has to put in to achieve it? I am talking in terms of getting a job. Do you think that after getting the MCDBA you have gained substantial knowledge about the subject?
I haven't appeared for any of the MS exams as yet, but I would like to start with 70-228 and 70-229.
I like reading BOL, but I guess for certification exams you have to take a different approach...
What is the point of SQL Server 2005 Express? On the Microsoft website (http://www.microsoft.com/sql/editions/express/default.mspx) it says: "SQL Server 2005 Express Edition is the next version of MSDE and is a free, easy-to-use, lightweight, and embeddable version of SQL Server 2005." When I asked my hosting company about using it, they said this: "SQL Express is a development platform so we cannot provide this in the production environment. We could however offer this as part of a dedicated service." For this they charge £15/month ($30). Am I supposed to use Access for small website development??
I hope this is the right forum to post my question in.
Does anyone know if or when Microsoft will discontinue support for 2000 DTS Legacy packages within SQL Server 2005? We're in the process of migrating from 2000 to 2005 and have a lot of packages to migrate to Integration Services. We're having problems migrating the packages because quite a few of them are complex and won't run after migrating to IS without a major rewrite. Right now it seems that those packages will run just fine under the Legacy folder on the 2005 instance with the data connections pointed to 2005 databases. We'd just like to plan appropriately if we absolutely have to migrate those packages to IS at some point soon. We realize that we'll need to create new packages using IS, though.
So I'd appreciate it if anyone who has heard anything would please let me know. I apologize if this has been asked before, but I couldn't seem to find any posts on it. Thanks!
Hi guys, I want to know whether Cold Fusion 6.1 is supported on a 64-bit server. My web application server is 32-bit with Cold Fusion 6.1, and my SQL Server is running 64-bit (SQL Server 2005). Is there any problem if I set up a data source pointing to the SQL Server? Has anyone tried this before with Cold Fusion? Hope I can get a reply here ASAP. Thanks a lot.