Unfortunately I have two tables where usernames are stored. In one table the column is called 'remote_U', in the other 'remote_u' ... note the upper case.
So I want to generate a list of all usernames and retrieve them in a single column,
like
SELECT DISTINCT remote_U FROM table1 APPEND (SELECT DISTINCT remote_u FROM table 2)
I have tried joins and groups, but it is not working. Can I also apply an alphabetical order after appending the second select? DISTINCT is needed, because one user may have more than one entry in a table, but may also be in both tables...
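For reference, a minimal sketch of the statement this usually needs (UNION rather than APPEND, which is not a SQL keyword): the two selects are stacked into one column, UNION by itself removes duplicates across both tables, and a single ORDER BY at the end sorts the combined list. The table names table1/table2 are taken from the example above.

SELECT remote_U AS username
FROM table1
UNION
SELECT remote_u
FROM table2
ORDER BY username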
Hiya everyone, I have two tables in SQL 2000. I would like to append the contents of TableA to TableB. TableA has around 1.1 million records. TableB has around 1 million records. Basically TableA has all of the data held in TableB plus 100,000 additional records. I would only like to import or append these new additional records. I have a unique index already set up on TableB. Any ideas, pretty pretty please? Paul. P.S. (Have been messing around with DTS but get a unique violation error - which is kinda what I want, I guess, but I would like SQL to ignore the error and only copy the new data - if only.)
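A minimal sketch of one common way to do this, assuming the two tables share a key column (called KeyID here purely for illustration): insert only the rows from TableA whose key is not yet present in TableB, so the unique index is never violated.

INSERT INTO TableB (KeyID, Col1, Col2)          -- list the real column names here
SELECT a.KeyID, a.Col1, a.Col2
FROM TableA AS a
WHERE NOT EXISTS (SELECT 1 FROM TableB AS b WHERE b.KeyID = a.KeyID)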
I have numerous entries with the same id name belonging to the same median number. However, I want to retain only the entries having the longest first and end position and discard the remaining entries.
E.g. for id name = "PSK_30s1207681L002" AND median = 5 we have four entries.
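A minimal sketch of one way to do this, assuming a table called positions with columns id_name, median, first_pos and end_pos (all names invented for illustration), interpreting "longest" as the widest span between first and end position, and assuming a DBMS with window functions:

SELECT id_name, median, first_pos, end_pos
FROM (
    SELECT id_name, median, first_pos, end_pos,
           ROW_NUMBER() OVER (PARTITION BY id_name, median
                              ORDER BY end_pos - first_pos DESC) AS rn
    FROM positions
) AS ranked
WHERE rn = 1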
I am very new to SQL coming from a SAS background.
I work at a College where I am trying to create a table that basically stores application data so that I can record how many applications we had on a set day.
For example, I want to run it, say, bi-weekly and create a date/time stamp for each row; then the next time it runs, it appends to the original table but with the new date/time stamp.
I have managed to create the table and the date/time stamp but I'm not sure how to do the next stage.
This is not the best practice as the table will get "big" quite quickly (although we're only dealing in the hundreds in terms of rows). Ideally I'd have a more transactional database that only appends a record if there is a change i.e. the status of the application has changed from "Received" to "Interview" or similar.
I'm not sure how to copy and paste the code dynamically in this forum but this is what I have got so far (see below).
SELECT CC.cc_name as Academy, SUBSTRING(CC2.cc_reference,1,4) as Cost_Centre, MD2.m_reference as Course, MD2.m_name as Course_Name, CAST(MD2.m_start AS DATE) as Start_Date,
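A minimal sketch of the append step, using a hypothetical snapshot table and a placeholder source name (in practice the SELECT shown above supplies the rows):

-- Hypothetical snapshot table: the reported columns plus a run timestamp.
CREATE TABLE ApplicationSnapshot (
    Academy      varchar(100),
    Cost_Centre  varchar(10),
    Course       varchar(50),
    Course_Name  varchar(200),
    Start_Date   date,
    SnapshotDate datetime
)

-- Each scheduled run appends the current picture of the data with its own timestamp.
INSERT INTO ApplicationSnapshot (Academy, Cost_Centre, Course, Course_Name, Start_Date, SnapshotDate)
SELECT Academy, Cost_Centre, Course, Course_Name, Start_Date, GETDATE()
FROM Applications   -- placeholder: the existing SELECT above goes here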
I need help in generating a stored proc script (SQL Server 7.0) that will:
document the changes of each field in a database (SQL Server 7.0), appending the deltas (changes) for each affected field to the database on a different server (say, production);
prompt the user (DBA, developer, super user) for the fields to involve in the comparing and documenting of changes (deltas).
thanks in advance for your help!
ALSO: Does SQL Server 2000 handle this situation better than SQL Server 7.0?
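SQL Server 7.0 (and 2000) has no built-in delta tracking, so this is usually done with per-table audit triggers that write old and new values to an audit table; pushing those rows to another server then needs a linked/remote server, and prompting for the fields would mean generating the trigger code dynamically. A minimal sketch of the core idea only, with invented table and column names (Customers.Phone):

CREATE TABLE FieldAudit (
    AuditID   int IDENTITY(1,1) PRIMARY KEY,
    TableName varchar(128),
    FieldName varchar(128),
    KeyValue  varchar(50),
    OldValue  varchar(255),
    NewValue  varchar(255),
    ChangedAt datetime DEFAULT GETDATE()
)
GO

CREATE TRIGGER trg_Customers_Audit ON Customers
FOR UPDATE
AS
IF UPDATE(Phone)    -- one block like this per field being tracked
    INSERT INTO FieldAudit (TableName, FieldName, KeyValue, OldValue, NewValue)
    SELECT 'Customers', 'Phone', CAST(d.CustomerID AS varchar(50)), d.Phone, i.Phone
    FROM deleted d
    JOIN inserted i ON i.CustomerID = d.CustomerID
    WHERE ISNULL(d.Phone, '') <> ISNULL(i.Phone, '')
GO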
Dear All, I am using MS SQL Express Server. I have installed all tools available on the Express Edition site. Now I have created my database on it. I have imported a table from my MS Access database (using an ODBC data source). This table contains 10,000 records. Now I want to append one more Access table (5,500 records) to the existing table, which has the same fields. How do I do this? Can anybody tell me? Thanks and regards, Mukesh
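A minimal sketch, assuming the second Access table has also been imported (or linked) into the same database as, say, dbo.ImportedTable2, and the existing table is dbo.ExistingTable (both names are placeholders):

INSERT INTO dbo.ExistingTable (Field1, Field2, Field3)   -- list the shared columns explicitly
SELECT Field1, Field2, Field3
FROM dbo.ImportedTable2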
We have a need to report on historical data when none exists in the database.
Create a tabular report rdl that is run on a regular schedule. This report run is saved as an Excel sheet that overwrites the previous run, which has the same name.
The report is designed to use two data sources: the database and the previously run Excel sheet. The data is "merged" using the Lookup function, thereby creating a new report that includes the history needed.
I created an SSIS package which exports data from an OLE DB source to a flat file (CSV format). For this I have an OLE DB source and a Flat File destination. I generate the file and filename dynamically, with the column names in the first row. So if the dynamically generated file name already exists, I want to append the data to that existing file, but I don't want to append the column names again. I just want to append the new rows to the existing rows.
So let's say the first time I generate a file called File1_3132008.csv:
Col1, Col2
1,2
3,4
After some days, if my SSIS package generates the same file name, i.e. File1_3132008.csv, this time I just want to append the rows to the existing file. So the file should look like this:
Col1, Col2
1,2
3,4
5,6
7,8
But instead my file looks like this if I set the Overwrite property to false:
Col1,Col2
1,2
3,4
Col1,Col2
5,6
7,8
Can anyone help me to get the file into the format shown in the highlighted example above?
I need to append all the data from the LIVE environment into the DEV environment. I have tried using MS Access (linking tables and importing tables, then running an APPEND query to copy the rows from LIVE to DEV), but the PKs and FKs are causing problems as some data will have the same IDs...
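One common workaround is to offset the conflicting keys while copying, so the LIVE rows slot in above the existing DEV IDs. A minimal sketch, assuming LIVE and DEV are databases on the same server and using placeholder Customers/Orders tables where CustomerID is the clashing identity key:

DECLARE @offset int
SELECT @offset = ISNULL(MAX(CustomerID), 0) FROM DEV.dbo.Customers

SET IDENTITY_INSERT DEV.dbo.Customers ON
INSERT INTO DEV.dbo.Customers (CustomerID, CustomerName)
SELECT CustomerID + @offset, CustomerName
FROM LIVE.dbo.Customers
SET IDENTITY_INSERT DEV.dbo.Customers OFF

-- Shift the foreign key by the same offset so parent and child rows stay matched.
INSERT INTO DEV.dbo.Orders (CustomerID, OrderDate, Amount)
SELECT CustomerID + @offset, OrderDate, Amount
FROM LIVE.dbo.Orders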
Is the Merge method what will work in this case? I have two DataTables with the exact same structure. How can I append the rows from table 2 onto the bottom of table 1? Is looping through the rows collection the only way?
I am trying to append the current row ID to a string I am trying to insert via a sproc. I have retrieved the @@Identity and I am passing it into a class with a parameter and calling it using:
Listings.UpdateDB AddNewListing = new Listings.UpdateDB();
I know this must be simple, but I am stumped. Please help!
I am writing a stored procedure in SQL 2000 where an incoming variable is a string of characters (a couple of sentences) and I want to add that to the existing string of characters in a table field called "Comments".
I do not know how to append the text in a field. How is that best done?
The basic function of the procedure is to take whatever string is passed to it and append it to the current contents of the field "Comments". As the procedure is run over and over again, the field is constantly appended with the incoming text.
What is the best way to do this? Can anyone give me an example?
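Assuming the Comments column is a varchar (if it is the old text data type, UPDATETEXT would be needed instead in SQL 2000), a minimal sketch of such a procedure; the table and key names here are placeholders:

CREATE PROCEDURE dbo.AppendComment
    @RecordID int,
    @NewText  varchar(2000)
AS
UPDATE dbo.Records
SET Comments = ISNULL(Comments, '') + ' ' + @NewText   -- ISNULL so a NULL field does not swallow the append
WHERE RecordID = @RecordID
GO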
I have two tables that have the same column names, data types and lengths. The only difference is that one is the USA (COUNTRY) and the other is International (COUNTRY). I want to make these two tables into one table. I don't think that a "UNION" will do that on a permanent basis. What other options do I have?
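A minimal sketch of two options, using placeholder names USA_Table and Intl_Table: either build a brand-new permanent table from both, or append one table into the other and drop the leftover.

-- Option 1: create a new combined table from both sources.
SELECT *
INTO dbo.Combined_Table
FROM dbo.USA_Table
UNION ALL
SELECT *
FROM dbo.Intl_Table

-- Option 2: append the international rows into the USA table.
INSERT INTO dbo.USA_Table
SELECT *
FROM dbo.Intl_Table

Explicit column lists are safer than * if the column order could differ between the two tables.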
We're running SQL 6.5 SP3, and we recycle our SQL Server every day. Somehow, starting last December, the errorlog kept appending to the previous one without starting a new log, and it keeps on growing. Does anyone know where I should look? If SQL behaves properly, it should start a new log after each recycle. I checked on TechNet; this problem may occur in 4.2, but I haven't seen anything for 6.5... Thanks, Anthony
I need to make sure I'm doing this correctly, can you help me out, guys, please? This is an appending stored procedure: it should move values from the EmployeeGamingLicense table to the GCEmployeeTerms table when the status is changed to TERMINATED. Here's what I have so far; I'm having problems with the rest of the script and getting errors.
AS
INSERT INTO [CommissionEmployee_Exclusionsdb].[dbo].[GCEmployeeTerms]
    ([TM #], [FirstName], [LastName], [SocialSecurityNumber], [DateHired], [Status],
     [TerminationDate], [Title], [DepartmentName], [TermReason], [VoluntaryInvoluntary])
SELECT @TM_#, @FirstName, @LastName, @SocialSecurityNumber, @DateHired, @Status,
       @TerminationDate, @Title, @DepartmentName, @TermReason, @VoluntaryInvoluntary
FROM EmployeeGamingLicense
WHERE STATUS = 'TERMINATED'
GO
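As originally posted, the SELECT list was wrapped in parentheses (which is invalid syntax), and even with that fixed it inserts the parameter values once per TERMINATED row rather than the rows themselves. If the intent is to copy the terminated employees' own data across, the shape is more like the sketch below; the source column names are assumptions, since only the destination columns appear in the post:

INSERT INTO [CommissionEmployee_Exclusionsdb].[dbo].[GCEmployeeTerms]
    ([TM #], [FirstName], [LastName], [SocialSecurityNumber], [DateHired], [Status],
     [TerminationDate], [Title], [DepartmentName], [TermReason], [VoluntaryInvoluntary])
SELECT [TM #], FirstName, LastName, SocialSecurityNumber, DateHired, Status,
       TerminationDate, Title, DepartmentName, TermReason, VoluntaryInvoluntary
FROM EmployeeGamingLicense
WHERE Status = 'TERMINATED'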
DTS wizard is not allowing me to append the data to a text file. Every time I run DTS and choose the destination to be this text file (say A.txt), it overwrites the data. I have a table whose data I am dumping to a text file. I truncate the table, then get the data again into it and want to append it to the same text file. But I end up overwriting the text file with the new data.
Hello all! I have a DTS package exporting a text file. I would like the DTS job to append to the end of the file each time it runs, rather than overwriting it. Is there a simple solution for this?
When I create a db backup on our network using BACKUP DATABASE...
BACKUP DATABASE [TKKCommonData] TO DISK = N'G:\SQL_BACKUPS\TKKCommonData\TKKCommonData_DATA.bak' WITH NOFORMAT, NOINIT, NAME = N'TKKCommonData_DATA-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
I've specified NOINIT so that it appends to rather than overwrites the backup set; however, the backup is still overwritten. Any idea how to get the backup to append to the set rather than overwrite it?
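One way to check what is actually happening, assuming the same file path as above: list the backup sets stored in the file after a couple of runs. With NOINIT each run should add one more row here; if it does not, something else (for example a maintenance plan or job writing to the same file with INIT or FORMAT) is re-initialising it.

RESTORE HEADERONLY
FROM DISK = N'G:\SQL_BACKUPS\TKKCommonData\TKKCommonData_DATA.bak'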
I have subscription records for five different magazine titles that i process by looping each one though a dataflow using a for-each loop. I am using the following instructions to append to the raw file: http://blogs.conchango.com/jamiethomson/archive/2005/12/01/2443.aspx
This works fine when I pass four different magazine titles. when i try to run all of the titles(five), i get the following errors:
[Raw File Destination [131195]] Warning: The parameter is incorrect. [DTS.Pipeline] Error: component "Raw File Destination" (131195) failed the pre-execute phase and returned error code 0x80070057.
Currently I have a maintenance plan running a Full backup weekly, differential backups nightly, and log backups hourly. The log backups are all going into a single backup file - but it's hard to see what's going on behind the scenes here.
Does this file get 'reset' when the full backup is performed? Will it just keep growing indefinitely and should I be creating new files for each log backup, or manually deleting the file each week during the full backup task?
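The log backup file is not reset by the full backup; it is only reinitialised if something backs up to it WITH INIT (or FORMAT), so with the default append behaviour it will keep accumulating one backup set per hourly run. If separate files per log backup are preferred, a minimal sketch of an hourly log backup to a timestamped file (path and database name are placeholders):

DECLARE @file nvarchar(260)
SET @file = N'G:\SQL_BACKUPS\MyDatabase_log_'
          + CONVERT(nvarchar(8), GETDATE(), 112)                            -- yyyymmdd
          + N'_' + REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), ':', N'')  -- hhmmss
          + N'.trn'

BACKUP LOG [MyDatabase] TO DISK = @file WITH INIT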
(SQL 7 on NT Server) I want to append a table with 1200 rows, using DTS. While I know it is better to do while no one is using the database, exactly what impact will it have if I do this while the database is online? Which leads me to my next question: Exactly what operations can I do while the db is online, and what ones should I not even think of. Most of my needs are data imports and exports. I haven't found much in Online Books about this. Any help would be appreciated very much.
I would like to update a field that already has data in it, and I don't want to overwrite the existing text. Here is my existing statement:
UPDATE wr SET cf_notes = " + tmp_array(24) + " WHERE wr_id = " + data_temp(0)
I would like to set cf_notes to cf_notes + tmp_array(24). Is this possible in SQL? If so, what is the correct syntax? I have tried six different statements and I get a compile error on every one.
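On the SQL side this is just concatenation in the SET clause, so the generated statement needs to look something like the sketch below; quoting and escaping the value from tmp_array(24) remains the client code's job and is glossed over here:

UPDATE wr
SET cf_notes = ISNULL(cf_notes, '') + ' appended text goes here'
WHERE wr_id = 123   -- 123 stands in for the value from data_temp(0)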
Hey all, I am currently backing up a SQL 2000 database by doing full backups every night. I would like to cut this back to once a week and do differential backups during the week. But... how can I do differential backups without appending to or overwriting the night before's backup? I'm sure this can be done through the command line, but I'm not exactly sure of the process. If anyone can help, please post...
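A minimal sketch of a nightly differential that goes to its own dated file, so it neither appends to nor overwrites the previous night's backup (path and database name are placeholders):

DECLARE @file nvarchar(260)
SET @file = N'D:\Backups\MyDatabase_diff_'
          + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak'   -- yyyymmdd in the file name

BACKUP DATABASE [MyDatabase] TO DISK = @file WITH DIFFERENTIAL, INIT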
I am having a problem with a DTS package I am writing. The package is getting text from a table and putting it out to a text file. I have most of it sorted, and the file contains text like this:
PN,GEN,T_KI-3,2006-01-24 00:30,480,2006-01-24 01:00,480
PN,GEN,T_KI-2,2006-01-24 00:40,484,2006-01-24 02:00,482
PN,GEN,T_KI-3,2006-01-24 00:50,490,2006-01-24 05:00,486
What I need is a line at the end of the text which is: <EOF>
So the output from this will look like:
PN,GEN,T_KI-3,2006-01-24 00:30,480,2006-01-24 01:00,480
PN,GEN,T_KI-2,2006-01-24 00:40,484,2006-01-24 02:00,482
PN,GEN,T_KI-3,2006-01-24 00:50,490,2006-01-24 05:00,486
<EOF>
I am having problems adding the line at the end of the text file. Can anyone advise me of a good solution or show me where I am going wrong?
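One approach that keeps everything inside the source query itself is to tack the marker on as an extra row, with an ordering column so it always comes last. This is only a sketch and assumes the export is driven by a query against a single text column; the table and column names are invented:

SELECT line_text
FROM (
    SELECT line_text, 0 AS sort_order FROM dbo.ExportLines
    UNION ALL
    SELECT '<EOF>', 1                     -- marker row, forced to the end
) AS combined
ORDER BY sort_order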
Now I want to create usernames from #test1 by taking the first character of the first name plus the last name, and if the same combination is found, appending 01.
For example, if #test1 contains data as below:
1, Abhas, Pawar
2, Arun, Pawar
3, Ashis, Panday
Then I want to create usernames like:
apawar
apawar01
apanday02
But if the same username already exists in #test2, then I want to insert records as below:
First, apawar will be checked in #test2; if it does not exist, it is inserted as-is.
If apawar01 exists in #test2, then create apawar02 instead of apawar01;
for the next one create apawar03 and insert, and so on...
In brief, I want to check each created username against #test2, and if it already exists, use the lowest-numbered suffix that is still available and insert with that.
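A minimal sketch of one way to do this, assuming #test1 has columns (id, fname, lname) and #test2 has a column username (the column names are assumptions). It walks #test1 row by row, builds the base username, and bumps a numeric suffix until the candidate is free in #test2; because each new username is inserted into #test2 straight away, duplicates within #test1 itself also get consecutive suffixes, as in the apawar / apawar01 example above.

DECLARE @id int, @base varchar(100), @candidate varchar(110), @n int

DECLARE person_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT id, LOWER(LEFT(LTRIM(fname), 1) + LTRIM(lname))
    FROM #test1
    ORDER BY id

OPEN person_cur
FETCH NEXT FROM person_cur INTO @id, @base

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @candidate = @base
    SET @n = 0

    -- Bump the suffix (01, 02, ...) until the username is not taken in #test2.
    WHILE EXISTS (SELECT 1 FROM #test2 WHERE username = @candidate)
    BEGIN
        SET @n = @n + 1
        SET @candidate = @base + RIGHT('0' + CAST(@n AS varchar(10)), 2)
    END

    INSERT INTO #test2 (username) VALUES (@candidate)

    FETCH NEXT FROM person_cur INTO @id, @base
END

CLOSE person_cur
DEALLOCATE person_cur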