We have a SharePoint list which has, say, two columns: Column A and Column B.
Column A can have three values - red, blue & green.
Column B can have four values - pen, marker, pencil & highlighter.
A typical view of the list could be:
Column A - Column B
red - pen
red - pencil
red - highlighter
blue - marker
blue - pencil
green - pen
green - highlighter
red - pen
blue - pencil
blue - highlighter
blue - pencil
We are looking to create a report from the SharePoint list using SSRS which has the following view:
            red  blue  green
pen          2    0     1
marker       0    1     0
pencil       1    3     0
highlighter  1    1     1
We tried using Sum but were not able to display the counts in a single row.
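For illustration, if the list data were staged into a SQL table first, a conditional-aggregation query could produce this cross-tab shape (the table and column names dbo.ListData, ColumnA, and ColumnB are assumptions, not part of the original list):

Code:
-- Count each ColumnB value per ColumnA value; the staging table name is a placeholder.
SELECT
    ColumnB AS Item,
    SUM(CASE WHEN ColumnA = 'red'   THEN 1 ELSE 0 END) AS red,
    SUM(CASE WHEN ColumnA = 'blue'  THEN 1 ELSE 0 END) AS blue,
    SUM(CASE WHEN ColumnA = 'green' THEN 1 ELSE 0 END) AS green
FROM dbo.ListData
GROUP BY ColumnB;

In SSRS itself, the equivalent would be a matrix (tablix) with Column B on the rows, Column A on the columns, and Count rather than Sum as the aggregate.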
I need to report on data from several databases on several servers. They are all SQL Server 2005 databases. I am trying to create an Integration Services task to consolidate and transform this data for easy reporting. The problem I am having is with one database in particular. It has tables like this:
I want to use only the tables of the form "tblLookupParseData*" for this list. I can do this in stored procedures by dynamically building up the SQL query, but I cannot find out how to do this in Integration Services. When I make Data Source Views, they seem to expect me to pick from a list of known tables, and this list of tables grows as customers are added to the system.
NOTE: The way the tables are structured was NOT my idea. I hate storing "data" in the structure of the database. Many people also do this when they create "period" tables such as "CustomerData_05_2005". It speeds up writing the data and querying a specific table, but it is a nightmare for reporting. I cannot change this, as it is not my responsibility.
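A minimal sketch of the dynamic-SQL approach mentioned above, written so the generated statement could also be handed to an OLE DB Source as "SQL command from variable" (it assumes every tblLookupParseData* table shares the same column layout):

Code:
-- Build a UNION ALL over every table whose name starts with tblLookupParseData.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql
    + CASE WHEN @sql = N'' THEN N'' ELSE N' UNION ALL ' END
    + N'SELECT * FROM ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name)
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.name LIKE 'tblLookupParseData%';

EXEC sp_executesql @sql;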
I am writing my first custom SSIS task. I can see that if I put a public property into the task, that property appears in the standard Properties window. If I add a property of type String, I can give it a value in the task code, and when I instantiate the task in a package, I can see the value I entered.
To try to get a drop-down list property in the Properties window, I declared a property of type ComboBox, and indeed a drop-down list appears for that property.
My problem is getting values into that property. I have used the test items:
I need to move specific files from one server to another on a monthly basis. There are hundreds of files in the source directory, and I need to move approximately 40 of those to the destination server. I would like to be able to easily add to or remove from the file list as needed. I have seen setups where a variable was created for each file name (and one for the path) and a ForEach Loop went through them. With 40 or more, I was thinking I could connect to an Excel spreadsheet or text file with a record for each file name, read a record in, make that value the content of a "FileName" variable, and then move on to the next record. Then, if I wanted to add another file name, I could just add (or remove) a record in the spreadsheet/text file and the package would handle it automatically.
I have a task in which I have to load a CSV file from a shared directory into a SQL table. Right now I'm stuck on a roadblock: the shared drive contains all the history files as well, and I have to pick only the latest file. I cannot identify the latest file based on the file name, because the name doesn't contain any date. However, by looking at the file properties I can tell which file is the latest.
Sample file name: XXX_XXX_XXX_XXX_XXX-5814201.csv
Is there any way we can automate this in SSIS, using the file properties to pick the latest one?
I want to split each record into multiple columns. The problem is that some records need to be split into only one column, while others may need to be split into more. I also need to remove the "/" characters. This is all dependent on where a "/" is found. I've been beating my head against this for a while and getting nowhere.
So:
create table #foo (myPK int, c1 nvarchar(425))

insert into #foo values (1,'/folder1')
insert into #foo values (2,'/lvl1/folder2')
insert into #foo values (3,'/folder1/lvl2/folder3')
insert into #foo values (4,'/f1/folder2/lvl3/fldr4')
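One possible set-based approach, assuming at most four path segments and no '.' characters in the data (PARSENAME is repurposed here as a splitter, so those limits apply):

Code:
-- Split c1 on '/' into up to four columns and drop the delimiters.
SELECT
    myPK,
    PARSENAME(p, cnt)                                 AS col1,
    CASE WHEN cnt >= 2 THEN PARSENAME(p, cnt - 1) END AS col2,
    CASE WHEN cnt >= 3 THEN PARSENAME(p, cnt - 2) END AS col3,
    CASE WHEN cnt >= 4 THEN PARSENAME(p, cnt - 3) END AS col4
FROM (
    SELECT
        myPK,
        REPLACE(STUFF(c1, 1, 1, ''), '/', '.') AS p,   -- drop leading '/', swap '/' for '.'
        LEN(c1) - LEN(REPLACE(c1, '/', ''))    AS cnt  -- number of segments
    FROM #foo
) AS x;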
I have an Excel sheet containing one column (ID_NO) with 400K rows. I have to fetch some other columns for those IDs from a Netezza database. Initially I tried hardcoding all 400K values into the query I wrote, using the filter WHERE ID IN ('1212','2334'). But after pasting all 400K values, the query runs indefinitely.
I have imported all the IDs into a SQL table (MY_LIST). I used a DFT, selected an ODBC source, and selected my Netezza server. Then in the 'Data access mode' I selected SQL command from the dropdown and pasted the same query that I wrote for Netezza. Is there any way to pull only those records whose IDs are in my SQL table (MY_LIST)?
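One pattern that avoids the giant IN list: land the Netezza extract in a SQL Server staging table, then keep only the IDs present in MY_LIST; a Lookup transformation against MY_LIST inside the data flow achieves the same filter. A rough sketch of the join, with all names assumed:

Code:
-- Keep only the staged Netezza rows whose ID appears in MY_LIST.
SELECT s.*
FROM dbo.Staging_NetezzaExtract AS s
WHERE EXISTS (SELECT 1
              FROM dbo.MY_LIST AS l
              WHERE l.ID_NO = s.ID);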
I have implemented a package to load multiple files to a destination. Since the source was a .txt file, I created it as a flat file source. However, we are now getting files in Excel format as well.
Is there any way the source can be changed dynamically based on the file extension output by the Foreach File enumerator? One solution I can think of is to have two Data Flow Tasks gated by precedence constraints with expressions: one for .txt and the other for .xls.
I will be receiving a CSV daily where columns within the file will change. The column order and number of columns can change daily. I need a way to read in the header from the csv and create a flat file connection that reflects the columns listed in the header.
Is there an easy way to do this using a script task? I have already read the header into a table but I have been unable to create the dynamic file connection.
I am new to SSIS and C#. In SQL Server 2008 I am importing data from a .csv file. The columns are dynamic; there can be around 22 of them (sometimes more or less). I created a staging table with 25 columns and import the data into it. In essence, each flat file that I import has a different number of columns, but they are all properly formatted. My task is to import all the rows from a .csv flat file, including the headers. I want to put this in a job so I can import multiple files into the table daily.
So inside a Foreach Loop I have a Data Flow Task, within which I have a Script Component. I came up with the C# code below (from research online), but I get the error "Index was outside the bounds of the array." I tried to find the cause using a MessageBox and found that it reads the first line, and the index goes outside the bounds of the array after the first line.
My File1Conn is the flat file connection; instead, I want to read the file path directly from the variable User::FileName.
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
using System.Windows.Forms;
using System.IO;
TABLE_NAME  DESC    CODE
tab1        table1  A
tab1        table1  B
tab1        table1  C
tab2        table2  D
tab2        table2  E
tab2        table2  G
...
The first column's values are table names that already exist in the target database. The next two columns, [Desc] and [Code], contain the data to be populated from the CSV file into those tables.
In this scenario, how do I load the tab1 rows into the tab1 table in the destination, and so on?
Which approach would be more standard for accomplishing this task? If it's a Script Task using C#, I am looking for a clear script that identifies when the value in the first column changes.
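If a Script Task feels heavy, one alternative is to stage the whole CSV into a single table and then fan the rows out with dynamic SQL. A sketch under the assumption that the staged table is dbo.StagingCsv with columns TABLE_NAME, [DESC], and CODE (all names are placeholders):

Code:
-- Generate one INSERT per distinct target table named in the first column.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql
    + N'INSERT INTO ' + QUOTENAME(TABLE_NAME) + N' ([DESC], CODE) '
    + N'SELECT [DESC], CODE FROM dbo.StagingCsv WHERE TABLE_NAME = '
    + QUOTENAME(TABLE_NAME, '''') + N';' + NCHAR(10)
FROM (SELECT DISTINCT TABLE_NAME FROM dbo.StagingCsv) AS t;

EXEC sp_executesql @sql;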
I have a requirement wherein I have to set up the flat file connection manager to accept columns on the fly, meaning I want to retrieve the list of columns (the column count) from the database when the package runs and configure the connection manager with that many columns.
I have a situation where I want to load Excel files dynamically, and the Excel files have different columns or even different worksheet names. How could I approach this? I believe there's no way to modify the metadata (specifically the mappings) in the data flow.
I have a text file which has 7 rows, and I want to insert the data into a SQL table using SSIS. In the text file we have a column whose values are Y or N, and I want to take only the rows which are Y, so we have only 6 rows for the SQL table. The SQL table does not have the column with Y or N.
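If the file is first staged as-is into a work table (including the Y/N column), the filter and the dropped column become a single INSERT...SELECT; the in-data-flow equivalent is a Conditional Split on the Y/N column. All names below are placeholders:

Code:
-- Keep only the 'Y' rows and load the remaining columns into the target table.
INSERT INTO dbo.TargetTable (col1, col2, col3)   -- the real column list, minus the flag
SELECT col1, col2, col3
FROM dbo.StagingFile
WHERE FlagColumn = 'Y';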
My requirement: the source database has 5 tables (Emp, Loc, Dept, Time, Product), and the destination is a single Excel file. How can each table's data be loaded dynamically into its own worksheet through an SSIS package?
I have a delimited text file with 650+ columns. The sum of the column lengths of a single row, if fully populated, exceeds 30K bytes. The "killer" fields lengthwise are the "Description" fields. If they were removed from the input file, the remaining columns would occupy about 5000 bytes, which is within SQL Server's maximum row length.
Can SSIS be used to create these two tables? (one without the description fields, the other with those fields but arranged vertically in the table rows).
The fundamental issue is that I cannot import a single file row into a SQL table, because that row's length could exceed the maximum byte count for a row.
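One way to picture the second ("vertical") table: key it by the record identifier plus the description field's name, and store the long text in a varchar(max) column so it can live off-row. All names here are assumptions:

Code:
-- Core table holds the ~5000 bytes of non-description columns (not listed here).
CREATE TABLE dbo.RecordCore (
    RecordID int NOT NULL PRIMARY KEY
    -- , ... the remaining non-description columns
);

-- Vertical table: one row per record per description field.
CREATE TABLE dbo.RecordDescription (
    RecordID        int          NOT NULL REFERENCES dbo.RecordCore (RecordID),
    DescriptionName varchar(128) NOT NULL,  -- which of the original description columns this is
    DescriptionText varchar(max) NULL,      -- the long text itself
    CONSTRAINT PK_RecordDescription PRIMARY KEY (RecordID, DescriptionName)
);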
I have multiple XML data files in a directory, say C:\XMLData: abc1.xml, abc2.xml, abc3.xml, etc.
I need to loop through each file in SSIS with a Foreach Loop container, get the file name (say abc1), and load the data from abc1.xml into the abc1 table in the SQL Server DB.
The next iteration will pick up abc2.xml, find the abc2 table in the SQL Server DB, and insert the data into the abc2 table.
On each iteration, the XML source should also point to the corresponding XSD file.
The tables are already created in the DB.
I have solved the problem up to getting the file name from each iteration and assigning it to a variable; in the OLE DB destination's data access mode I select "Table name or view name variable", so the corresponding table gets selected for data insertion.
I just want to know how I can point to the matching XSD file for each XML data file during iteration.
I want to write a stored procedure where I pass column names as parameters and get results based on that. For example, if I pass the parameters col3 and col5 where id = 1, then I should get the result:
id  col3  col4  col5
1   3     4     5
and if I pass the input as col2 and col6 where id = 3, the result should be:
id  col2  col3  col4  col5  col6
3   4     8     2     6     9
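A hedged sketch of the dynamic-SQL route, assuming the source table is dbo.MyTable and the two parameters bound a contiguous range of columns (as in the examples above):

Code:
CREATE PROCEDURE dbo.GetColumnRange
    @FromCol sysname,
    @ToCol   sysname,
    @Id      int
AS
BEGIN
    DECLARE @cols nvarchar(max), @sql nvarchar(max);

    -- Build the column list between the two named columns, in table order.
    SELECT @cols = STUFF((
        SELECT ', ' + QUOTENAME(c.name)
        FROM sys.columns AS c
        WHERE c.object_id = OBJECT_ID('dbo.MyTable')
          AND c.column_id BETWEEN COLUMNPROPERTY(OBJECT_ID('dbo.MyTable'), @FromCol, 'ColumnId')
                              AND COLUMNPROPERTY(OBJECT_ID('dbo.MyTable'), @ToCol,   'ColumnId')
        ORDER BY c.column_id
        FOR XML PATH('')), 1, 2, '');

    SET @sql = N'SELECT id, ' + @cols + N' FROM dbo.MyTable WHERE id = @Id;';
    EXEC sp_executesql @sql, N'@Id int', @Id = @Id;
END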
I've been working on an SSIS package trying to load some data and the archive sequence is faulty. I've been trying to load a few tables created in a previous sequence into a local archive file and I've been getting the error "Could not find a part of the path."
The results aren't telling me what it was looking for last, so I don't know where to start.
And the source DOES have data in it. It's something between the source and the destination.
I have 3 different companies that share the same ticket_types (CRM system). I need to display the ticket types and each of the 3 companies' ticket counts:
Ticket Type | Company A Count | Company B Count | Company C Count
I can get the information individually for each company, but if a company doesn't have a ticket in one of the ticket_types, then it isn't displayed in a row. So, I tried to write the following, which isn't pulling back any data.
DECLARE @startdate date = '20150306'
DECLARE @enddate date = '20151031'
DECLARE @AcctGrp varchar(20) = '111'

;WITH TType AS
(
    SELECT ctp.description as TicketType
[Code] .....
If I run each SELECT individually from above (excluding the last SELECT), it works and I get the following:
TicketType
AR Request
Credit Availability/Rush
Cancel Order
Credit Card Payment
Expedite Order
Freight Quote
[Code] ...
How do I get the query results I need? Am I even close to getting it right?
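For reference, the usual fix for "missing" ticket types is to drive the query from the ticket-type table and LEFT JOIN the tickets, counting conditionally per company. A sketch with guessed table and column names (the real schema is in the elided code above):

Code:
SELECT
    ctp.description AS TicketType,
    COUNT(CASE WHEN t.company = 'A' THEN 1 END) AS CompanyA_Count,
    COUNT(CASE WHEN t.company = 'B' THEN 1 END) AS CompanyB_Count,
    COUNT(CASE WHEN t.company = 'C' THEN 1 END) AS CompanyC_Count
FROM dbo.ticket_types AS ctp
LEFT JOIN dbo.tickets AS t
       ON  t.ticket_type_id = ctp.ticket_type_id
       AND t.created_date BETWEEN @startdate AND @enddate
GROUP BY ctp.description
ORDER BY ctp.description;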
Hi friends, can anybody tell me how I can update a particular table via Integration Services? I just need to update some of the column values.
If I write queries, I would need to write approximately 35 UPDATE statements. So is there any way I can replace the existing data with my current data?
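One way to avoid ~35 separate statements: stage the new values in a table (the SSIS data flow can load it), then issue a single set-based UPDATE joined on the key. All names below are placeholders:

Code:
UPDATE tgt
SET    tgt.SomeColumn  = src.SomeColumn,
       tgt.OtherColumn = src.OtherColumn
FROM   dbo.TargetTable  AS tgt
JOIN   dbo.StagingTable AS src
       ON src.BusinessKey = tgt.BusinessKey;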
I have SQL Server 2012 SSIS with an Excel source and an OLE DB destination. I have a problem importing the CustomerSales column. CustomerSales has values like 1000.00, 2000.10, 3000.30, and NotAvailable, so decimal values and nvarchar values are mixed in one Excel column; this is a requirement of the solution. However, SSIS reads only the numeric values correctly and the nvarchar values are set to NULL. Why?
We have Visual Studio 2008 R2 with SP2 installed. Due to a merger we now have a MySQL database that we need to update from SSIS. Everything works except for the table insert or update. Would upgrading to SP3 or SP4 perhaps help with that?
We have installed the latest driver from MySQL and have tried the ADO.NET and ODBC drivers, with similar results when we try to update the database.
Hi all. I'm an Oracle DBA who's currently being asked to look at a SQL Server database. I need a list of columns per table, but am having trouble. I'll admit I might be being lazy here, but I'm in a hurry and using the valuable resources available to me! I would really appreciate the SQL I need to copy into the query window. Much obliged! I need:
Table A
    Column1 Datatype
    Column2 Datatype
Table B
    Column1 Datatype
    Column2 Datatype
etc.
Many thanks.
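A query along these lines should produce that listing (run it in the target database; ordering by ORDINAL_POSITION keeps the columns in table order):

Code:
SELECT
    c.TABLE_SCHEMA,
    c.TABLE_NAME,
    c.COLUMN_NAME,
    c.DATA_TYPE,
    c.CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS AS c
JOIN INFORMATION_SCHEMA.TABLES  AS t
     ON  t.TABLE_SCHEMA = c.TABLE_SCHEMA
     AND t.TABLE_NAME   = c.TABLE_NAME
WHERE t.TABLE_TYPE = 'BASE TABLE'
ORDER BY c.TABLE_SCHEMA, c.TABLE_NAME, c.ORDINAL_POSITION;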
May I know the SQL query to create another intermediate column that stores the sum of two columns for each and every record? Thank you.
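If the two columns live in the same table, a computed column is probably the simplest kind of "intermediate" column; a sketch with placeholder names:

Code:
-- TotalAmount is derived from the two source columns for every row.
ALTER TABLE dbo.MyTable
    ADD TotalAmount AS (Amount1 + Amount2) PERSISTED;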
There is a requirement to insert a new record, or update an existing one, in SharePoint. I am able to do inserts/updates to SharePoint if the data volume is minimal, i.e., hundreds of records.
link: [URL] ....
If there is more data (in the thousands), I get the error below while updating SharePoint. I tried keeping the batch size minimal but still get the same error. Is there any setting on SharePoint that can be changed to increase its update capacity?
Error details:
System.ServiceModel.ProtocolException: The content type text/html; charset=utf-8 of the response message does not match the content type of the binding (text/xml; charset=utf-8). If using a custom encoder, be sure that the IsContentTypeSupported method is implemented properly. The first 1024 bytes of the response were: ' <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
I have a situation in SSRS where I need to get the common values between two columns whose values are stored comma-separated, as below. Ex:
ColumnA: abc,cde,efg
ColumnB: cde,xyz,abc
The result in ColumnC should be: cde,abc
Similarly, ColumnA and ColumnB will have any number of records. I need to write an expression or a custom Code function to get the required result in ColumnC. I am using SharePoint lists as the data source, so I cannot write a SQL query to achieve this requirement.
The following returns all "varchar" columns from base tables within the database:
Code:
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
FROM mydb.information_schema.columns
WHERE TABLE_SCHEMA = 'master'
  AND TABLE_CATALOG = 'mydb'
  AND DATA_TYPE IN ('varchar')
  AND TABLE_NAME IN (
      SELECT TABLE_NAME
      FROM mydb.information_schema.tables
      WHERE TABLE_TYPE = 'BASE TABLE'
        AND TABLE_CATALOG = 'mydb'
        AND TABLE_SCHEMA = 'master')
What I then want to do is... For each of these results:
Code: select [COLUMN_NAME] from [TABLE_SCHEMA].[TABLE_NAME] WHERE ID = 'test'
Is it possible to do this in one SQL command? Or do I have to do it manually for each item in the list from my first query?
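A single static statement cannot take table and column names from data, but dynamic SQL can generate and run the whole batch in one go. A sketch that builds a UNION ALL from the first query's results (it assumes every matching table actually has an ID column, as in the template above):

Code:
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql
    + CASE WHEN @sql = N'' THEN N'' ELSE N' UNION ALL ' END
    + N'SELECT ''' + TABLE_SCHEMA + N'.' + TABLE_NAME + N'.' + COLUMN_NAME + N''' AS source, '
    + QUOTENAME(COLUMN_NAME) + N' AS value FROM '
    + QUOTENAME(TABLE_SCHEMA) + N'.' + QUOTENAME(TABLE_NAME)
    + N' WHERE ID = ''test'''
FROM mydb.information_schema.columns
WHERE TABLE_SCHEMA = 'master'
  AND TABLE_CATALOG = 'mydb'
  AND DATA_TYPE IN ('varchar')
  AND TABLE_NAME IN (
      SELECT TABLE_NAME
      FROM mydb.information_schema.tables
      WHERE TABLE_TYPE = 'BASE TABLE'
        AND TABLE_CATALOG = 'mydb'
        AND TABLE_SCHEMA = 'master');

EXEC sp_executesql @sql;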