AS Processing Task

Nov 27, 2005

When using the AS processing task with a connection to "an Analysis Services project in this solution", only some processing options are available for processing dimensions. For instance, it is not possible to select "Process Update". Once I change the connection manager to point to the deployed cube database, I can choose from all the options. Is this by design?

View 1 Replies



OLAP Services Processing Task

May 24, 2002

Hi Guys.

For SQL 2000, is there any add-in available for a DTS task to process OLAP cubes?

If not, how can I automate the processing?
Thanks in advance

-MAK

View 1 Replies View Related

Analysis Services Processing Task

Apr 9, 2008

Hi. I have an Analysis Services (2005) cube with four dimensions and one fact table (with three partitions: 2006, 2007, 2008) that I need to process via an SSIS package. I only want to process one of the three partitions (2008); the previous two years should remain unchanged.


This is what I have currently in the Analysis Services Processing Task under Processing configuration:
- An object for each of the dimensions with "Process Full."
- An object for the 2008 partition with "Process Full."

(Note - Under Process Options, I see only Process Default, Process Full, Unprocess, and Process Data for dimensions and partitions).


Batch settings are:
- Processing order: Sequential
- Transaction mode: All in one transaction
- Dimension errors: Ignore errors
- Process affected objects: Do not process


When I execute the package, the cube loses the 2006 and 2007 data.

I am assuming that I have an issue with the Process Option or the Batch Settings, and I would appreciate any guidance!


Thanks,
Marianne

View 6 Replies View Related

Analysis Services Processing Task

Mar 11, 2008



Hi all, here is my problem:

The last node of my workflow in SSIS is an Analysis Services Processing Task, which is supposed to fully reprocess a cube defined in a different project.

In the configuration I found the correct cube and settings for it, so I thought I wasn't going to have any problems. But then it started complaining about user and password information. Since the databases configured themselves when I added them, I assumed the same would happen with this task.

I do have my own user and password with permissions to reprocess the cube, although I thought Windows authentication would be better than setting up a user and password for the application/task.

I looked through the entire configuration pane and found no information regarding username and password.
Where should I set it up: my SSIS solution or the cube's solution?

This might be a newbie question, I'm not quite sure...

EDIT: Here is the error message:
[Analysis Services Execute DDL Task] Error: The following system error occurred: Logon failure: unknown user name or bad password. .

View 5 Replies View Related

Analysis Services Processing Task

Feb 21, 2008



Hello

I am trying to process an OLAP 2000 cube inside an SSIS project.
I am using the "Analysis Services Processing Task" object.
The Visual Studio project sits on the machine where Analysis Services 2000 is running, but I still get an error while establishing a connection to the Analysis Server.

Microsoft SQL Server 2005 is also installed on that machine.

The error is:
A connection cannot be made. Ensure that the server is running.

Does anybody have an idea why I get this error?

Thanks,

View 1 Replies View Related

RollBack Analysis Processing Task

Mar 7, 2008



Hi there,

I have an SSIS package that contains a Sequence Container with TransactionOption set to "Required". Within this sequence I placed several AS Processing Tasks and several SQL tasks. The TransactionOption of these tasks is set to "Supported".

My problem: when a SQL task fails on execution, all executed tasks are rolled back except the AS Processing Tasks. The expected and necessary behavior is that the AS Processing Tasks are rolled back as well.

Has anyone got a solution or a workaround for this problem?

Thanks.
Andi

View 3 Replies View Related

Tempdb Is Full When I Run DTS Task With OLAP Processing

Oct 20, 2006

Hi,

I have a problem:
Every day I automatically run a SQL Agent job with a DTS package that runs these tasks:
1. SQL Task: shrink the tempdb database
2. Analysis Services task: process the whole database
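
For reference, step 1 typically amounts to something like the following (a minimal sketch; "templog" is SQL Server's default logical name for the tempdb log file and is an assumption here, and the target size is in MB):

USE tempdb
GO
DBCC SHRINKFILE (templog, 1024)  -- shrink tempdb's log file back down to ~1 GB
GO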

For some time now the job has failed to run, with an error indicating that tempdb is full.
The error string is: "The log file for database 'tempdb' is full. Back up the transaction log for the database to free up some log space"

The tempdb file grows to 20 GB.
When I process the database manually in Analysis Manager, it processes correctly (because Analysis Manager does not use tempdb, but the temp folder).

What should I do in this case?
I shrink tempdb before every processing run, so backing up the transaction log will not help me.
Any suggestions?


I am using SBS 2000 Standard Edition (English) with the following components installed:
Active Directory
Exchange 2000
SQL 2000

The system has 4 GB RAM and 2 Xeon CPUs.


Thanks for any info.

Regards,

Dariusz Jankowski

View 2 Replies View Related

Where Is The 'Analysis Services Processing Task' Logging To

Apr 1, 2008

All,

I'm using an "Analysis Services Processing Task" as part of an SSIS package that refreshes the cube. In the task's property page, "LoggingMode" is set to "Enabled", but there are no records for it in the sysdtslog90 table, while all the other tasks are logged there. How do I get this task to log to the sysdtslog90 table?



Thanks in advance

Jessie

View 3 Replies View Related

Error - Analysis Services Processing Task

Jul 11, 2007

We have an Integration Services package that executes a few T-SQL tasks, then processes an Analysis Services database. This has been in production for about three weeks now, and twice the package has failed with this error from the event log:



Event Type: Error
Event Source: MSSQLServerOLAPService
Event Category: (289)
Event ID: 3
Date: 7/11/2007
Time: 1:48:59 AM
User: N/A
Computer:
Description:
OLE DB error: OLE DB or ODBC error: An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.; 08001; Communication link failure; 08S01; TCP Provider: An existing connection was forcibly closed by the remote host.; 08S01.
For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.



I don't think that this error is accurate because the package and Analysis Services are on the same server.



Also, this does not happen in our development environment. Any help is appreciated.



Thanks,

Brian

View 1 Replies View Related

OLAP Services Processing Task Error....(Urgent)

Apr 27, 2001

Hello SQL World,

I have created a DTS package which should process an incremental update of an OLAP cube; however, it is generating the following error message. HELP! Has anyone seen this before?

Error: -2147221499 (80040005); Provider Error: 0 (0)
Error string: Provider generated code execution exception: EXCEPTION_ACCESS_VIOLATION
Error source: Microsoft Data Transformation Services (DTS) Package
Help file: sqldts.hlp
Help context: 700


TIA,
Paul

View 1 Replies View Related

SSIS Analysis Services Processing Task - Big Problem!

Nov 1, 2007

I have an OLAP database "A" and an SSIS package "P" that processes all the dimensions and cubes in OLAP database "A".

I created OLAP database "A1" as a copy of "A", and made a copy of the "P" SSIS package as "P1".
I opened the "P1" package and updated the OLAP connection properties to "Initial Catalog = A1". A1 is my new OLAP database.

When I run package "P1", guess what? It processed the cubes and dimensions of OLAP database "A". Try it, though not in production, because I did it in production.

View 12 Replies View Related

Analysis Services Processing Task - Cube No Longer Updating

Oct 19, 2007



We have set up an IS package to process an AS 2005 database (comprising cube & dimensions, etc) daily, via a SQL Server Agent job on both development and production systems. This has been working fine for months.

A new dimension was added to the cube on the development system, and automatic processing via the IS package continued without issue. However, when the new dimension was added to the production system, the IS package no longer processed the cube correctly. Although everything appears OK (and all is present and correct in the logs), no data updates are made to the cube. Only when the cube is processed manually does it get updated.

Anyone got any ideas about how to get around this issue? We have created a new IS package with a single Analysis Services Processing Task and tried that, but we get the same outcome.

View 4 Replies View Related

SSIS-Analysis Services Processing Task- On Client Fails

May 14, 2008

I am trying to execute, from a client machine, an SSIS package that contains an Analysis Services Processing Task. The client does not have SSIS and SSAS installed. We are getting the error:

"The task 'Analysis Services Processing Task' cannot run on this edition of Integration Services. It requires a higher level edition."

The same package runs from a server that has both SSIS and SSAS installed. Let me know if someone has come across the same problem.

Thanks

View 1 Replies View Related

Analysis Services Processing Task: Logging And Error Handling

Mar 5, 2007

I have an Analysis Services Processing Task in my SSIS package. I run the SSIS package via a SQL Server Agent job; the package run is a job step.

When I process the Analysis Services objects (in practice, cubes) manually using the dtexec utility, I get plenty of log output. In case the processing fails, I get error messages that describe the error quite well. But when I run the job, the only information I get in the job log is that the job step failed. I know the failure happens in the Analysis Services Processing Task.

Is there any way in SSIS to get a) the log of the Analysis Services processing, or b) the error messages of the Analysis Services processing? Or should the processing be done some other way than I've been doing it?

View 4 Replies View Related

Problems With Connections And Analysis Service Processing Task In SSIS

May 30, 2008

Hi everybody,

I'm fairly new to the SSAS/SSIS world (though not new to databases, etc.) and I'm having some problems with the SSIS packages in our cube environment.

Currently in our SSAS/SSIS project we have two major connection managers: one to the database we use for loading the cube, and the other to the cube itself. To load the data from the database into the cube, we wrote some SSIS packages and used Analysis Services Processing Tasks to process all the dimensions and measures. This works pretty well, so no problems here.

The real problems start when I try to change the connection parameters, e.g. because the server changed or the database has been renamed.
As soon as the connection manager points to another (existing) cube, regardless of whether the structure is exactly the same as that of the old cube, the tasks lose all the assigned objects from their lists. It is really annoying to add all of these exactly identical objects to the task again. I tried experimenting with the DelayValidation attribute so that the Development Studio doesn't destroy my work every time, but when I deploy the package the cube breaks. Obviously some kind of deeper connection is destroyed when I change the connection string.

Is there a way to prevent the package from breaking/losing objects, without me having to sacrifice 15 minutes every time I change the connection parameters?

Regards,
Tris

View 4 Replies View Related

SSIS Processing Cube Task, Only IP Address Works, Put Server Name Will Get Error

Nov 15, 2007

I have an SSAS 2005 database "A" and an SSIS package "P" which fully processes OLAP database "A".
The SSAS server connection string is based on a variable read from an XML configuration file.

It works well in BIDS, but when deployed, the package fails at the step that connects to SSAS, with the message "a connection cannot be made, please ensure the server is running".

In the connection string I am using a server name like servera.xx.com. If I change it to the IP address, it works.
If I change it to localhost (the package happens to run on the same server), it works.

But I need the server-name solution, as the IP may change.

I have installed SP2.

Any suggestions?

Thanks and regards

View 2 Replies View Related

Changing 'initial Catalog' On Connection Causes Analysis Services Processing Task To Fail

May 2, 2008

We find that if we deploy the OLAP database with a different name on the test server, then regardless of how we change the connection string provided to the SSIS package that processes the cube, the package fails to connect to the database. To clarify:

In development the OLAP database is called MyOlapDB and the source database is called MySqlDB. Both are on the same machine. When the application is built and released for test, the test team install the databases on a replica of the production environment (i.e. web app on one machine, OLAP DB on another and SQL database on yet another). They also, quite rightly, implement the new test databases so they incorporate the build version number. So, MyOlapDB123 and MySqlDB123 are both from build 123.

This is when the problems start. Regardless of how the connection string is specified in the job that processes the cube, the SSIS package fails with the error:

[Analysis Services Execute DDL Task] Error: Errors in the metadata manager. Either the database with the ID of 'MyOlapDB' does not exist in the server with the ID of 'OurTestServer', or the user does not have permissions to access the object.


We have tried config files and job properties, but neither works. Also, simply attempting to run the package using DTEXECUI does not work either.

Looking inside the XML of the package, we clearly see the ConnectionManager object, which has the original connection string:

Data Source=localhost;Initial Catalog=MyOlapDB;Provider=MSOLAP.3;Integrated Security=SSPI;Impersonation Level=Impersonate;


However, editing the initial catalog here still does not solve the problem. Searching the XML for the string MyOlapDB reveals the OLAP database name in two other places - both within the object data of the two Analysis Services Execute DDL tasks.

Anyone know how to solve this problem without having to hack the XML of the package?

View 4 Replies View Related

Analysis :: Cube Needs To Be Deployed From VS After SSIS Analysis Services Processing Task Completes?

May 13, 2014

I have a cube that we process nightly via an Analysis Services Processing Task in SSIS.  In order to improve processing time, we elected to use a lot of rigid dimension attributes and to do a full process of everything in the SSIS task.  The issue I am having is that after that task completes, I need to go into Visual Studio to deploy the cube, because we are unable to browse or use it.  This issue seemed to start once we changed the SSIS Analysis Services Processing Task to do a full process on the dimensions, rather than an incremental one.

I would expect that once development is done, and the cube is processed and deployed, that is it.  My thinking is that the SSIS task should just update the already-deployed cube.

View 2 Replies View Related

Send Mail Task Problem Using A Combination Of ForEach Loop, Recordset Destination, Execute SQL Task And Script Task

Jun 21, 2007

OK. I give up and need help. Hopefully it's something minor ...



I have a data flow which returns email addresses to a recordset.

I pass this recordset into a ForEach Loop, configuring the enumerator as a Foreach ADO Enumerator. I also map the email address to a variable with index 0.



I then have an Execute SQL Task which receives this email address as a varchar variable (parameter 0), which I then use in my SQL command to limit the rows returned. To troubleshoot this problem, I have commented out the WHERE clause and returned all rows regardless of email address. In either event, I then use a result set to store the query result, of type object, with result name 0.



I then pass this result set into a script variable to start parsing the SQL rows returned, as type object. (I assume this is the correct way to do this, from other prior posts...)



The script appears to throw an exception at the following line. I assume it's because I'm either not passing in the values properly or the query doesn't return anything. However, I am certain the query works, as it executes just fine at the command prompt.



Try

ds = CType(Dts.Variables("VP_EMAIL_RESULTS_RS").Value, DataSet)



My intent is to email the query results to each email address, with the following type of data, by passing the parsed data from the script to a Send Mail Task. Email works fine and sends out messages, but the content is empty. I pass the parsed data as string values to the MessageSource, and define the MessageSourceType as a variable in the mail task.



part number    leadtime
x              5
y              9
....



Does anyone have any idea what I might be doing wrong?

thanks

John

View 5 Replies View Related

Integration Services :: Stored Procedure In Execute Task Fails But Task Does Not Fail

Jul 1, 2015

I'm using SSIS in Visual Studio 2012. My Execute SQL Task calls a Stored Procedure where I have a TRY-CATCH. Last week there was a problem and the CATCH was executed and logged an error to my error table, but for some reason the Execute SQL Task didn't fail. Is there a setting to make the Execute SQL Task fail when an SP encounters a failure?
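
One likely cause, rather than a setting: if the CATCH block handles the error and exits normally, the procedure returns success, so the Execute SQL Task has nothing to fail on. Re-raising the error after logging it makes the task fail; a minimal sketch, with illustrative object names:

BEGIN TRY
    EXEC dbo.DoTheWork;   -- hypothetical procedure body
END TRY
BEGIN CATCH
    INSERT INTO dbo.ErrorLog (ErrorNumber, ErrorMessage)
    VALUES (ERROR_NUMBER(), ERROR_MESSAGE());
    THROW;   -- re-raise so the calling Execute SQL Task sees the failure
END CATCH;

(THROW requires SQL Server 2012 or later; on older versions, RAISERROR with the captured message does the same job.)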

View 3 Replies View Related

Error: The Task With The Name Data Flow Task And The Creation Name DTS.Pipeline.1 Is Not Registered For Use On This Computer

May 4, 2006



Hi,

I am trying to create a simple BI application for SSIS. In Visual Studio 2005, I just drag a Data Flow Task from the toolbox and add it to the project. When I double-click it, I get the following error:

The task with the name "Data Flow Task" and the creation name "DTS.Pipeline.1" is not registered for use on this computer.

Then, when I try to delete it, I get this other error:

Cannot remove the specified item because it was not found in the specified Collection.

I am creating this application under an administrator account on this computer, so I doubt the problem is related to permissions. I am running SQL Server 2005 and Visual Studio 2005 on Windows XP Tablet PC Edition.

Any suggestions why this is happening and how to fix it?

View 17 Replies View Related

SSIS Task Transfer SQL Server Objects Task And Default Constraints On Tables

Feb 21, 2008



I am using the "Transfer SQL Server Objects Task" to copy some tables from database A to database B including data.

The tables, primary key constraints, foreign keys, and data all transfer nicely, except for the DEFAULT constraints on the tables.

I have failed to find any option in the "Transfer SQL Server Objects Task" to explicitly say "copy default constraints". So I guess it should logically happen automatically, but it doesn't. I hope it is not a bug :-)

Any option anyone knows will help.

Thanks.

View 17 Replies View Related

Compare Performance (Execute SQL Task Insert And Data Flow Task)

Mar 12, 2008



I am using SQL 2005 SSIS. I am joining several large tables and then moving the result into another table in the same database.

I would like to know which method is faster:

1. Use an Execute SQL Task to insert the result set into the target table.
2. Use a Data Flow Task to insert the result set into the target table (an OLE DB source executing the SQL command, then the SQL Server destination).

Could you tell me why one is slower than the other?
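
For what it's worth, option 1 boils down to a single set-based statement executed entirely on the server (table and column names here are illustrative):

INSERT INTO dbo.TargetTable (col1, col2)
SELECT a.col1, b.col2
FROM dbo.BigTableA AS a
JOIN dbo.BigTableB AS b ON b.id = a.id;

Because the source and target are in the same database, this avoids pulling every row through the SSIS pipeline and pushing it back, which is why the Execute SQL Task is usually the faster of the two in this scenario.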

Thanks.

View 7 Replies View Related

Execute SQL Task – Output Parameter On Stored Procedure Causes Task To Fail.

Dec 2, 2005

I have a SQL Task that calls a stored procedure and returns an output parameter.  The task fails with the error "Value does not fall within the expected range."  The stored procedure is defined as follows:

Create Procedure [dbo].[TestOutputParms]
    @InParm INT,
    @OutParm INT OUTPUT
as
    Set @OutParm = @InParm + 5

The task uses an OLEDB connection and has a source type of Direct Input.  The SQL statement is:

Exec TestOutputParms 7, ? output

The parameter mapping is:

Variable Name: User::OutParm, Direction: Output, Data Type: LONG, Parameter Name: @OutParm
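
A likely culprit, offered as an assumption rather than a confirmed fix: with an OLE DB connection manager, the Execute SQL Task expects parameters to be named by ordinal position, not by their T-SQL names, and a name like "@OutParm" can produce exactly this "Value does not fall within the expected range" error. The corrected setup would look like:

-- SQL statement in the task (unchanged):
Exec TestOutputParms 7, ? output
-- Parameter mapping: Variable Name = User::OutParm, Direction = Output,
-- Data Type = LONG, Parameter Name = 0   (the ordinal, not "@OutParm")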

View 7 Replies View Related

Writing Full Result Set From Execute SQL Task Into A File Using Script Task

Mar 28, 2007

In the Control Flow tab, I have an Execute SQL Task that outputs a full result set into a variable of object type. How can I write the contents of the full result set into a text file using a Script Task? I also want to format the output the following way:

Column Name 1 : Column Value

Column Name 2: Column Value and so on

I tried writing the contents of the object variable into a file, but the file contained a single word: System.__ComObject.




Code for writing the full result set into a text file:

Dim RSsqloutput As String = Dts.Variables("objVariable").Value.ToString()
' NOTE: the object variable holds a COM recordset here, so ToString() only
' yields "System.__ComObject"; the rows would first have to be loaded into a
' DataTable (e.g. with OleDbDataAdapter.Fill) before they can be written out.
Dim strVal As String = "File completed on " & Now() & vbCrLf & "------------------------------------------------------" & vbCrLf
IO.File.WriteAllText("C:\MyFile.txt", strVal)
IO.File.AppendAllText("C:\MyFile.txt", RSsqloutput)  ' append, or this call overwrites the header line



I went through a link that explains how to write an XML result set to a file, but this doesn't help, as it writes in XML format.

Could you please give me a hint on how to go about this?





View 7 Replies View Related

SSIS (Integration Services) Transfer SQL Server Objects Task: This Task Can Not Participate In A Transaction

Feb 1, 2007

In short, does the "Transfer SQL Server Objects Task" support distributed transactions?

I am trying to use a "Transfer SQL Server Objects Task" in a container that has a transaction on it. The task is set to support the transaction. It is set up to copy table data from several tables on a non-domain server (SQL Server 2000) to a domain-based server (SQL Server 2005). I get an error stating, "This task can not participate in a transaction".

I am wondering if it means exactly what it says: this task in SSIS can't participate at all. Or does it mean that it won't in this scenario for some reason? I attempted a simple copy of data from MSSQL 2005 to MSSQL 2005 (same server) and the task still failed. MSDTC appears to be running properly on my machine (I can do a simple distributed transaction across a linked server to the 2000 server in Query Analyzer). MSDTC also appears to be working on both servers, verified with distributed transaction query tests in Query Analyzer.

Here's the error info...

SSIS package "Development BusinessContacts and Products Migration.dtsx" starting.
Information: 0x4001100A at Copy BusinessContacts Data: Starting distributed transaction for this container.
Error: 0xC002F319 at Copy BusinessContacts database table data 1, Transfer SQL Server Objects Task: This task can not participate in a transaction.
Task failed: Copy BusinessContacts database table data 1
Information: 0x4001100C at Copy BusinessContacts database table data 1: Aborting the current distributed transaction.
Information: 0x4001100C at Copy BusinessContacts Data: Aborting the current distributed transaction.
SSIS package "Development BusinessContacts and Products Migration.dtsx" finished: Failure.
The program '[4700] Development BusinessContacts and Products Migration.dtsx: DTS' has exited with code 0 (0x0).

View 9 Replies View Related

How To Fetch The Recrods From MS Access And Using It In Script Task Using Control Flow Tools(Execute SQL Task)

Jun 14, 2006

Hi

I have an application like fetching records from the DataBase(MS Access 2000) and results i have to use in Script Task. At present i have used the record fetching query,connection string in Script itself. I would like to use in Independently. Is there any Tools like (Control Flow Tools like Execute SQL Task) are there to fetch the result set from Acccess and can use the fetching results in Script Task....

Thanks & Regards

Deepu M.I

View 5 Replies View Related

Can A Result Set From SQL Script Task Be Used As A Source For Data Flow Task?

Oct 2, 2007

I have a stored procedure that is executed via an Execute SQL Task and returns a full result set. I map this result set to a variable of object type. Is there a way to use this variable as a data source in a subsequent Data Flow Task?

A.

View 14 Replies View Related

DTS PROCESSING

Aug 3, 2001

I have a DTS package that imports data from an Oracle database into SQL Server.
Doesn't the processing mostly occur on the SQL Server, not on the Oracle database from which the data is being imported?
The Oracle database is vendor-provided, and they are saying our SQL Server DTS package is killing their server.
Any insight is appreciated.
Thanks

View 1 Replies View Related

XML Processing

Mar 27, 2008

Hey All,

I've got a process that creates records in my database based on XML input I've been given. What I am doing is giving this XML to a stored procedure to handle a specific task, then modifying the XML and sending it to the next stored procedure.

For instance, the XML could hold header records with detail records. I would first send the XML to a stored procedure that creates the header records, then update the XML so that it knows the identity values of the header records I have just created, and then send the XML to the next stored procedure to create the details for those headers.

All works great and fine, but I have a problem with writing the identity values back to the XML. It seems I can only change one item in the XML at a time and thus need to loop this. For many records this really takes a long time.

Here is some sample code of what I'm doing (please excuse any typos; this is a simplified version of the code):

declare @lvSeq numeric(15)
declare @lvRowNo int
declare @lvNumRows int

insert into myHeaderTable (
    recid, recdesc
) select
    ref.value('@recid', 'nvarchar(25)') recid,
    ref.value('@recdesc', 'nvarchar(250)') recdesc
from @pXML.nodes('//headers/header') R(ref)

select @lvRowNo = 1, @lvNumRows = @pXML.value('count(//headers/header)', 'int')

while (@lvRowNo <= @lvNumRows) begin
    -- look up the identity value generated for this header
    select @lvSeq = recseq
    from myHeaderTable
    where recid = @pXML.value('(//headers/header[position()=sql:variable("@lvRowNo")]/@recid)[1]', 'nvarchar(25)')

    -- write it back into the XML, one node per iteration
    set @pXML.modify('replace value of (//headers/header[position()=sql:variable("@lvRowNo")]/@recseq)[1] with sql:variable("@lvSeq")')

    select @lvRowNo = @lvRowNo + 1
end


Obviously I am looking for a better way to update the XML with the sequences. The insert takes a second; the loop takes minutes with large XML sets. I guess MSSQL is searching the whole XML to find the item to update.

It would be nice if I didn't have to loop through the XML. One solution I was thinking of is to store the XML in a temporary table with a single record per header item. Then I could do the modify in one go and recreate the XML by simply selecting the contents of the temporary table. I have no idea if this is possible.

So something like this:

select
    ref.value('@recid', 'nvarchar(25)') recid,
    ref.query('.') XMLData  -- .value('.', 'XML') is not valid; query() returns the fragment
into #TMP_XML
from @pXML.nodes('//headers/header') R(ref)

insert into myHeaderTable (
    recid, recdesc
) select
    recid,
    ref.value('@recdesc', 'nvarchar(250)') recdesc
from #TMP_XML CROSS APPLY XMLData.nodes('/header') R(ref)

update #TMP_XML
set XMLData.modify('replace ....')
from myheadertable
where #TMP_XML.recid = myheadertable.recid

-- recreate XML here, not sure how....
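
A set-based alternative worth sketching (assumptions: recseq is the IDENTITY column of myHeaderTable, and recid uniquely identifies a header): capture the generated sequences with an OUTPUT clause during the insert, then rebuild the XML once with FOR XML instead of calling .modify() in a loop.

declare @map table (recid nvarchar(25), recdesc nvarchar(250), recseq numeric(15))

insert into myHeaderTable (recid, recdesc)
output inserted.recid, inserted.recdesc, inserted.recseq into @map
select ref.value('@recid', 'nvarchar(25)'),
       ref.value('@recdesc', 'nvarchar(250)')
from @pXML.nodes('//headers/header') R(ref)

-- rebuild the whole document in one statement instead of N .modify() calls
set @pXML = (select recid   as '@recid',
                    recdesc as '@recdesc',
                    recseq  as '@recseq'
             from @map
             for xml path('header'), root('headers'), type)

Note the rebuilt document only carries the attributes listed in the SELECT, so any other content under each header element would need to be added to that query.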

View 1 Replies View Related

How To Enable Or Disable A Task Programmatically Using Script Task

Mar 6, 2006

I have made a package which extracts data from the source, does the transformation, and submits the data to the destination. It also updates the required control files afterwards.

Now I want to add this functionality:

If the package is executed again, it should check the status of the previous execution in the control file. If that execution succeeded, it should mark all tasks disabled and stop;

if it failed, it should mark all tasks enabled, start extracting data, and continue with the execution.

I was able to attain similar functionality in SQL Server 2000 using an ActiveX script. What code do I need to write as part of a Script Task in order to attain the above functionality?

View 3 Replies View Related

Execute DTS 2000 Package Task. Mischievous Task??

Sep 21, 2006



Hi everyone,

For the first time I'm testing this task, and surprisingly, when I try the "Edit Package" option:

1) The DTS host failed to load or save the package properly
2) The selected package cannot be opened
3) Error HRESULT E_FAIL has been returned from a call to a COM component

But after these messages you can see all the tasks, except they have no names!

It seems as if the RCW mechanism between managed and unmanaged code has partially failed.

I don't dare to go any further; I don't know whether the package is loaded correctly from there.

Any guidance or ideas about this?

View 5 Replies View Related

How Do I Start A Transaction :-Dataflow Task + Excute SQL Task

Mar 7, 2007

1: Control Flow: an Execute SQL Task that truncates the table

2: Data Flow Task: DataReader -> Script Component -> OLE DB Destination (SQL Server 2005, a single table, always around 600,000 rows)

How do I set up a transaction so that, if there is a failure, the Truncate Table command will roll back and the OLE DB destination (a single SQL Server table) will be left the same as it was before the load started?

Another question: with that volume of data, 600,000 rows, will a truncate table be practical in a transaction?

Any ideas welcome.

thanks in advance

David
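
On the truncate question: TRUNCATE TABLE is a logged operation (it logs the page deallocations rather than individual rows), so it rolls back cleanly inside a transaction regardless of table size. A quick illustration, with a hypothetical table name:

BEGIN TRANSACTION
TRUNCATE TABLE dbo.MyTarget     -- hypothetical target table
ROLLBACK TRANSACTION
-- dbo.MyTarget still contains all of its rows at this point

For the rollback across both steps, the usual pattern is to put the Execute SQL Task and the Data Flow Task in a Sequence Container with TransactionOption set to Required (and the two tasks set to Supported), so that a data flow failure also rolls back the truncate.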

View 3 Replies View Related






