Historical Tables, Partitioning Or What?

Apr 4, 2007

I have about 45,000 records in a CSV file, which I am using as HTTP request parameters to query a website and store some results in a database. This is the kind of application that runs 24/7, so the database grows really quickly. Every insert fires a trigger, which has to look for some old records based on some criteria and modify the last inserted record. My client is crazy about performance on this one and suggested moving the old records into another table, which has exactly the same structure but would serve as a historical table only (used to generate reports, statistics, etc.), while the original table would store only the latest rows (so no more than 45k at a given time, whereas the historical table may grow to millions of records). Is this a good idea? With performance in mind, and given that the trigger has to run as quickly as possible, I might second that idea. Is it good or bad? What do you think?



I read a similar post here which mentioned SQL Server 2005 partitioning; I might as well try that, although I have never used it before.
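For what it's worth, the simplest form of the "historical table" idea is a scheduled archive job rather than extra logic in the trigger; a rough sketch, where the table names (Requests / RequestsHistory) and the ProcessedDate cutoff column are made-up placeholders:

-- Move rows older than the cutoff into the history table, then remove them
-- from the active table. One transaction, so a row is never lost or duplicated.
-- Assumes both tables have identical column lists.
BEGIN TRANSACTION;

INSERT INTO dbo.RequestsHistory
SELECT *
FROM dbo.Requests
WHERE ProcessedDate < DATEADD(DAY, -7, GETDATE());

DELETE FROM dbo.Requests
WHERE ProcessedDate < DATEADD(DAY, -7, GETDATE());

COMMIT TRANSACTION;

The small active table keeps the trigger's lookup cheap, and the history table can be indexed purely for reporting. SQL Server 2005 partitioning gives a similar effect (and cheaper archiving via partition switching) without physically copying rows, at the cost of more setup.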

View 5 Replies



Historical Tables

Mar 13, 2008

Hi,

I have a question about a historical table. I have a table in SQL Server that keeps the history of my clients for the last 3 months.
This table has a PK of client_code and rep_date_id. I want to write a query that returns each client along with the date when the client was first registered.

I thought about a cursor, but I don't know how to use it.
My table looks like this structure:
- client_code
-activity_code
-country_code
-district_code
-...
-rep_date_id_n

And I want to return
-activity_code
-country_code
-district_code
-...
-start_date
-current_date
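This doesn't need a cursor - a grouped query does it. A minimal sketch, assuming rep_date_id sorts chronologically and that the table is called client_history (a made-up name):

-- First and most recent report date per client.
SELECT client_code,
       MIN(rep_date_id) AS start_date,
       MAX(rep_date_id) AS last_date
FROM dbo.client_history
GROUP BY client_code;

If you also need activity_code, country_code, etc. as they were on the first row, join this result back to the table on client_code and rep_date_id = start_date.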

Thanks

View 1 Replies View Related

Partitioning Tables

May 4, 2007

How do I partition tables in a dynamic way?

I want to partition a table based on a client-specific ID, and I want these values (client IDs) to be passed dynamically to the create partition function.
I am not familiar with partitioning, so it would be great if someone could guide me (I am also reading some articles on partitioning, but it will be easier with some help).

The table I am trying to partition has about 80 million rows with four clients' data as of now, and will have more once we implement new clients.

I also think partitioning will help, because before we load a client's data we remove the data that is already out there (we flush the previous quarter's data before we insert this quarter's data).
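A partition function can't take runtime parameters, but you can build its boundary list from the client table and run it as dynamic SQL; a rough sketch, assuming an int client ID and made-up object names (dbo.Clients, pfClient, psClient):

-- Build the boundary list ('1, 7, 12, ...') from the current client IDs.
DECLARE @boundaries nvarchar(max), @sql nvarchar(max);

SELECT @boundaries = STUFF((
    SELECT ', ' + CAST(ClientId AS varchar(20))
    FROM dbo.Clients
    ORDER BY ClientId
    FOR XML PATH('')), 1, 2, '');

-- Each client ID becomes a boundary, so each client's rows get their own partition.
SET @sql = N'CREATE PARTITION FUNCTION pfClient (int) AS RANGE LEFT FOR VALUES (' + @boundaries + N'); '
         + N'CREATE PARTITION SCHEME psClient AS PARTITION pfClient ALL TO ([PRIMARY]);';

EXEC sp_executesql @sql;

-- When a new client is added later:
-- ALTER PARTITION SCHEME psClient NEXT USED [PRIMARY];
-- ALTER PARTITION FUNCTION pfClient() SPLIT RANGE (@newClientId);

Removing a client's data before a reload then becomes a SWITCH of that client's partition out to a staging table plus a TRUNCATE, which is much cheaper than a large DELETE.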

Any help will be appreciated.

Thanks
Raj

View 4 Replies View Related

Partitioning Tables Vs San Hardware

Feb 26, 2008

Guys,

I am trying to use table partition feature from Sql Server 2005 enterprise edition.

I have Names table with columns FNAME, LNAME and DISPLAYNAME (concatenation of FNAME and LNAME) which I partitioned across 2 drives and 4 file groups based on the below criteria.

CREATE PARTITION FUNCTION pfNameRange(varchar(200))
AS RANGE RIGHT FOR VALUES ('F', 'I', 'S');

Currently there are 5 million rows in this partitioned table - the partitioned table has a clustered index on ID (an identity column) and LNAME.

I also created another table with the same data without partition on the table.

When I run the following query I get the same response time of 10secs from both tables.

Names - partitioned table with clustered index on ID and LNAME
NameSEARCH - with no partition and no index

select * from names where lname = 'smith'
select * from namesearch where lname = 'smith'

Is it safe to assume that if the data files are on a SAN, table partitioning doesn't give any advantage?

How can partitioning be made effective with the data files on a SAN?
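One thing to check before blaming the SAN: there is no index on lname in either table, so both queries scan (partition elimination at best trims the scan to one partition, which is still over a million rows). An index the predicate can seek on usually matters far more than where the files live; a sketch, assuming LNAME is the partitioning column and the partition scheme built on pfNameRange is named psNameRange:

-- Aligned nonclustered index on the search column: the query can now seek on
-- LNAME, and partition elimination still applies because LNAME is also the
-- partitioning column.
CREATE NONCLUSTERED INDEX IX_Names_LName
    ON dbo.Names (LNAME)
    ON psNameRange (LNAME);

With SELECT * the seek still has to look up the remaining columns, so for a frequent query it may be worth INCLUDE-ing FNAME and DISPLAYNAME. Partitioning itself is mainly a manageability feature (switching, per-partition maintenance); it only speeds queries up indirectly.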

Any suggestions and inputs would help.

Thanks

View 4 Replies View Related

SQL 2005 - Partitioning Tables

Oct 29, 2005

I want to know how to partition a table using two columns (Example: Salesman, OrderDate).
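SQL Server partitions on a single column only, so the usual workaround for two columns is a persisted computed column that combines them and serves as the partition key; a rough sketch (all names and boundary values are made up):

CREATE PARTITION FUNCTION pfSalesmanDate (varchar(40))
AS RANGE RIGHT FOR VALUES ('JONES|20060101', 'SMITH|20060101');

CREATE PARTITION SCHEME psSalesmanDate
AS PARTITION pfSalesmanDate ALL TO ([PRIMARY]);

CREATE TABLE dbo.Orders
(
    Salesman  varchar(20) NOT NULL,
    OrderDate datetime    NOT NULL,
    Amount    money       NOT NULL,
    -- e.g. 'SMITH|20060315'; PERSISTED so it can be used as the partition key
    PartKey AS CONVERT(varchar(40), Salesman + '|' + CONVERT(char(8), OrderDate, 112)) PERSISTED NOT NULL
) ON psSalesmanDate (PartKey);

Often the simpler alternative is to partition on OrderDate alone and make Salesman the leading column of the clustered index; the combined key is only worth it if you genuinely need to switch or maintain per-salesman-per-period slices.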

View 8 Replies View Related

Tables Partitioning -- Performance Boost

Nov 13, 2007

Hi

I have a question about the partitioning a table.

I have a database with more than 50 tables, and 25 of them have more than 10 lakh (1 million) records each, which include history records. I have two data files for this database under the PRIMARY filegroup. Now I want to transfer these history records to some other database.
I wanted to know if this kind of activity will boost the database performance. If yes, how should I configure my new database?
On what partitioning factors will my performance improve?

Thanks in advance

Regards
Arvind

View 1 Replies View Related

Partitioning Tables And Filegroup Stuff

Jan 17, 2007

Hi everyone,
 
Primary platform is 64-bit on A-P cluster.
 
Our needs are on a yearly basis and on a monthly basis. We're forced to keep five years of data for the majority of the production tables.
In terms of years, I see three ways:

1. Map all the ranges to one FILEGROUP with only one NDF:

2004, 2005, 2006 => FG1 => ONE.NDF

2. Map all the ranges to one FILEGROUP with more than one NDF:

2004, 2005, 2006 => FG1 => ONE.NDF, TWO.NDF

3. Map each range to its own FILEGROUP, each with one NDF or (n) NDFs:

2004 => FG0 => ZERO.NDF
2006 => FG1 => ONE.NDF
2005 => FG2 => TWO.NDF

What is the best approach in terms of availability, performance and best practices? Maybe it's a silly question; I'm sorry if it is.
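For what it's worth, option 3 is the layout that gives per-year backup/restore and lets old years be marked read-only; a rough sketch of the wiring, with file paths, names and boundary dates as placeholders:

-- One filegroup (and one NDF) per year range.
ALTER DATABASE Sales ADD FILEGROUP FG2004;
ALTER DATABASE Sales ADD FILE (NAME = 'Sales2004', FILENAME = 'E:\Data\Sales2004.ndf') TO FILEGROUP FG2004;
ALTER DATABASE Sales ADD FILEGROUP FG2005;
ALTER DATABASE Sales ADD FILE (NAME = 'Sales2005', FILENAME = 'E:\Data\Sales2005.ndf') TO FILEGROUP FG2005;
ALTER DATABASE Sales ADD FILEGROUP FG2006;
ALTER DATABASE Sales ADD FILE (NAME = 'Sales2006', FILENAME = 'F:\Data\Sales2006.ndf') TO FILEGROUP FG2006;

-- Two boundaries, three partitions, one per filegroup.
CREATE PARTITION FUNCTION pfYear (datetime)
AS RANGE RIGHT FOR VALUES ('2005-01-01', '2006-01-01');

CREATE PARTITION SCHEME psYear
AS PARTITION pfYear TO (FG2004, FG2005, FG2006);

Multiple NDFs per filegroup (options 1 and 2) mostly buy I/O striping within a year, which matters less when the storage is already striped underneath.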
 
As usual, thanks a lot for your time.
 

View 3 Replies View Related

SQL Server Admin 2014 :: Partitioning Master And 4 Child Tables

Jul 5, 2014

I have 6 tables which are very huge in row count and need to be partitioned for better manageability.

Little info: every day, 300 million records are inserted and 300 million records are deleted in the tables below. We maintain only 8 days' worth of data in these tables, which is why records older than 8 days are continuously deleted.

Master table which has [ID],[Timestamp]
Table Name: Sample - 2,578,106

Child tables: the foreign key [ID] is common to all the tables. There is no timestamp column in the child tables.
dbo.ConnectionDB - 1,147,578,048
dbo.ConnectionSS - 876,458,321
dbo.ConnectionRT - 118,133,857
dbo.ConnectionSample - 100,038,535
dbo.Command - 100,032,235

I would like to partition the above child tables based on the IDs that are inserted every 4 hours. Meaning, all IDs that are inserted in a 4-hour window should be in one partition.
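Since the child tables only carry the ID, one way to get 4-hour partitions is to read the boundary IDs from the master table (the lowest ID in each 4-hour window) and use those as partition boundaries, splitting in a new boundary from a scheduled job; a rough sketch with made-up names and placeholder values (ID assumed bigint):

-- Lowest ID per 4-hour window of the master table = candidate boundary values.
SELECT DATEADD(HOUR, 4 * (DATEDIFF(HOUR, 0, [Timestamp]) / 4), 0) AS WindowStart,
       MIN([ID]) AS BoundaryId
FROM dbo.Sample
GROUP BY DATEADD(HOUR, 4 * (DATEDIFF(HOUR, 0, [Timestamp]) / 4), 0)
ORDER BY WindowStart;

-- Child tables partitioned on ID (placeholder boundaries shown).
CREATE PARTITION FUNCTION pfConnectionId (bigint)
AS RANGE RIGHT FOR VALUES (100000000, 200000000, 300000000);

CREATE PARTITION SCHEME psConnectionId
AS PARTITION pfConnectionId ALL TO ([PRIMARY]);

-- Scheduled every 4 hours: add the next boundary for the newest window.
-- ALTER PARTITION SCHEME psConnectionId NEXT USED [PRIMARY];
-- ALTER PARTITION FUNCTION pfConnectionId() SPLIT RANGE (@nextBoundaryId);

The payoff is the daily purge: instead of deleting 300 million rows, the partitions older than 8 days get switched out to a staging table and truncated.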

View 1 Replies View Related

SQL Server 2008 :: Retrofitting Partitioning To Existing (large) Tables

Feb 9, 2015

We have an existing BI/DW process that adds large chunks of data daily (~10M rows) to an existing table, as well as using Deletes to remove stale data. This scenario seems to beg for partitioning to support switching in/out data.

After lots of reading on this, I have figured out the mechanics of the switching, but I still have some unknowns about the indexes needed to support this.

The table currently has several non-clustered indexes, including one on the partitioning column - let's call that column snapshotdate. Fortunately there are no FKs involved, and no constraints.

Most of the partitioning material I see focuses on creating a clustered PK to assist with switching. Not sure if this is actually necessary, but assume I create one using an Identity column (currently missing) plus snapshotdate.

For the other non-clustered, non-unique indexes, can I just add the snapshotdate to the end of the index? i.e. will that satisfy the switching requirement?
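From what I've read, the switching requirement is that every index be aligned, i.e. built on the same partition function as the table. For a non-unique nonclustered index you don't strictly have to add snapshotdate yourself - creating the index on the partition scheme makes SQL Server carry the partitioning column along behind the scenes - but adding it explicitly (as a trailing key or an INCLUDE) keeps it visible. A sketch with made-up names:

-- Aligned nonclustered index: created on the table's partition scheme, with the
-- partitioning column carried explicitly as the trailing key column.
CREATE NONCLUSTERED INDEX IX_Fact_CustomerId
    ON dbo.FactTable (CustomerId, snapshotdate)
    ON psSnapshotDate (snapshotdate);

As far as I know the clustered index doesn't have to be a PK for SWITCH either; a clustered index on (snapshotdate, an identity column) built on the scheme is enough, provided the switch source and target match exactly.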

View 1 Replies View Related

Historical Data

Sep 19, 2005

Hi Everyone.

I need some advice on how to create a historical database.

What is the best way of doing this? For example, should I create a new row whenever a column in a row changes, and timestamp every record? What happens when I have child tables linked to a header table?

I have been looking on the net for methods of creating a historical database, but I can't find any.

Thanks in advance

View 11 Replies View Related

Historical Data

Oct 20, 2005

A general data design question: We have data which changes every week. We had considered separating historical records and current records into two different tables with the same columns, but thought it might be simpler to have them all together in one table and just add a WeekID int column to indicate which week it represents (and perhaps an IsCurrent bit column to make querying easier). We have a number of tables like this, holding weekly data, and we'll have to query for historical data often, but only back through the last year - we have historical data going back to 1998 or so which we'll rarely if ever look at.

Is the all-in-one-table approach better, or the separation of current and historical data? Will there be a performance hit to organizing data this way? I don't think the extra columns will make querying too much more awkward, but is there anything I'm overlooking in this?

Thanks.

View 1 Replies View Related

Storing Historical Data?

Apr 7, 2008

From what I've read this is called 'slowly changing dimensions'. Basically, the system I'm working on needs to store the history of certain data so that at any time a user can look up an old project and view it exactly as it was, even though the associated parts might have had certain changes over time. From what I can tell, Type 2 (current and historical records are stored in the same table) seems to be the most popular. Type 4 (current records in one table and historical records in a separate history table) seems like it would also work, but I've been unable to find any articles comparing the two. Does anybody have any info on the advantages/disadvantages of one vs. the other?
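For what it's worth, a Type 2 table usually looks something like the sketch below (column names are just illustrative); Type 4 uses the same row layout but keeps only the current rows in the main table and moves older versions to a parallel history table:

-- Type 2: every version of a part lives in the same table.
CREATE TABLE dbo.DimPart
(
    PartKey       int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- surrogate key, one per version
    PartNumber    varchar(20)  NOT NULL,                  -- business key, repeats per version
    Description   varchar(100) NOT NULL,
    EffectiveFrom datetime     NOT NULL,
    EffectiveTo   datetime     NULL,                      -- NULL = current version
    IsCurrent     bit          NOT NULL DEFAULT (1)
);

The trade-off usually cited: Type 2 answers "as it was on date X" from one table and is what most ETL tooling (e.g. the SSIS SCD wizard) generates, while Type 4 keeps the current table small and fast at the cost of a UNION or extra join whenever history is needed.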

View 3 Replies View Related

Historical Data Problem

May 3, 2004

We have a database that adds over 100,000 records per day. This goes back to 2002 and I only need historical data for 6 months. Presently we can only delete 1,000 rows at a time. Is there a faster way of deleting? We seem to continually run out of disk space. Urgent!
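Two things usually make this bearable: an index on the date column you filter on, and deleting in a loop of batches so the log can be cleared between batches instead of holding one giant transaction. A rough sketch in SQL Server 2000 syntax, with made-up table and column names (on 2005+ you would use DELETE TOP (n) instead of SET ROWCOUNT):

-- Batch the delete so each transaction stays small and the log can truncate
-- (SIMPLE recovery) or be backed up (FULL recovery) between batches.
SET ROWCOUNT 50000;

DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    DELETE FROM dbo.EventLog
    WHERE EventDate < DATEADD(MONTH, -6, GETDATE());

    SET @rows = @@ROWCOUNT;
END

SET ROWCOUNT 0;

If the rows to keep are a small fraction of the table, it can be faster to copy the last 6 months into a new table, drop the old one and rename; going forward, a nightly version of the loop above keeps the problem from building up again.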

View 7 Replies View Related

BACKUP TABLE(historical)

Jul 20, 2005

How do I build a TRIGGER that always copies the data from the table on which the transaction occurs to the historical table? Thanks, Fernand
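A minimal sketch of such a trigger, with made-up table and column names; the inserted pseudo-table carries the new row images for both INSERT and UPDATE, so every change lands in the history table:

CREATE TRIGGER trg_Orders_History
ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Copy the new/changed rows, stamping when the copy was taken.
    INSERT INTO dbo.Orders_History (OrderId, CustomerId, Amount, CopiedAt)
    SELECT OrderId, CustomerId, Amount, GETDATE()
    FROM inserted;
END

If deleted rows should be preserved as well, extend the trigger (or add a DELETE trigger) that reads from the deleted pseudo-table.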

View 1 Replies View Related

SCD Historical Component Not Working

Mar 21, 2006

I have an SCD that contains a historical output path. If I put a data viewer in the flow before the SCD, I can see that the data should trigger a trip down that path, but it doesn't. In the advanced editor, everything looks good. Anything I can check? BTW, the data type of the fields that should trigger the SCD is datetime.



thanks in advance

View 1 Replies View Related

How To Retrieve Historical User Queries

Mar 2, 2012

I have a DWH that users query on a regular basis. I want to know what queries they ran, and which user fired each query, over the last one to six months.
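SQL Server doesn't keep months of per-user query history unless you set something up (a server-side trace, auditing, or similar, going forward). What you can pull right now is whatever is still in the plan cache, which shows the statements and how often they ran, but not who ran them; a quick sketch:

-- Most-executed statements still in the plan cache (only since the last
-- restart / cache eviction, and without the executing user).
SELECT TOP (50)
       qs.execution_count,
       qs.last_execution_time,
       st.text AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.execution_count DESC;

For a reliable month-by-month answer with user names, the practical route is to start capturing now: a lightweight server-side trace or SQL Server Audit writing to files, summarized into a table on a schedule.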

View 1 Replies View Related

Historical Prices With Data Gaps

Apr 24, 2008

Hi,

I have a SQL2005 db for tracking the prices of products at multiple retailers. The basic structure is, 'products' table lists individual products, 'retailer_products' table lists current prices of the products at multiple retailers, and 'price_history' table records when the price of a product changes at any retailer. The prices are checked from each retailer daily, but a row is added to the 'price_history' only when the price at the retailer changes.

Database create script:
http://www.boltfile.com/directdownload/db_create_script.sql

Full database backup:
http://www.boltfile.com/directdownload/database.bak

Database diagram:
http://www.boltfile.com/directdownload/diagram_0.pdf

I have the following query to retrieve the price history of a given product at multiple retailers:

SELECT
price_history.datetimeofchange, retailer.name, price_history.price
FROM
product, retailer, retailer_product, price_history
WHERE
product.id = 'b486ed47-4de4-417d-b77b-89819bc728cd'
AND
retailer_product.retailerid = retailer.id
AND
retailer_product.associatedproductid = product.id
AND
price_history.retailer_productid = retailer_product.id

This gives the following results:

2008-03-08 Example Retailer 22.3
2008-03-28 Example Retailer 11.8
2008-03-30 Example Retailer 22.1
2008-04-01 Example Retailer 11.43
2008-04-03 Example Retailer 11.4

The question(s) I have are how can I:

1 - Get the price of a product at a given retailer at a given date/time
For example, get the price of the product at Retailer 2 on 03/28/2008. The table only contains data for Retailer 1 for this date; the behaviour I want is that when there is no data available, the query finds the last date on which there was data from that retailer and uses the price from that point - i.e. for this example the query should return 22.3 as the price, given that was the last recorded price change from that retailer (03/08/2008).

2 - Get the average price of a product across all retailers at a given date/time
In this case we would need to perform (1) for each retailer, then average the results
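For (1) the usual pattern is "the most recent price change on or before the date asked for"; a sketch against the schema above (the @productid/@retailerid/@asof variables are placeholders, and the GUID column types are assumed from the sample ID):

DECLARE @productid uniqueidentifier, @retailerid uniqueidentifier, @asof datetime;

-- Price in effect at @asof: the latest change on or before that moment.
SELECT TOP 1 ph.price, ph.datetimeofchange
FROM price_history AS ph
INNER JOIN retailer_product AS rp ON ph.retailer_productid = rp.id
WHERE rp.associatedproductid = @productid
  AND rp.retailerid = @retailerid
  AND ph.datetimeofchange <= @asof
ORDER BY ph.datetimeofchange DESC;

For (2), run the same TOP 1 lookup once per retailer (CROSS APPLY against the retailer table works well on SQL 2005) and take AVG() over the returned prices.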

I'd really appreciate anyone's help on this :)

many thanks in advance,

dg

View 17 Replies View Related

Historical Reporting On Changing Data

Apr 27, 2008

I've got a customer who wants reproducible/historical reporting. The problem is that the underlying data changes.

I tried to explain that this can't be done (can it?), but he doesn't
understand.

To illustrate the situation - Let's say a teacher wants to track
spelling test scores for her students.
The below are scores for students A, B, and C (for January, February, March)

A: {70,80,85}
B: {70,65, 80}
C: {100,90,100}

So, I can generate a historical report that charts the class average
and student trend - that's pretty easy.

Now, in April, we find that the school board has mandated that the
British spelling of words is ok, so now the cumulative scores (for
January, February, March, April)

A: {90,80,85,100}
B: {80,65, 80,80}
C: {100,90,100,75}

He wants a report showing the January average as (70+70+100)/3 = 80,
when really it is (90+80+100)/3 = 90.

Now imagine that there are actually thousands of data points changing like this...
Now also imagine that we add and remove students on a regular basis...

He and his office manager get frustrated when I explain that the
reports are not simple - in their minds they are. They have determined
the solution is to get a report writer and buy Crystal Reports...
I've tried to explain that the problem is that the report
specification is unclear (basically - they don't understand what they want). The situation is ok for now, I'm just trying to plan for when they figure out that buying Crystal Reports won't change their situation (except they are done several thousand dollars)...

Any tips?

View 20 Replies View Related

Using Historical Attributes With SCD Transform On A Table With A PK

Aug 8, 2007

I must be missing something somewhere...

I have a simple table with three fields: ID, LastName, FirstName. The ID is defined as the PK. In the table is a record of "12345, Smith, John". The incoming flat file has a record of "12345, Smith, Johnny".

In the SCD transform, the ID is the business key, and Last Name and First Name are defined as historical attributes.

During the load, the SCD transform correctly sends the data down the right path, but the insert fails with a primary key violation - as I would expect since it's trying to create a new current record.

How do I get around this problem without removing the PK ???
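The usual way around it is to give the Type 2 target table its own surrogate key as the PK and leave the business key non-unique (typically alongside a current-row flag or effective dates), so the SCD's new version row can be inserted; a sketch using the same three source fields:

-- Surrogate key is the PK; the business key ID repeats once per version.
CREATE TABLE dbo.DimPerson
(
    PersonKey int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ID        int         NOT NULL,   -- business key from the source
    LastName  varchar(50) NOT NULL,
    FirstName varchar(50) NOT NULL,
    IsCurrent bit         NOT NULL DEFAULT (1)
);

A non-unique index on ID keeps the SCD's lookups fast. If ID must stay the PK, historical attributes can't work at all, because by definition they create a second row with the same ID.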

thx

View 9 Replies View Related

Searching Historical Data For Patterns

Feb 17, 2008



I have a database which contains time series data (historical stock prices) that I have to search for patterns on a day-to-day basis. But searching this historical data for patterns is very time-consuming, not only in writing the complex T-SQL scripts but also in executing them.

Table structure for one-minute data:
[Date], [Time], [Open], [High], [Low], [Close], [Adjusted_Close], [MA], [DI], ...
Tick Data:
[Date] [Time] [Trade]
The most time-consuming queries are the ones with lots of inner joins. So, for example, if I have to compare the first few minutes' data then I have to do an inner join like:
With IntervalData AS
(
SELECT [Date], Sum(CASE WHEN 1430 = [Time] THEN [PriceRange] END) AS '1430',
Sum(CASE WHEN 1431 = [Time] THEN [PriceRange] END) AS '1431',
Sum(CASE WHEN 1432 = [Time] THEN [PriceRange] END) AS '1432'
FROM [INDU_1] GROUP BY [Date]
)
SELECT [Date] ,[1430], [1431], [1432], [1431] - [1430] As 'Range' from IntervalData
WHERE ([1430] > 0 AND [1431] < 0 AND [1432] < 0) OR ([1430] < 0 AND [1431] > 0 AND [1432] > 0)
------------------------------------------------------------------------
select ind1.[Time], ind1.PriceRange,ind2.[Time], ind2.PriceRange from INDU_1 ind1
INNER JOIN INDU_1 ind2 ON ind1.[Time] = ind2.[Time] - 1 AND ind1.[Date] = ind2.[Date]
where (ind1.[Time] = 2058) AND ((ind1.PriceRange > 0 AND ind2.PriceRange >0) OR (ind2.PriceRange < 0 AND ind1.PriceRange < 0))
ORDER BY ind1.[Date] DESC;
Is there any way I can use SQL 2005 data mining models to make this searching faster?

View 1 Replies View Related

Design For Storing And Querying Historical Data

Aug 2, 2007

I am working on a project, which involves displaying trends of certain aggregate values over time. For example, suppose we want to display how the number of active and inactive users changed over time.

One issue is how to store historical data. First of all, should I create a separate database for each historical snapshot or should I use one database for all snapshots? Second, our database size is a couple of gigabytes and replicating the entire database on a daily basis is not feasible. An alternative solution is to back up aggregate values, but how do I back up results of aggregate queries, where the user can specify a date range in the WHERE-clause? Another solution is to create fact tables from our relational schema and back those up.
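One common answer to the storage question is to materialize the aggregates into a small snapshot table on a schedule, so the history becomes rows you can filter with an ordinary date-range WHERE clause rather than a pile of database copies; a rough sketch with made-up names:

-- One row per day and status; a nightly job appends that day's numbers.
CREATE TABLE dbo.UserCountSnapshot
(
    SnapshotDate datetime    NOT NULL,
    UserStatus   varchar(20) NOT NULL,
    UserCount    int         NOT NULL,
    CONSTRAINT PK_UserCountSnapshot PRIMARY KEY (SnapshotDate, UserStatus)
);

INSERT INTO dbo.UserCountSnapshot (SnapshotDate, UserStatus, UserCount)
SELECT DATEADD(DAY, DATEDIFF(DAY, 0, GETDATE()), 0),  -- today at midnight
       Status,
       COUNT(*)
FROM dbo.Users
GROUP BY Status;

A date-range query against the snapshot table then answers "how did active vs. inactive change over time" directly, and the table stays small enough to back up with the rest of the database.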

Another issue is how to query historical data. Using multiple databases to store historical snapshots makes it harder to query.

As you can see there are several design alternatives and I would like to know how this sort of problem is generally solved in the industry. Does SQL Server provide any support for solving this problem?

Thanks.

View 5 Replies View Related

Deriving Unique Rows From Historical Data

Oct 25, 2005

My application is to capture employee locations. Whenever an employee arrives at a location (whether it is arriving for work, or at one of the company's other sites) they scan the barcode on their employee badge. This writes a record to the tblTSCollected table (DDL and dummy data below).

The application needs to be able to display to staff in a control room the CURRENT location of each employee. From the data I've provided, this would be:

EMPLOYEE ID   LOCATION CODE
963           VB002
964           VB003
966           VB003
968           VB004
977           VB001
982           VB001

Note that, for example, Employee 963 had formerly been at VB001 but was more recently logged in at VB002, so therefore the application is not concerned with the earlier record.

What would also be particularly useful would be the NUMBER of staff at each location - viz.

LOCATION CODE   NUM STAFF
VB001           2
VB002           1
VB003           2
VB004           1

Can anyone help? Many thanks in advance. Edward

NOTES ON DDL:
THE BARCODE IS CAPTURED BECAUSE THE COMPANY MAY RE-USE BARCODE NUMBERS (WHICH IS DERIVED FROM THE EMPLOYEE PIN), SO THEREFORE THE BARCODE CANNOT BE RELIED UPON TO BE UNIQUE.
THE COLUMN fldRuleAppliedID IS NULL BECAUSE THAT PARTICULAR ROW HAS NOT BEEN PROCESSED. THERE ARE BUSINESS RULES CONCERNING EMPLOYEE HOURS WHICH OPERATE ON THIS DATA. ONCE A ROW HAS BEEN PROCESSED FOR UPLOADING TO THE PAYROLL APPLICATION, THE fldRuleAppliedID COLUMN WILL CONTAIN A VALUE. IN THE PRODUCTION SYSTEM, THEREFORE, ANY SQL AS REQUESTED ABOVE WILL CONTAIN IN ITS WHERE CLAUSE (fldRuleAppliedID IS NULL).

if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[tblTSCollected]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[tblTSCollected]
GO

CREATE TABLE [dbo].[tblTSCollected] (
[fldCollectedID] [int] IDENTITY (1, 1) NOT NULL ,
[fldEmployeeID] [int] NULL ,
[fldLocationCode] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[fldTimeStamp] [datetime] NULL ,
[fldRuleAppliedID] [int] NULL ,
[fldBarCode] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO

INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (963, 'VB001', '2005-10-18 11:59:27.383', 45480)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (963, 'VB002', '2005-10-18 12:06:17.833', 45480)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB001', '2005-10-18 12:56:20.690', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB002', '2005-10-18 15:30:35.117', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB003', '2005-10-18 16:05:05.880', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB001', '2005-10-18 11:52:28.307', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB002', '2005-10-18 13:59:34.807', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB001', '2005-10-18 14:04:55.820', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB003', '2005-10-18 16:10:01.943', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB001', '2005-10-18 11:59:34.307', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB002', '2005-10-18 12:04:56.037', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB004', '2005-10-18 12:10:02.723', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (977, 'VB001', '2005-10-18 12:05:06.630', 96879)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (982, 'VB001', '2005-10-18 12:06:13.787', 96697)
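A sketch of both result sets, assuming SQL Server 2005 or later for ROW_NUMBER (and adding the fldRuleAppliedID IS NULL filter in production, per the notes above):

-- Current location per employee: the most recent scan wins.
WITH LastScan AS
(
    SELECT fldEmployeeID,
           fldLocationCode,
           ROW_NUMBER() OVER (PARTITION BY fldEmployeeID
                              ORDER BY fldTimeStamp DESC) AS rn
    FROM dbo.tblTSCollected
    -- WHERE fldRuleAppliedID IS NULL   -- per the notes, add this in production
)
SELECT fldEmployeeID, fldLocationCode
FROM LastScan
WHERE rn = 1
ORDER BY fldEmployeeID;

-- Head count per location, based on the same "latest scan per employee" rows.
WITH LastScan AS
(
    SELECT fldEmployeeID,
           fldLocationCode,
           ROW_NUMBER() OVER (PARTITION BY fldEmployeeID
                              ORDER BY fldTimeStamp DESC) AS rn
    FROM dbo.tblTSCollected
)
SELECT fldLocationCode, COUNT(*) AS NumStaff
FROM LastScan
WHERE rn = 1
GROUP BY fldLocationCode
ORDER BY fldLocationCode;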

View 4 Replies View Related

Fixed V Changing V Historical Attribute Conflicts

Aug 14, 2007

We have an issue in an SCD where a number of records may be presented that have changes to attributes of EVERY type.

Example,


BusinessKey: xxxxxxxx
BuildingTypeId: 7
BusinessUnitHistoryId: 4019
BusinessUnitId: 4019
CurrencyId: 26
DevelopmentTypeId: 14
MarketId: 182
Name: abcdefgh

CurrencyId is a fixed attribute.
MarketId, BuildingTypeId, BusinessUnitId and BusinessUnitHistoryId are historical attributes.
Name is a changing attribute.

The behaviour of the ETL seems to suggest that if fixed attribute changes are detected, these rows will error and therefore the changing & historical attributes will NOT be amended during the SCD transformation. Is this correct... as it seems to be what is happening.

View 7 Replies View Related

Analysis :: Getting Historical Information From Dimension Table

Aug 17, 2015

Is it possible to design a cube with a dimension "DimEmployee" (SCD Type 2) and query the following: what "title" did the person "xyz" have at time "2015-01-01"? - See example data in the images.

I also have a DimTime with day granularity. The DimEmployee is not at day granularity.

View 9 Replies View Related

SQL DateTime Limit = Enemy Of Historical Based Websites

Jun 18, 2007

I'm trying to create a website based on interesting historical facts, searchable by date. I need each fact entry to have a datetime field in the database, but unfortunately this method only lets me go as far back as the year 1753 (for an inexplicable reason, I can go up to the year 9999). Is there something I'm overlooking here, or did Microsoft goof this one? Will I have to store my dates as 3 separate custom month/day/year fields? If anybody has a solution for this, please let me know (a B.C./A.D.-enabled solution isn't at all needed, but I'm open to that too).

View 1 Replies View Related

Historical Table - INSERT And SELECT With Linked Server

Aug 13, 2012

I have a historical table on a dedicated SQL Server (let's call it the reporting DB) that is populated every morning with production data that does not already exist in it. The data in the prod table is purged after 7 days and nothing is ever deleted from the historical table. I have set up the linked server between the two 2008 SQL Servers, but when I try to run this simple query from the reporting DB, it takes more than 5 minutes and is still "executing". I eventually have to cancel it:

-- INSERT INTO Temp_Import_historical
SELECT TOP 1 *
FROM [192.168.1.100].ProdDB.dbo.Temp_Import_historical a
WHERE NOT EXISTS (select [Temp_Import_ID] from Temp_Import_historical where a.[Temp_Import_ID] = Temp_Import_historical.[Temp_Import_ID])

I have omitted the INSERT statement on purpose, since I can't even get it to output 1 row. Why is this such a resource-intensive query?
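The NOT EXISTS against a four-part name tends to be executed as a row-by-row remote lookup (or a full pull of the remote table) - the optimizer can't do much across a linked server. A pattern that usually behaves better is to stage the remote rows locally in one pass and then do the existence check with both tables on the reporting server; a rough sketch (the temp table is just a placeholder name):

-- 1. Pull the remote rows across once, as a single streamed read.
SELECT *
INTO #staging
FROM [192.168.1.100].ProdDB.dbo.Temp_Import_historical;

-- 2. Both tables are now local, so NOT EXISTS can use an index on Temp_Import_ID.
INSERT INTO dbo.Temp_Import_historical
SELECT s.*
FROM #staging AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.Temp_Import_historical AS h
                  WHERE h.[Temp_Import_ID] = s.[Temp_Import_ID]);

Since prod only keeps 7 days, restricting step 1 to the last day or two of rows shrinks the transfer further, and an index on Temp_Import_ID in the historical table is what makes step 2 fast.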

View 9 Replies View Related

SQL Server 2012 :: How To Load Historical Data From Old System Into A New One

Aug 12, 2014

I want to load historical data from an old system into a new one. The thing is, the old system stored dates as Datetime and the new one uses DateTimeOffset.

All data was collected in the same Time Zone... but with the Daylight Saving Time (DST)

The offset is either +04:00 or +05:00, based on the calendar date. To add to the complexity, the rules for DST changed a couple of years ago.

To determine the offset, I'd need to know what was or would have been the server Timezone for each historical date.
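If the DST rules can be written down as date ranges, the conversion can be a single SELECT (or UPDATE) that computes the offset per row and attaches it with TODATETIMEOFFSET; a sketch where the table, column and boundary dates are all placeholders (the real change-over dates, including the rule change, go in the CASE):

-- Attach the historical offset to each legacy datetime value.
SELECT LegacyDate,
       TODATETIMEOFFSET(
           LegacyDate,
           CASE
               -- placeholder DST windows; replace with the real local rules
               WHEN LegacyDate >= '2011-03-27' AND LegacyDate < '2011-10-30' THEN '+05:00'
               WHEN LegacyDate >= '2010-03-28' AND LegacyDate < '2010-10-31' THEN '+05:00'
               ELSE '+04:00'
           END) AS LegacyDateWithOffset
FROM dbo.OldSystemData;

If there are many rule periods, a small calendar table of (RangeStart, RangeEnd, Offset) rows joined to the data keeps the rules out of the CASE expression.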

View 1 Replies View Related

SQL Server 2014 :: Combine Data From Historical Table?

Aug 13, 2014

Recently, I partitioned one of my largest tables into multiple monthly filegroups. For the current month, it is attached to my "Active" table. The older records are kept in the "historical" table. I need an efficient way to pull records when I have a date range that can be spread across both tables.
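The usual trick is a view that UNION ALLs the two tables, so a date-range query hits one object and the optimizer prunes whichever side (and, inside the historical table, whichever partitions) the range allows; a sketch with made-up names:

-- One logical table over the active and historical rows.
CREATE VIEW dbo.vOrders_All
AS
SELECT OrderId, OrderDate, Amount FROM dbo.Orders_Active
UNION ALL
SELECT OrderId, OrderDate, Amount FROM dbo.Orders_Historical;
GO

-- A range that straddles the month boundary just queries the view.
SELECT OrderId, OrderDate, Amount
FROM dbo.vOrders_All
WHERE OrderDate >= '2014-07-15' AND OrderDate < '2014-08-15';

Indexes on the date column in both tables are what keep this efficient; without them each side gets scanned.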

View 9 Replies View Related

T-SQL (SS2K8) :: Obtaining Counts From Historical Data After Given Date

May 12, 2015

I'm looking to get counts of historical records that fall on or after May 1 in any given year. I've got the total number of records for each year worked out, but am now looking for the number of records that exist after a specific date. Here's what I have so far.

SELECT p.FY10,p.FY11,p.FY12,p.FY13,p.FY14,p.FY15
FROM
(
SELECT COUNT(recordID) AS S,
CASE DateFY

[code]...
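Since the posted code is cut off, here is a generic version of the idea under assumed names (table Records, columns recordID and RecordDate): count only rows falling on or after May 1 of their own year, grouped by year.

-- Per year, the number of records dated on or after May 1 of that year.
SELECT YEAR(RecordDate) AS RecordYear,
       COUNT(recordID)  AS RecordsOnOrAfterMay1
FROM dbo.Records
WHERE RecordDate >= DATEADD(MONTH, 4, DATEADD(YEAR, DATEDIFF(YEAR, 0, RecordDate), 0))
GROUP BY YEAR(RecordDate)
ORDER BY RecordYear;

A PIVOT on RecordYear (or the CASE/SUM pattern already used in the snippet above) turns these rows back into the FY10..FY15 column layout.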

View 2 Replies View Related

Problem With Historical Prediction In Sales Forecast Model

Aug 2, 2006

Hi,

I have built a time series model to forecast sales value.

I have data from Jan 2004 to Jan 2006, and the sales value is at a day level in my database, but I am aggregating it to month level in the DSV of the mining model.

I am required to make only historical predictions using the above model, starting from Jan 2004 to Jan 2006, for every month.

I have set the HISTORIC_MODEL_COUNT and HISTORIC_MODEL_GAP parameter values to 24 and 10 respectively, and am trying to predict for the past few months (PredictTimeSeries(SalesValue, -1, 1)).

But it's throwing me the following error:

Error (Data Mining): A time series prediction was requested with a start time further in the past than the internal models of the mining model, Sales Forecast, specified in the HISTORIC_MODEL_GAP and HISTORIC_MODEL_COUNT parameters can process.

In fact it throws the above error irrespective of what the HISTORIC_MODEL_COUNT and HISTORIC_MODEL_GAP parameter values are.

I am not able to figure out why this problem is happening.

What should the parameter values be for the above scenario?

It would also be helpful if I could get an explanation of how these two parameters affect the historical predictions. I kind of understand that these two parameters are important for historical predictions, but I don't know why or how.

View 4 Replies View Related

Scd Type 2 Historical Component Not Working When Using Not Null In Columns

Aug 23, 2007



I have a timestamp column and a username column in my table, with default values of GETDATE() and SYSTEM_USER, and they are defined as NOT NULL.

Even though I do not use these columns in the SCD Type 2 wizard, if the columns are NOT NULL the SCD does not work, and if the columns are defined as nullable it does.

Can anyone please explain or help me with what might be the problem associated with this?

View 1 Replies View Related

Howto: Historical Prediction In Sales Forecast Model

Aug 3, 2006

Hello...

 

I am new to SSAS and I want to try to build a "Sales" model. I will have some "Usage" data for some timespans, but I am not quite sure how to tackle this. Is there a how-to for this somewhere?

Edit: There are several locations, and for each location a forecast is needed. And the icing on the cake would be if I were able to tell where my supplies must go first to achieve the best sales...

The potential Client wants to use Oracle but I would like to show them that SQL Server is the better tool for this ;)

View 1 Replies View Related

SQL Server 2014 :: Moving Old Data Out Into Newly Created Historical DB

Sep 29, 2015

I am getting ready to start a project where I am charged with moving out old data from production into a newly created historical DB. We have about 8 tables that are internal audit tables, that are big and full of old data. These tables are barely used and are taking up way too much space and time for maintenance.

I would like to create a way (SSIS?) to look at the date field in each of the 8 tables and copy out anything older than two years into my newly created history DB, then delete the older records from the source DB.

I don't know if SSIS is the best method to use. If it is, which containers should I use to move the data over, and then how do I do the delete from the source?

Can I do the mass deletes on my audit source tables without impacting performance/indexes/fragmentation?
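Plain T-SQL in an Agent job is usually enough for this - SSIS would just be wrapping the same two statements in an Execute SQL Task (a Data Flow only earns its keep if the history DB lives on a different server). A rough sketch for one audit table, with all names made up and the delete batched to limit blocking and log growth:

-- 1. Copy anything older than two years into the history database.
INSERT INTO HistoryDB.dbo.AuditTrail_History
SELECT *
FROM dbo.AuditTrail
WHERE AuditDate < DATEADD(YEAR, -2, GETDATE());

-- 2. Remove the copied rows from production in small batches.
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (50000)
    FROM dbo.AuditTrail
    WHERE AuditDate < DATEADD(YEAR, -2, GETDATE());

    SET @rows = @@ROWCOUNT;
END

The mass delete will touch every index on the table, so expect fragmentation and plan an index rebuild/reorganize (and a statistics update) afterwards; batching plus an index on the date column keeps the per-batch impact small.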

View 5 Replies View Related






