## Posts Tagged ‘ARMA’

### Forecast Friday Changes; Resumes February 3

January 17, 2011

We’re currently in the phase of the Forecast Friday series that discusses ARIMA models. This week’s post was to discuss the autoregressive (AR), moving average (MA), and autoregressive moving average (ARMA) models, and then the posts for the next three weeks would delve into ARIMA models. Given the complexity of the topic, along with an increasing client load at Analysights, I no longer have the time to cover this topic in the detail it requires. Therefore, I have decided to pull ARIMA out of the series. Forecast Friday will resume February 3, when we will begin our discussion of judgmental forecasting methods.

For those of you interested in learning about ARIMA, I invite you to check out some resources that have helped me through college and graduate school:

1. Introductory Business & Economic Forecasting, 2nd Edition. Newbold, P. and Bos, T., Chapter 7.
2. Forecasting Methods and Applications, 3rd Edition. Makridakis, S., Wheelwright, S., and Hyndman, R., Chapters 7-8.
3. Introducing Econometrics. Brown, W., Chapter 9.

I apologize for this inconvenience, and thank you for your understanding.

Alex

### Forecast Friday Topic: Double Exponential Smoothing

May 20, 2010

(Fifth in a series)

We pick up on our discussion of exponential smoothing methods, focusing today on double exponential smoothing. Single exponential smoothing, which we discussed in detail last week, is ideal when your time series is free of seasonal or trend components, which create patterns that your smoothing equation would miss due to lags. Single exponential smoothing produces forecasts that exceed actual results when the time series exhibits a decreasing linear trend, and forecasts that trail actual results when the time series exhibits an increasing trend. Double exponential smoothing takes care of this problem.

Two Smoothing Constants, Three Equations

Recall the equation for single exponential smoothing:

Ŷt+1 = αYt + (1 − α)Ŷt

Where: Ŷt+1 represents the forecast value for period t + 1

Yt is the actual value of the current period, t

Ŷt is the forecast value for the current period, t

and α is the smoothing constant, or alpha, 0 ≤ α ≤ 1
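In code, this recursion is only a few lines. The sketch below is my own illustration (the function and variable names are not from the original post), assuming the first forecast is seeded with a starting value you choose:

```python
def single_exponential_smoothing(y, alpha, initial_forecast):
    """One-step-ahead forecasts via single exponential smoothing.

    Implements the recursion: forecast for t+1 = alpha * Y_t + (1 - alpha) * forecast for t.
    Returns a list where element i is the forecast made for period i + 1.
    """
    forecasts = [initial_forecast]       # forecast for period 1 (chosen judgmentally)
    for actual in y:
        forecasts.append(alpha * actual + (1 - alpha) * forecasts[-1])
    return forecasts
```

For example, with α = 0.2, a starting forecast of 100, and actuals [100, 120], the forecasts come out to [100, 100.0, 104.0].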

To account for a trend component in the time series, double exponential smoothing incorporates a second smoothing constant, beta, or β. Now, three equations must be used to create a forecast: one to smooth the time series, one to smooth the trend, and one to combine the two equations to arrive at the forecast:

Ct = αYt + (1 − α)(Ct-1 + Tt-1)

Tt = β(Ct − Ct-1) + (1 − β)Tt-1

Ŷt+1 = Ct + Tt

All symbols from the single exponential smoothing equation retain their meaning in the double exponential smoothing equations. In addition, β is the trend-smoothing constant (whereas α is the smoothing constant for the stationary, or constant, process), also between 0 and 1; Ct is the smoothed constant-process value for period t; and Tt is the smoothed trend value for period t.

As with single exponential smoothing, you must select starting values for Ct and Tt, as well as values for α and β. Recall that these processes are judgmental, and constants closer to a value of 1.0 are chosen when less smoothing is desired (and more weight placed on recent values) and constants closer to 0.0 when more smoothing is desired (and less weight placed on recent values).
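The three equations translate directly into code. Here is a minimal Python sketch of the method (often called Holt's method); the names are my own, and it uses the same starting values as the example below, C2 = Y1 and T2 = Y2 − Y1:

```python
def double_exponential_smoothing(y, alpha, beta):
    """Double (Holt's) exponential smoothing with the text's initialization:
    C2 = Y1 and T2 = Y2 - Y1, so smoothing begins at period 3.

    Returns dicts keyed by 1-based period: smoothed constants C,
    smoothed trends T, and one-step-ahead forecasts.
    """
    c = {2: y[0]}                  # C2 = Y1
    t = {2: y[1] - y[0]}           # T2 = Y2 - Y1
    forecast = {3: c[2] + t[2]}    # forecast for period 3, made at end of period 2
    for p in range(3, len(y) + 1):
        c[p] = alpha * y[p - 1] + (1 - alpha) * (c[p - 1] + t[p - 1])
        t[p] = beta * (c[p] - c[p - 1]) + (1 - beta) * t[p - 1]
        forecast[p + 1] = c[p] + t[p]
    return c, t, forecast
```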

An Example

Let’s assume you’ve got 12 months of sales data, shown in the table below:

| Month t | Sales Yt |
|---------|----------|
| 1 | 152 |
| 2 | 176 |
| 3 | 160 |
| 4 | 192 |
| 5 | 220 |
| 6 | 272 |
| 7 | 256 |
| 8 | 280 |
| 9 | 300 |
| 10 | 280 |
| 11 | 312 |
| 12 | 328 |

You want to see if there is any discernible trend, so you plot your sales on the chart below:

The time series exhibits an increasing trend. Hence, you must use double exponential smoothing. You must first select your initial values for C and T. One way to do that is to again assume that the first value is equal to its forecast. Using that as the starting point, you set C2 = Y1, or 152. Then you subtract Y1 from Y2 to get T2: T2 = Y2 – Y1 = 24. Hence, at the end of period 2, your forecast for period 3 is 176 (Ŷ3 = 152 + 24).

Now you need to choose α and β. For the purposes of this example, we will choose an α of 0.20 and a β of 0.30. Actual sales in period 3 were 160, and our constant-smoothing equation is:

C3 = 0.20(160) + (1 – 0.20)(152 + 24)

= 32 + 0.80(176)

= 32 + 140.8

= 172.8

Next, we compute the trend value with our trend-smoothing equation:

T3 = 0.30(172.8 – 152) + (1 – 0.30)(24)

= 0.30(20.8) + 0.70(24)

= 6.24 + 16.8

= 23.04

Hence, our forecast for period 4 is:

Ŷ4 = 172.8 + 23.04

= 195.84

Then, carrying out your forecasts for the 12-month period, you get the following table:

With alpha = 0.2 and beta = 0.3:

| Month t | Sales Yt | Ct | Tt | Ŷt | Absolute Deviation |
|---------|----------|--------|-------|--------|------|
| 1 | 152 | | | | |
| 2 | 176 | 152.00 | 24.00 | 152.00 | |
| 3 | 160 | 172.80 | 23.04 | 176.00 | 16.00 |
| 4 | 192 | 195.07 | 22.81 | 195.84 | 3.84 |
| 5 | 220 | 218.31 | 22.94 | 217.88 | 2.12 |
| 6 | 272 | 247.39 | 24.78 | 241.24 | 30.76 |
| 7 | 256 | 268.94 | 23.81 | 272.18 | 16.18 |
| 8 | 280 | 290.20 | 23.05 | 292.75 | 12.75 |
| 9 | 300 | 310.60 | 22.25 | 313.25 | 13.25 |
| 10 | 280 | 322.28 | 19.08 | 332.85 | 52.85 |
| 11 | 312 | 335.49 | 17.32 | 341.36 | 29.36 |
| 12 | 328 | 347.85 | 15.83 | 352.81 | 24.81 |

MAD = 20.19

Notice a couple of things: the absolute deviation is the absolute value of the difference between Yt and Ŷt. Note also that, beginning with period 3, Ŷt is the sum of the C and T values computed in the previous period: period 3’s forecast was generated at the end of period 2, and so on through period 12. The Mean Absolute Deviation (MAD) has been computed for you. As with single exponential smoothing, you need to experiment with the smoothing constants to find the balance that yields the most accurate forecast, that is, the lowest possible MAD.

Now, we need to forecast for period 13. That’s easy. Add C12 and T12:

Ŷ13 = 347.85 + 15.83

= 363.68
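The whole worked example can be checked with a short script. This is an illustrative sketch rather than code from the original post; it reproduces the table’s MAD and the period-13 forecast:

```python
# Worked example from the text: alpha = 0.2, beta = 0.3,
# initialized with C_2 = Y_1 = 152 and T_2 = Y_2 - Y_1 = 24.
sales = [152, 176, 160, 192, 220, 272, 256, 280, 300, 280, 312, 328]
alpha, beta = 0.2, 0.3

c, t = sales[0], sales[1] - sales[0]   # C_2 and T_2
deviations = []
for period in range(3, 13):            # smooth periods 3 through 12
    actual = sales[period - 1]
    forecast = c + t                   # forecast made at the end of the prior period
    deviations.append(abs(actual - forecast))
    c_new = alpha * actual + (1 - alpha) * (c + t)
    t = beta * (c_new - c) + (1 - beta) * t
    c = c_new

mad = sum(deviations) / len(deviations)
forecast_13 = c + t                    # Ŷ13 = C12 + T12
print(round(mad, 2), round(forecast_13, 2))   # prints: 20.19 363.68
```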

And, your chart comparing actual vs. forecasted sales is:

As with single exponential smoothing, you see that your forecasted curve is smoother than your actual curve. Notice also how small the gaps are between the actual and forecasted curves. The fit’s not bad.

Exponential Smoothing Recap

Now let’s recap our discussion on exponential smoothing:

1. Exponential smoothing methods are recursive, that is, they rely on all observations in the time series. The weight on each observation diminishes exponentially the more distant in the past it is.
2. Smoothing constants are used to assign weights – between 0 and 1 – to the most recent observations. The closer the constant is to 0, the more smoothing that occurs and the lighter the weight assigned to the most recent observation; the closer the constant is to 1, the less smoothing that occurs and the heavier the weight assigned to the most recent observation.
3. When no discernible trend is exhibited in the data, single exponential smoothing is appropriate; when a trend is present in the time series, double exponential smoothing is necessary.
4. Exponential smoothing methods require you to generate starting forecasts for the first period in the time series. Choosing those initial forecasts, as well as the values of your smoothing constants – alpha and beta – is somewhat arbitrary. You need to base your judgments on your experience in the business, as well as some experimentation.
5. Exponential smoothing models do not forecast well when the time series pattern (e.g., level of sales) is suddenly, drastically, and permanently altered by some event or change of course or action. In these instances, a new model will be necessary.
6. Exponential smoothing methods are best used for short-term forecasting.
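A quick way to see point 1: unrolling the single-smoothing recursion shows that the observation k periods in the past carries weight α(1 − α)^k, which is why the weights are said to diminish exponentially. A small sketch of my own:

```python
alpha = 0.2
# Unrolling forecast_{t+1} = alpha*Y_t + (1-alpha)*forecast_t shows the
# observation k periods back carries weight alpha * (1 - alpha)**k.
weights = [alpha * (1 - alpha) ** k for k in range(5)]
print([round(w, 4) for w in weights])   # the weights shrink geometrically
```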

Next Week’s Forecast Friday Topic: Regression Analysis (Our Series within the Series!)

Next week, we begin a multi-week discussion of regression analysis. We will be setting up the next few weeks with a discussion of the principles of ordinary least squares regression (OLS), and then discussions of its use as a time-series forecasting approach, and later as a causal/econometric approach. During the course of the next few Forecast Fridays, we will discuss the issues that occur with regression: specification bias, autocorrelation, heteroscedasticity, and multicollinearity, to name a few. There will be some discussions on how to detect – and correct – these violations. Once the regression analysis miniseries is complete, we will be set up to discuss ARMA and ARIMA models, which will be written by guest bloggers who are well-experienced in those approaches. We know you’ll be very pleased with the weeks ahead!

Still don’t know why our Forecast Friday posts appear on Thursday? Find out at:

### Objective and Subjective Forecasting Approaches

May 3, 2010

(Second in a series)

Today we discuss the various categories of forecasting methods that are available to businesses.  Forecasting methods can be either objective (using quantitative approaches) or subjective (using more intuitive or qualitative approaches), depending on what data is available and the distance into the future for which a forecast is desired.  Forecasting approaches will typically be more objective for nearer term forecasting horizons and for events where there is plenty of quantitative data available.  More distant time periods, or events with a lack of historical quantitative data will often call for more subjective approaches.  We will discuss these two classes of forecasting methods, and the categories within each.

Objective Forecasting Approaches

Objective forecasting approaches are quantitative in nature and lend themselves well to an abundance of data.  There are three categories of objective forecasting methods: time series, causal/econometric,  and artificial intelligence.  AI approaches are outside my experience, so I won’t be covering them in this series, but mention them as another alternative, in case you wish to investigate them on your own.

Time Series Methods

Time series methods attempt to estimate future outcomes on the basis of historical data.  In many cases, prior sales of a product can be a good predictor of upcoming sales because of prior period marketing efforts, repeat business, brand awareness, and other factors.  When an analyst employs time series methods, he/she is assuming that the future will continue to look like the past.  In rapidly changing industries or environments, time series forecasts are not ideal, and may be useless.

Because time series data are historical, they exhibit four components that emerge over time: trend, seasonal, cyclical, and random (or irregular).  Before any forecasting is done on time series data, the data must be adjusted for each of these components.  Decomposing time series data will be discussed later in this series.

The most common time series methods include moving average (both straight and weighted), exponential smoothing, and regression analysis.  Each of these approaches will be discussed later in the series.

Causal/Econometric Methods

Causal or econometric forecasting methods attempt to predict outcomes based on changes in factors that are known – or believed – to impact those outcomes.  For example, temperature may be used to forecast sales of ice cream; advertising expenditures may be used to predict sales; or the unemployment rate might be used to forecast the incidence of crime in a neighborhood.  It is important to note, however, that just because a model finds two events that are correlated (e.g., occur together), it does not necessarily mean that one event has caused the other.

Regression analysis also falls under the causal/econometric umbrella, as it can be used to predict an outcome based on changes in other factors (e.g., SAT score may be used to measure likelihood of being accepted to a college).  Econometric forecasting methods include  Autoregressive Moving Average (ARMA) and Autoregressive Integrated Moving Average (ARIMA) models.  ARIMA was previously known as Box-Jenkins.  ARMA and ARIMA models are used in certain cases, but most of the time are unnecessary.  Although these two methods won’t be covered in much depth later in the series, there will be a brief description of them and when they are needed.

Subjective Forecasting Approaches

Subjective forecasts are more qualitative.  These approaches rely most heavily on judgment and educated guesses, since there is little data available for forecasting.  This is especially the case in long-range forecasting.  It’s easy to forecast next week’s sales of ice cream – and possibly even of individual flavors, since you’ll likely have months or years of past weekly ice cream sales data.  However, if you’re trying to get an idea of what ice cream consumption or flavor preferences will be 10 years from now, quantitative approaches will be of little use.  Changes in tastes, technology, and political, economic, and social factors occur and can dramatically alter the course of trends.  Hence, the opinion of subject matter experts is often called upon.  There is essentially only one category of subjective forecast approaches – and it is rightly called “Judgmental” forecasts.

Judgmental Methods

Judgmental forecasting methods rely heavily on expert opinion and educated guesses.  But just because they have little quantitative or objective basis doesn’t mean they should be dismissed or not measured for accuracy.  The most common types of judgmental forecasting methods are composite forecasts, extrapolation, surveys, the Delphi method, scenario writing, and simulation.  Each of these methods will be discussed in detail later in the series.

Introducing “Forecast Fridays” – ON THURSDAYS!!!

Beginning with part 3, which will discuss moving average forecasts, the forecasting series will begin posting weekly so that the remaining days of the week can still be devoted to other topics in the marketing research and analytics field.  The weekly post will be called “Forecast Friday.”  However, it will be posted every Thursday!  Why?  Find out in tomorrow’s post!