Posts Tagged ‘exponential smoothing’

Forecast Friday Topic: Double Exponential Smoothing

May 20, 2010

(Fifth in a series)

We pick up our discussion of exponential smoothing methods, focusing today on double exponential smoothing. Single exponential smoothing, which we discussed in detail last week, is ideal when your time series is free of seasonal or trend components, whose patterns the smoothing equation would miss because its forecasts lag. When the time series exhibits a decreasing linear trend, single exponential smoothing produces forecasts that exceed actual results; when the series exhibits an increasing trend, its forecasts trail actual results. Double exponential smoothing takes care of this problem.

Two Smoothing Constants, Three Equations

Recall the equation for single exponential smoothing:

Ŷt+1 = αYt + (1-α) Ŷt

Where: Ŷt+1 represents the forecast value for period t + 1

Yt is the actual value of the current period, t

Ŷt is the forecast value for the current period, t

and α is the smoothing constant, or alpha, 0 ≤ α ≤ 1

To account for a trend component in the time series, double exponential smoothing incorporates a second smoothing constant, beta, or β. Now, three equations must be used to create a forecast: one to smooth the time series, one to smooth the trend, and one to combine the two equations to arrive at the forecast:

Ct = αYt + (1 – α)(Ct-1 + Tt-1)

Tt = β(Ct – Ct-1) + (1 – β)Tt-1

Ŷt+1 = Ct + Tt

All symbols appearing in the single exponential smoothing equation represent the same quantities in the double exponential smoothing equations, but now β is the trend-smoothing constant (whereas α is the smoothing constant for the stationary – constant – process), also between 0 and 1; Ct is the smoothed constant-process value for period t; and Tt is the smoothed trend value for period t.

As with single exponential smoothing, you must select starting values for Ct and Tt, as well as values for α and β. Recall that these choices are judgmental: constants closer to 1.0 are chosen when less smoothing is desired (and more weight placed on recent values), and constants closer to 0.0 when more smoothing is desired (and less weight placed on recent values).
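
To make the three equations concrete, here is a minimal sketch in Python of one update step, assuming you already have the prior period’s smoothed level Ct-1 and trend Tt-1. The function and variable names are illustrative, not part of any standard library:

def smooth_step(y_t, c_prev, t_prev, alpha, beta):
    """One pass through the three double-smoothing equations.
    Returns the new level C_t, the new trend T_t, and the
    forecast for the next period, Yhat_(t+1) = C_t + T_t."""
    c_t = alpha * y_t + (1 - alpha) * (c_prev + t_prev)   # smooth the constant process
    t_t = beta * (c_t - c_prev) + (1 - beta) * t_prev     # smooth the trend
    return c_t, t_t, c_t + t_t                            # forecast = C_t + T_t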

An Example

Let’s assume you’ve got 12 months of sales data, shown in the table below:

Month t    Sales Yt
1          152
2          176
3          160
4          192
5          220
6          272
7          256
8          280
9          300
10         280
11         312
12         328

You want to see if there is any discernible trend, so you plot your sales on the chart below:

The time series exhibits an increasing trend. Hence, you must use double exponential smoothing. You must first select your initial values for C and T. One way to do that is to again assume that the first value is equal to its forecast. Using that as the starting point, you set C2 = Y1, or 152. Then you subtract Y1 from Y2 to get T2: T2 = Y2 – Y1 = 24. Hence, at the end of period 2, your forecast for period 3 is 176 (Ŷ3 = 152 + 24).

Now you need to choose α and β. For the purposes of this example, we will choose an α of 0.20 and a β of 0.30. Actual sales in period 3 were 160, and our constant-smoothing equation is:

C3 = 0.20(160) + (1 – 0.20)(152 + 24)

= 32 + 0.80(176)

= 32 + 140.8

= 172.8

Next, we compute the trend value with our trend-smoothing equation:

T3 = 0.30(172.8 – 152) + (1 – 0.30)(24)

= 0.30(20.8) + 0.70(24)

= 6.24 + 16.8

=23.04

Hence, our forecast for period 4 is:

Ŷ4 = 172.8 + 23.04

= 195.84

Then, carrying out your forecasts for the 12-month period, you get the following table:

     

Alpha = 0.2, Beta = 0.3

Month t   Sales Yt   Ct       Tt      Ŷt       Absolute Deviation
1         152        –        –       –        –
2         176        152.00   24.00   152.00   –
3         160        172.80   23.04   176.00   16.00
4         192        195.07   22.81   195.84   3.84
5         220        218.31   22.94   217.88   2.12
6         272        247.39   24.78   241.24   30.76
7         256        268.94   23.81   272.18   16.18
8         280        290.20   23.05   292.75   12.75
9         300        310.60   22.25   313.25   13.25
10        280        322.28   19.08   332.85   52.85
11        312        335.49   17.32   341.36   29.36
12        328        347.85   15.83   352.81   24.81

MAD = 20.19

Notice a couple of things. The absolute deviation is the absolute value of the difference between Yt and Ŷt. Note also that, beginning with period 3, Ŷt is really the sum of the C and T computed in the previous period: period 3’s constant and trend forecasts were generated at the end of period 2, and so on through period 12. The Mean Absolute Deviation (MAD) has been computed for you. As with single exponential smoothing, you need to experiment with the smoothing constants to find the combination that produces the most accurate forecast – that is, the lowest possible MAD.
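
If you would rather let a script do the bookkeeping, the short Python sketch below reproduces the table above under the same assumptions – C2 = Y1, T2 = Y2 – Y1, α = 0.20, β = 0.30, with deviations measured from period 3 onward. The variable names are illustrative:

sales = [152, 176, 160, 192, 220, 272, 256, 280, 300, 280, 312, 328]
alpha, beta = 0.20, 0.30

c, t = sales[0], sales[1] - sales[0]           # C2 = Y1 = 152, T2 = Y2 - Y1 = 24
abs_devs = []

for y in sales[2:]:                            # periods 3 through 12
    forecast = c + t                           # Yhat for this period, made last period
    abs_devs.append(abs(y - forecast))
    c_new = alpha * y + (1 - alpha) * (c + t)  # smooth the constant process
    t = beta * (c_new - c) + (1 - beta) * t    # smooth the trend
    c = c_new

print(round(sum(abs_devs) / len(abs_devs), 2))  # MAD, roughly 20.19
print(round(c + t, 2))                          # period-13 forecast, roughly 363.68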

Now, we need to forecast for period 13. That’s easy. Add C12 and T12:

Ŷ13 = 347.85 + 15.83

= 363.68

And, your chart comparing actual vs. forecasted sales is:

As with single exponential smoothing, you see that your forecasted curve is smoother than your actual curve. Notice also how small the gaps are between the actual and forecasted curves. The fit’s not bad.

Exponential Smoothing Recap

Now let’s recap our discussion on exponential smoothing:

  1. Exponential smoothing methods are recursive, that is, they rely on all observations in the time series. The weight on each observation diminishes exponentially the more distant in the past it is.
  2. Smoothing constants are used to assign weights – between 0 and 1 – to the most recent observations. The closer the constant is to 0, the more smoothing that occurs and the lighter the weight assigned to the most recent observation; the closer the constant is to 1, the less smoothing that occurs and the heavier the weight assigned to the most recent observation.
  3. When no discernible trend is exhibited in the data, single exponential smoothing is appropriate; when a trend is present in the time series, double exponential smoothing is necessary.
  4. Exponential smoothing methods require you to generate starting forecasts for the first period in the time series. Deciding on those initial forecasts, as well as on the values of your smoothing constants – alpha and beta – is largely arbitrary. You need to base your judgments on your experience in the business, as well as some experimentation.
  5. Exponential smoothing models do not forecast well when the time series pattern (e.g., level of sales) is suddenly, drastically, and permanently altered by some event or change of course or action. In these instances, a new model will be necessary.
  6. Exponential smoothing methods are best used for short-term forecasting.

Next Week’s Forecast Friday Topic: Regression Analysis (Our Series within the Series!)

Next week, we begin a multi-week discussion of regression analysis. We will be setting up the next few weeks with a discussion of the principles of ordinary least squares regression (OLS), and then discussions of its use as a time-series forecasting approach, and later as a causal/econometric approach. During the course of the next few Forecast Fridays, we will discuss the issues that occur with regression: specification bias, autocorrelation, heteroscedasticity, and multicollinearity, to name a few. There will be some discussions on how to detect – and correct – these violations. Once the regression analysis miniseries is complete, we will be set up to discuss ARMA and ARIMA models, which will be written by guest bloggers who are well-experienced in those approaches. We know you’ll be very pleased with the weeks ahead!

Still don’t know why our Forecast Friday posts appear on Thursday? Find out at: http://tinyurl.com/26cm6ma

Forecast Friday Topic: Exponential Smoothing Methods

May 13, 2010

(Fourth in a series)

In last week’s Forecast Friday post, we discussed moving average forecasting methods, both simple and weighted. When a time series is stationary – that is, it exhibits no discernible trend or seasonality and is subject only to the randomness of everyday existence – then moving average methods, or even a simple average of the entire series, are useful for forecasting the next few periods. However, most time series are anything but stationary: retail sales have trend, seasonal, and cyclical elements, while public utilities have trend and seasonal components that impact the usage of electricity and heat. Hence, moving average forecasting approaches may provide less than desirable results. Moreover, the most recent sales figures typically are more indicative of future sales, so there is often a need for a forecasting system that places greater weight on more recent observations. Enter exponential smoothing.

Unlike moving average models, which use a fixed number of the most recent values in the time series for smoothing and forecasting, exponential smoothing incorporates all values in the time series, placing the heaviest weight on the current data and weights on older observations that diminish exponentially over time. Because of this emphasis on all previous periods in the data set, the exponential smoothing model is recursive. When a time series exhibits no strong or discernible seasonality or trend, the simplest form of exponential smoothing – single exponential smoothing – can be applied. The formula for single exponential smoothing is:

Ŷt+1 = αYt + (1-α) Ŷt

In this equation, Ŷt+1 represents the forecast value for period t + 1; Yt is the actual value of the current period, t; Ŷt is the forecast value for the current period, t; and α is the smoothing constant, or alpha, a number between 0 and 1. Alpha is the weight you assign to the most recent observation in your time series. Essentially, you are basing your forecast for the next period on the actual value for this period, and the value you forecasted for this period, which in turn was based on forecasts for periods before that.
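
Before working through an example, here is a minimal sketch of this recursion in Python. The function name, and the choice to seed the first forecast with the first actual value (a common convention discussed below), are illustrative assumptions rather than part of the formula:

def single_exponential_smoothing(y, alpha):
    """Return the one-step-ahead forecasts Yhat_1 through Yhat_(n+1).
    The forecast for period 1 is set equal to the first actual value."""
    forecasts = [y[0]]                        # starting forecast
    for actual in y:
        forecasts.append(alpha * actual + (1 - alpha) * forecasts[-1])
    return forecasts                          # last entry forecasts period n+1

# e.g. single_exponential_smoothing([200, 215, 210], 0.5) -> [200, 200.0, 207.5, 208.75]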

Let’s assume you’ve been in business for 10 weeks and want to forecast sales for the 11th week. Sales for those first 10 weeks are:

Week (t)   Sales (Yt)
1          200
2          215
3          210
4          220
5          230
6          220
7          235
8          215
9          220
10         210

From the equation above, you know that in order to come up with a forecast for week 11, you need forecasted values for weeks 10, 9, and all the way down to week 1. You also know that week 1 does not have any preceding period, so it cannot be forecast. And you need to determine the smoothing constant, or alpha, to use for your forecasts.

Determining the Initial Forecast

The first step in constructing your exponential smoothing model is to generate a forecast value for the first period in your time series. The most common practice is to set the forecasted value of week 1 equal to the actual value, 200, which we will do in our example. Another approach: if you have sales data from before the period being modeled that you are not using to build the model, you might take an average of a couple of those immediately prior periods and use that as the initial forecast. How you determine your initial forecast is subjective.

How Big Should Alpha Be?

This too is a judgment call, and finding the appropriate alpha is subject to trial and error. Generally, if your time series is very stable, a small α is appropriate. Visual inspection of your sales on a graph is also useful in trying to pinpoint an alpha to start with. Why is the size of α important? Because the closer α is to 1, the more weight that is assigned to the most recent value in determining your forecast, the more rapidly your forecast adjusts to patterns in your time series and the less smoothing that occurs. Likewise, the closer α is to 0, the more weight that is placed on earlier observations in determining the forecast, the more slowly your forecast adjusts to patterns in the time series, and the more smoothing that occurs. Let’s visually inspect the 10 weeks of sales:

The Exponential Smoothing Process

The sales appear somewhat jagged, oscillating between 200 and 235. Let’s start with an alpha of 0.5. That gives us the following table:

Week (t)   Sales (Yt)   Forecast for This Period (Ŷt)
1          200          200.0
2          215          200.0
3          210          207.5
4          220          208.8
5          230          214.4
6          220          222.2
7          235          221.1
8          215          228.0
9          220          221.5
10         210          220.8

Notice how, even though your forecasts aren’t precise, when your actual value for a particular week is higher than what you forecasted (weeks 2 through 5, for example), your forecasts for each of the subsequent weeks (weeks 3 through 6) adjust upward; when your actual values are lower than your forecast (e.g., weeks 6, 8, 9, and 10), your forecasts for the following weeks adjust downward. Also notice that, as you move to later periods, your earlier forecasts play less and less of a role in your later forecasts, as their weight diminishes exponentially. Just by looking at the table above, you know that the forecast for week 11 will be lower than 220.8, your forecast for week 10:

Ŷ11 = 0.5Y10 + (1-0.5) Ŷ10

= 0.5(210) + 0.5(220.8)

= 105 + 110.4

= 215.4

So, based on our alpha and our past sales, our best guess is that sales in week 11 will be 215.4. Take a look at the graph of actual vs. forecasted sales for weeks 1-10:

Notice that the forecasted sales are smoother than actual, and you can see how the forecasted sales line adjusts to spikes and dips in the actual sales time series.
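
The forecast column in the table above, and the week 11 forecast we just computed, can be reproduced with a short Python sketch (the variable names are illustrative):

sales = [200, 215, 210, 220, 230, 220, 235, 215, 220, 210]
alpha = 0.5

forecast = sales[0]                  # week 1: forecast set equal to actual
for week, actual in enumerate(sales, start=1):
    print(week, actual, round(forecast, 1))
    forecast = alpha * actual + (1 - alpha) * forecast

print("Week 11 forecast:", round(forecast, 1))   # roughly 215.4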

What if we Had Used a Smaller or Larger Alpha?

We’ll demonstrate by using both an alpha of .30 and one of .70. That gives us the following table and graph:

Week (t)   Sales (Yt)   Forecast α=0.50   Forecast α=0.30   Forecast α=0.70
1          200          200.0             200.0             200.0
2          215          200.0             200.0             200.0
3          210          207.5             204.5             210.5
4          220          208.8             206.2             210.2
5          230          214.4             210.3             217.0
6          220          222.2             216.2             226.1
7          235          221.1             217.3             221.8
8          215          228.0             222.6             231.1
9          220          221.5             220.4             219.8
10         210          220.8             220.2             219.9

As you can see, the smaller the α, the smoother the curve for forecasted sales; the larger the α, the bumpier the curve, as you can see as you move from .30 to .50 to .70. Notice how much faster an α of .70 adjusts to the actual sales than the smaller α’s. The forecasts for week 11 would be 217.2 with an α=.30 and 213 with an α=.70.

Which α is best?

As with moving average models, the Mean Absolute Deviation (MAD) can be used to determine which alpha best fits the data. The MADs for each alpha are computed below:

Absolute Deviations by Week

Week    α=.30   α=.50   α=.70
1       –       –       –
2       15.0    15.0    15.0
3       5.5     2.5     0.5
4       13.9    11.3    9.8
5       19.7    15.6    13.0
6       3.8     2.2     6.1
7       17.7    13.9    13.2
8       7.6     13.0    16.1
9       0.4     1.5     0.2
10      10.2    10.8    9.9
MAD=    9.4     8.6     8.4

Using an alpha of 0.70, we end up with the lowest MAD of the three constants. Keep in mind that judging the dependability of forecasts isn’t always about minimizing MAD. MAD, after all, is an average of deviations. Notice how dramatically the absolute deviations for each of the alphas change from week to week. Forecasts might be more reliable using an alpha that produces a higher MAD, but has less variance among its individual deviations.
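
Here is a sketch of that MAD comparison in Python. To match the table above, the average is taken over all ten weeks, with week 1 contributing a deviation of zero because its forecast was set equal to its actual value; the names are illustrative:

sales = [200, 215, 210, 220, 230, 220, 235, 215, 220, 210]

for alpha in (0.30, 0.50, 0.70):
    forecast = sales[0]                          # week 1 forecast = week 1 actual
    abs_devs = []
    for actual in sales:
        abs_devs.append(abs(actual - forecast))  # week 1 deviation is 0
        forecast = alpha * actual + (1 - alpha) * forecast
    print(alpha, round(sum(abs_devs) / len(abs_devs), 1))  # roughly 9.4, 8.6, 8.4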

Limits on Exponential Smoothing

Exponential smoothing is not intended for long-term forecasting. Usually it is used to predict one or two, but rarely more than three periods ahead. Also, if there is a sudden drastic change in the level of sales or values, and the time series continues at that new level, then the algorithm will be slow to catch up with the sudden change. Hence, there will be greater forecasting error. In situations like that, it would be best to ignore the previous periods before the change, and begin the exponential smoothing process with the new level. Finally, this post discussed single exponential smoothing, which is used when there is no noticeable seasonality or trend in the data. When there is a noticeable trend or seasonal pattern in the data, single exponential smoothing will yield significant forecast error. Double exponential smoothing is needed here to adjust for those patterns. We will cover double exponential smoothing in next week’s Forecast Friday post.

Still don’t know why our Forecast Friday posts appear on Thursday? Find out at: http://tinyurl.com/26cm6ma
A Little Housekeeping

May 12, 2010

This is just a housekeeping post.

Contact Us form glitch on Analysights Website corrected

If you had gone to the Contact Us page on our Website, filled out the form for more information, and then clicked “Submit”, your form would have bounced back, because of an email glitch.  That has now been corrected, so if you wish to contact us, please return to the Contact Us Page and resubmit your form.  We apologize for this inconvenience.

Next Forecast Friday Topic: Exponential Smoothing Methods

On Thursday, we deliver our fourth post in the Forecast Friday series. The topic will be exponential smoothing, an approach that places the highest weight on the most recent observation and (exponentially) decreasing weights on each earlier observation when making forecasts. Exponential smoothing is a little more challenging than the simple moving average methods discussed in last week’s post, but the results are often superior. We hope you’ll find the discussion on exponential smoothing helpful, informative, and actionable.

Forecast Friday Topic: Moving Average Methods

May 6, 2010

(Third in a series)

One of the easiest, most common time series forecasting techniques is that of the moving average.  Moving average methods come in handy if all you have is several consecutive periods of the variable (e.g., sales, new savings accounts opened, workshop attendees, etc.) you’re forecasting, and no other data to predict what the next period’s value will be.  Often, using the past few months of sales to predict the coming month’s sales is preferable to unaided estimates.  However, moving average methods can have serious forecasting errors if applied carelessly.

Moving Averages: The Method

Essentially, moving averages try to estimate the next period’s value by averaging the value of the last couple of periods immediately prior.  Let’s say that you have been in business for three months, January through March, and wanted to forecast April’s sales.  Your sales for the last three months look like this:

Month      Sales ($000)
January    129
February   134
March      122

The simplest approach would be to take the average of January through March and use that to estimate April’s sales:

 (129 + 134 + 122)/3 = $128.333

Hence, based on the sales of January through March, you predict that sales in April will be $128,333.   Once April’s actual sales come in, you would then compute the forecast for May, this time using February through April. You must be consistent with the number of periods you use for moving average forecasting.

The number of periods you use in your moving average forecasts is arbitrary; you may use only two periods, or five or six – whatever you desire – to generate your forecasts.

The approach above is a simple moving average. Sometimes, more recent months’ sales may be stronger influencers of the coming month’s sales, so you want to give those nearer months more weight in your forecast model. This is a weighted moving average. And just like the number of periods, the weights you assign are purely arbitrary. Let’s say you wanted to give March’s sales 50% weight, February’s 30% weight, and January’s 20%. Then your forecast for April will be $127,000 [(122*.50) + (134*.30) + (129*.20) = 127].
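
Both the simple and the weighted forecasts are easy to verify in a few lines of Python (sales are in $000; the names are illustrative):

sales = {"January": 129, "February": 134, "March": 122}

simple_forecast = sum(sales.values()) / len(sales)        # 128.333 -> $128,333

weights = {"March": 0.50, "February": 0.30, "January": 0.20}
weighted_forecast = sum(sales[m] * w for m, w in weights.items())  # 127.0 -> $127,000

print(round(simple_forecast, 3), weighted_forecast)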
 

Limitations of Moving Average Methods

Moving averages are considered a “smoothing” forecast technique. Because you’re taking an average over time, you are softening (or smoothing out) the effects of irregular occurrences within the data. As a result, the effects of seasonality, business cycles, and other random events can dramatically increase forecast error. Take a look at a full year’s worth of data, and compare a 3-period moving average and a 5-period moving average:

Month       Sales ($000)   3-Mo. Moving Average   5-Mo. Moving Average
January     129            –                      –
February    134            128.3                  –
March       122            127.0                  128.2
April       125            126.0                  129.8
May         131            131.0                  128.6
June        137            132.0                  130.4
July        128            132.0                  129.2
August      131            126.0                  127.8
September   119            124.7                  126.0
October     124            123.7                  127.6
November    128            129.3                  –
December    136            –                      –

Notice that in this instance I did not create forecasts, but rather centered the moving averages. The first 3-month moving average is for February, and it is the average of January, February, and March. I did the same for the 5-month average. Now take a look at the following chart:


What do you see? Isn’t the three-month moving average series much smoother than the actual sales series? And the five-month moving average is smoother still. The more periods you use in your moving average, the smoother your time series – so, for forecasting, a simple moving average may not be the most accurate method. Moving average methods do prove quite valuable, however, when you’re trying to extract the seasonal, irregular, and cyclical components of a time series for more advanced forecasting methods, like regression and ARIMA; the use of moving averages in decomposing a time series will be addressed later in the series.
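
For reference, here is a short Python sketch of the centered moving averages in the table above (the function name is illustrative):

sales = [129, 134, 122, 125, 131, 137, 128, 131, 119, 124, 128, 136]

def centered_moving_average(values, window):
    """Average each value with its neighbors; the result is centered on the
    middle period, so the first and last window // 2 periods are undefined."""
    half = window // 2
    return [sum(values[i - half:i + half + 1]) / window
            for i in range(half, len(values) - half)]

print([round(x, 1) for x in centered_moving_average(sales, 3)])  # 128.3, 127.0, ...
print([round(x, 1) for x in centered_moving_average(sales, 5)])  # 128.2, 129.8, ...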

Determining the Accuracy of a Moving Average Model

 Generally, you want a forecasting method that has the least error between actual and predicted results. One of the most common measures of forecast accuracy is the Mean Absolute Deviation (MAD). In this approach, for each period in the time series for which you generated a forecast, you take the absolute value of the difference between that period’s actual and forecasted values (the deviation). Then you average those absolute deviations and you get a measure of MAD. MAD can be helpful in deciding on the number of periods you average, and/or the amount of weight you place on each period. Generally, you pick the one that results in the lowest MAD. Here’s an example of how MAD is calculated:

Month      Actual   3-Mo. Forecast   Deviation   Absolute Deviation
January    135      127              (8)         8
February   134      135              1           1
March      125      128              3           3

MAD = 4

MAD is simply the average of 8, 1, and 3.
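
In Python, the same MAD calculation is a couple of lines (the numbers come straight from the table above):

actuals   = [135, 134, 125]
forecasts = [127, 135, 128]

abs_devs = [abs(a - f) for a, f in zip(actuals, forecasts)]   # 8, 1, 3
print(sum(abs_devs) / len(abs_devs))                          # MAD = 4.0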

Moving Averages: Recap
When using moving averages for forecasting, remember:

  1. Moving averages can be simple or weighted;
  2. The number of periods you use for your average, and any weights you assign to each are strictly arbitrary;
  3. Moving averages smooth out irregular patterns in time series data; the larger the number of periods used for each data point, the greater the smoothing effect;
  4. Because of smoothing, forecasting next month’s sales based on the most recent few months’ sales can result in large deviations caused by seasonal, cyclical, and irregular patterns in the data; and
  5. The smoothing capabilities of a moving average method can be useful in decomposing a time series for more advanced forecasting methods.

Next Week: Exponential Smoothing
In next week’s Forecast Friday, we will discuss exponential smoothing methods, and you will see that they can be far superior to moving average forecasting methods.

Still don’t know why our Forecast Friday posts appear on Thursday? Find out at: http://tinyurl.com/26cm6ma

Objective and Subjective Forecasting Approaches

May 3, 2010

(second in a series)

Today we discuss the various categories of forecasting methods that are available to businesses.  Forecasting methods can be either objective (using quantitative approaches) or subjective (using more intuitive or qualitative approaches), depending on what data is available and the distance into the future for which a forecast is desired.  Forecasting approaches will typically be more objective for nearer term forecasting horizons and for events where there is plenty of quantitative data available.  More distant time periods, or events with a lack of historical quantitative data will often call for more subjective approaches.  We will discuss these two classes of forecasting methods, and the categories within each.

Objective Forecasting Approaches  

Objective forecasting approaches are quantitative in nature and lend themselves well to an abundance of data.  There are three categories of objective forecasting methods: time series, causal/econometric,  and artificial intelligence.  AI approaches are outside my experience, so I won’t be covering them in this series, but mention them as another alternative, in case you wish to investigate them on your own. 

Time Series Methods

Time series methods attempt to estimate future outcomes on the basis of historical data.  In many cases, prior sales of a product can be a good predictor of upcoming sales because of prior period marketing efforts, repeat business, brand awareness, and other factors.  When an analyst employs time series methods, he/she is assuming that the future will continue to look like the past.  In rapidly changing industries or environments, time series forecasts are not ideal, and may be useless.

Because time series data are historical, they exhibit four components that emerge over time: trend, seasonal, cyclical, and random (or irregular).  Before any forecasting is done on time series data, the data must be adjusted for each of these components.  Decomposing time series data will be discussed later in this series. 

The most common time series methods include moving average (both straight and weighted), exponential smoothing, and regression analysis.  Each of these approaches will be discussed later in the series.

Causal/Econometric Methods

Causal or econometric forecasting methods attempt to predict outcomes based on changes in factors that are known – or believed – to impact those outcomes.  For example, temperature may be used to forecast sales of ice cream; advertising expenditures may be used to predict sales; or the unemployment rate might be used to forecast the incidence of crime in a neighborhood.  It is important to note, however, that just because a model finds two events that are correlated (e.g., occur together), it does not necessarily mean that one event has caused the other.

Regression analysis also falls under the causal/econometric umbrella, as it can be used to predict an outcome based on changes in other factors (e.g., SAT score may be used to measure likelihood of being accepted to a college).  Econometric forecasting methods include  Autoregressive Moving Average (ARMA) and Autoregressive Integrated Moving Average (ARIMA) models.  ARIMA was previously known as Box-Jenkins.  ARMA and ARIMA models are used in certain cases, but most of the time are unnecessary.  Although these two methods won’t be covered in much depth later in the series, there will be a brief description of them and when they are needed.

Subjective Forecasting Approaches

Subjective forecasts are more qualitative.  These approaches rely most heavily on judgment and educated guesses, since there is little data available for forecasting.  This is especially the case in long-range forecasting.  It’s easy to forecast next week’s sales of ice cream – and possibly even of individual flavors, since you’ll likely have months or years of past weekly ice cream sales data.  However, if you’re trying to get an idea of what ice cream consumption or flavor preferences will be 10 years from now, quantitative approaches will be of little use.  Changes in tastes, technology, and political, economic, and social factors occur and can dramatically alter the course of trends.  Hence, the opinion of subject matter experts is often called upon.  There is essentially only one category of subjective forecast approaches – and it is rightly called “Judgmental” forecasts.

Judgmental Methods

Judgmental forecasting methods rely heavily on expert opinion and educated guesses. But just because they have little quantitative or objective basis doesn’t mean they should be dismissed or not measured for accuracy. The most common types of judgmental forecasting methods are composite forecasts, extrapolation, surveys, the Delphi method, scenario writing, and simulation. Each of these methods will be discussed in detail later in the series.

Introducing “Forecast Fridays” – ON THURSDAYS!!!

Beginning with part 3, which will discuss moving average forecasts, the forecasting series will begin posting weekly so that the remaining days of the week can still be devoted to other topics in the marketing research and analytics field.  The weekly post will be called “Forecast Friday.”  However, it will be posted every Thursday!  Why?  Find out in tomorrow’s post!

