Smoothing with Exponentially Weighted Moving Averages

A moving average takes a noisy time series and replaces each value with the average value of a neighborhood about the given value. This neighborhood may consist of purely historical data, or it may be centered about the given value. Furthermore, the values in the neighborhood may be weighted using different sets of weights. Here is an example of an equally weighted three point moving average, using historical data,

(1)   \begin{align*}
        s_{0} &= x_{0} \\
        s_{1} &= \dfrac{x_{0}+x_{1}}{2} \\
        s_{t} &= \dfrac{x_{t-2}+x_{t-1}+x_{t}}{3}, \quad \text{for } t > 1.
      \end{align*}
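For concreteness, equation (1) can be coded directly as a short sketch (the function name is mine):

```python
import numpy as np

def three_point_moving_average(x):
    """Equally weighted three-point moving average over historical data,
    following equation (1): the first two outputs use shorter averages."""
    s = np.empty_like(x, dtype=float)
    s[0] = x[0]
    if x.size > 1:
        s[1] = (x[0] + x[1]) / 2.0
    for t in range(2, x.size):
        s[t] = (x[t-2] + x[t-1] + x[t]) / 3.0
    return s

x = np.array([1.0, 2.0, 6.0, 4.0, 5.0])
three_point_moving_average(x)  # [1.0, 1.5, 3.0, 4.0, 5.0]
```

Note how the spike at 6.0 is spread over the three outputs whose trailing window contains it, and vanishes completely afterward.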

Here, s_{t} represents the smoothed signal, and x_{t} represents the noisy time series. In contrast to simple moving averages, an exponentially weighted moving average (EWMA) adjusts a value according to an exponentially weighted sum of all previous values. This is the basic idea,

(2)   \begin{align*}
        s_{0} &= x_{0} \\
        s_{t} &= \alpha x_{t} + ( 1 - \alpha ) s_{t-1}, \quad \text{for } t > 0.
      \end{align*}

This is nice because you don’t have to worry about choosing a three-point window versus a five-point window, or about the appropriateness of your weighting scheme. With the EWMA, previous perturbations are “remembered” and “slowly forgotten” by the s_{t-1} term in the last equation, whereas with a window or neighborhood with discrete boundaries, a perturbation is forgotten as soon as it passes out of the window.
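Equation (2) is just a one-line recursion; here is a minimal sketch of it (the function name is mine) that makes the “slowly forgotten” behavior visible:

```python
import numpy as np

def ewma_basic(x, alpha):
    """EWMA of equation (2): each output blends the newest sample with the
    previous smoothed value, so old perturbations decay geometrically."""
    s = np.empty_like(x, dtype=float)
    s[0] = x[0]
    for t in range(1, x.size):
        s[t] = alpha * x[t] + (1 - alpha) * s[t-1]
    return s

x = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
ewma_basic(x, alpha=0.5)  # [0.0, 0.0, 0.5, 0.25, 0.125]
```

The spike at t = 2 never fully disappears; it just keeps halving, unlike the hard cutoff of a finite window.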

Averaging the EWMA to Accommodate Trends

After reading about EWMAs in a data analysis book, I went along happily using this tool on every smoothing application that I came across. It was not until later that I learned that the EWMA is really only appropriate for stationary data, i.e., data without trends or seasonality. In particular, the EWMA resists trends away from the current mean that it has already “seen”. So, if you have a noisy hat function that goes from 0, to 1, and then back to 0, then the EWMA will return low values on the up-hill side, and high values on the down-hill side. One way to circumvent this is to smooth the signal in both directions — marching forward, then marching backward — and then average the two. Here, we will use the EWMA functionality provided by the pandas module.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# pandas.stats.moments.ewma has been removed from pandas;
# the modern equivalent is the Series.ewm(...).mean() method
def ewma(x, span):
    return pd.Series(x).ewm(span=span).mean().to_numpy()

# make a hat function, and add noise
x = np.linspace(0, 1, 100)
x = np.hstack((x, x[::-1]))
x += np.random.normal(loc=0, scale=0.1, size=200)
plt.plot(x, alpha=0.4, label='Raw')

# take the EWMA in both directions with a smaller span term
fwd = ewma(x, span=15)           # EWMA in the forward direction
bwd = ewma(x[::-1], span=15)     # EWMA in the backward direction
c = np.vstack((fwd, bwd[::-1]))  # lump fwd and bwd together
c = np.mean(c, axis=0)           # average the two passes

# regular EWMA, with bias against trend
plt.plot(ewma(x, span=20), 'b', label='EWMA, span=20')

# "corrected" (?) EWMA
plt.plot(c, 'r', label='Reversed-Recombined')

plt.legend(loc='lower center')
plt.savefig('ewma_correction.png', dpi=100)

Holt-Winters Second Order EWMA

The Holt-Winters second order method attempts to incorporate the estimated trend into the smoothed data by using a b_{t} term that keeps track of the slope of the original signal. The smoothed signal is written to the s_{t} term.

(3)   \begin{align*}
        s_{0} &= x_{0} \\
        b_{0} &= 0 \\
        s_{t} &= \alpha x_{t} + ( 1 - \alpha )( s_{t-1} + b_{t-1} ) \\
        b_{t} &= \beta ( s_{t} - s_{t-1} ) + ( 1 - \beta ) b_{t-1}
      \end{align*}

And here is some Python code implementing the Holt-Winters second order method on another noisy hat function, as before.

import numpy as np
import matplotlib.pyplot as plt

def holt_winters_second_order_ewma(x, span, beta):
    N = x.size
    alpha = 2.0 / (1 + span)
    s = np.zeros(N)
    b = np.zeros(N)
    s[0] = x[0]
    for i in range(1, N):
        s[i] = alpha * x[i] + (1 - alpha) * (s[i-1] + b[i-1])
        b[i] = beta * (s[i] - s[i-1]) + (1 - beta) * b[i-1]
    return s

# make a hat function, and add noise (offset from zero this time)
x = np.linspace(0, 1, 100)
x = np.hstack((x, x[::-1]))
x += np.random.normal(loc=0, scale=0.1, size=200) + 3.0
plt.plot(x, alpha=0.4, label='Raw')

# Holt-Winters second order EWMA
plt.plot(holt_winters_second_order_ewma(x, span=10, beta=0.3), 'b', label='Holt-Winters')

plt.title('Holt-Winters')
plt.legend(loc='lower center')
plt.savefig('holt_winters.png', dpi=100)

11 thoughts on “Smoothing with Exponentially Weighted Moving Averages”

  1. Awesome. Well-written and well-documented. Loved the write up in general, but especially the differences between SMA and EWMA really helped me out.

    Cheers! Keep up the good work.

    1. Thanks, if you have any requests or anything, I like looking at new things.

  2. Very interesting write-up, thank you! How do you choose the beta parameter in the Holt-Winters algorithm?

    1. For this example, I experimented with a few terms for beta and picked the one I liked the best, but that’s an interesting question. I’ll look into developing some kind of rule of thumb based on the variance and autocorrelation or something.

    2. I’ve done Holt-Winters for forecasting in excel and there you can use solver to set the variables to minimize the Mean absolute error (MAE). I’m new to python but you could possibly create a loop that iterates through values to find the value with the smallest MAE.
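      The loop described above might look like the following sketch. It reuses the post’s holt_winters_second_order_ewma and assumes you have a noise-free reference signal to score against; the pick_beta helper and the candidate grid are illustrative, not from the post:

      ```python
      import numpy as np

      def holt_winters_second_order_ewma(x, span, beta):
          # as defined in the post above
          N = x.size
          alpha = 2.0 / (1 + span)
          s = np.zeros(N)
          b = np.zeros(N)
          s[0] = x[0]
          for i in range(1, N):
              s[i] = alpha * x[i] + (1 - alpha) * (s[i-1] + b[i-1])
              b[i] = beta * (s[i] - s[i-1]) + (1 - beta) * b[i-1]
          return s

      def pick_beta(x, truth, span, betas):
          """Return the beta whose smoothed output has the smallest
          mean absolute error (MAE) against the reference signal."""
          errors = [np.mean(np.abs(holt_winters_second_order_ewma(x, span, b) - truth))
                    for b in betas]
          return betas[int(np.argmin(errors))]

      # noisy hat function, as in the post
      truth = np.hstack((np.linspace(0, 1, 100), np.linspace(1, 0, 100)))
      rng = np.random.default_rng(0)
      x = truth + rng.normal(0, 0.1, truth.size)
      best = pick_beta(x, truth, span=10, betas=np.linspace(0.05, 0.95, 19))
      ```

      In practice you usually don’t have the noise-free truth, so people score against held-out observations instead, but the grid-search shape is the same.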

  3. To clarify, is this method essentially making the smoothing decay on both sides of the value, instead of just decaying backwards as in a traditional EWMA? In other words, is there a window surrounding each value that decays symmetrically away from the value? If we are smoothing using EWMA for a time series that is “live” (i.e. stock prices), this technique will not work for the most recent data point, correct?

    1. You’re correct; the Holt-Winters method is not great for signals with unstable mean and variance. If you would like to simply smooth something that is not “live”, you can do that by running the Holt-Winters method in both directions, but in “live” applications, such as predicting stock prices, that would not work so well.
