2.5 Bias Correction in Exponentially Weighted Averages

In a previous video, you saw this figure for beta = 0.9, and this figure for beta = 0.98. But it turns out that if you implement the formula as written here, you won't actually get the green curve when, say, beta = 0.98. You actually get the purple curve, and you notice that the purple curve starts off really low.

When β = 0.98:

$$
\begin{aligned}
v_0 &= 0 \\
v_1 &= 0.98\,v_0 + 0.02\,\theta_1 = 0.02\,\theta_1 \\
v_2 &= 0.98\,v_1 + 0.02\,\theta_2 = 0.98 \times 0.02\,\theta_1 + 0.02\,\theta_2 = 0.0196\,\theta_1 + 0.02\,\theta_2
\end{aligned}
$$

Notice that at the start, the exponentially weighted average underestimates the temperature. To correct this bias, let:

$$v_t := \frac{v_t}{1 - \beta^t}$$

So it turns out that there is a way to modify this estimate that makes it much better, that makes it more accurate, especially during this initial phase of your estimate. Which is that, instead of taking Vt, take Vt divided by 1 - Beta to the power of t, where t is the current day you're on.

So let's take a concrete example. When t = 2, 1 - beta to the power of t is 1 - 0.98 squared, and it turns out that this is 0.0396.

$$1 - \beta^2 = 1 - 0.98^2 = 0.0396$$

$$v_2 := \frac{v_2}{1 - 0.98^2} = \frac{0.0196\,\theta_1 + 0.02\,\theta_2}{0.0396}$$

And so your estimate of the temperature on day 2 becomes v2 divided by 0.0396. You notice that these two coefficients, 0.0196 and 0.02, add up to the denominator 0.0396. And so this becomes a weighted average of theta 1 and theta 2, and this removes the bias.
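As a quick numerical sanity check of the t = 2 arithmetic (the temperatures theta1 and theta2 below are made-up illustrative values, not from the lecture):

```python
# Verify the day-2 bias-corrected estimate for beta = 0.98.
# theta1 and theta2 are arbitrary example temperatures.
beta = 0.98
theta1, theta2 = 40.0, 42.0

v1 = (1 - beta) * theta1              # 0.02 * theta1
v2 = beta * v1 + (1 - beta) * theta2  # 0.0196*theta1 + 0.02*theta2
corrected = v2 / (1 - beta ** 2)      # divide by 0.0396

# The two coefficients sum to the denominator, so the corrected
# estimate is a genuine weighted average of theta1 and theta2.
assert abs((0.0196 + 0.02) - (1 - beta ** 2)) < 1e-12
assert min(theta1, theta2) <= corrected <= max(theta1, theta2)
print(corrected)
```

Because the weights sum to 1, the corrected value always lies between the two observed temperatures, unlike the raw v2, which is far too small.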

So you notice that as t becomes large, beta to the t will approach 0, which is why when t is large enough, the bias correction makes almost no difference. This is why when t is large, the purple line and the green line pretty much overlap. But during this initial phase of learning, when you're still warming up your estimates, the bias correction can help you obtain a better estimate of the temperature. And it is this bias correction that helps you go from the purple line to the green line.
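The decay of the correction can be seen directly by printing the factor 1/(1 − β^t) for a few values of t (β = 0.98 as in the example; the chosen t values are arbitrary):

```python
# The correction factor 1/(1 - beta^t) starts large and decays
# toward 1, which is why bias correction only matters early on.
beta = 0.98
factors = {t: 1 / (1 - beta ** t) for t in (1, 2, 10, 100, 500)}
print(factors)  # at t = 1 the factor is exactly 1/0.02 = 50
```

For large t the factor is essentially 1, so dividing by it leaves the estimate unchanged, which is why the purple and green curves overlap later on.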

So in machine learning, for most implementations of the exponentially weighted average, people don't often bother to implement bias correction, because most people would rather just wait out that initial period and have a slightly more biased estimate, and go from there. But if you are concerned about the bias during this initial phase, while your exponentially weighted moving average is still warming up, then bias correction can help you get a better estimate early on. So you now know how to implement exponentially weighted moving averages. Let's go on and use this to build some better optimization algorithms.
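Putting the pieces together, here is a minimal sketch of the exponentially weighted average with optional bias correction (the function and variable names are my own, not from the lecture):

```python
def ewma(values, beta=0.98, bias_correction=True):
    """Exponentially weighted moving average of a sequence.

    With bias_correction=True, each running average v_t is divided
    by (1 - beta**t), removing the startup bias (the purple curve).
    """
    averages = []
    v = 0.0
    for t, theta in enumerate(values, start=1):
        v = beta * v + (1 - beta) * theta          # v_t = beta*v_{t-1} + (1-beta)*theta_t
        averages.append(v / (1 - beta ** t) if bias_correction else v)
    return averages

# On constant data the corrected average is correct from day 1,
# while the uncorrected one starts far too low and warms up slowly.
temps = [10.0] * 5
print(ewma(temps)[0])                          # close to 10.0
print(ewma(temps, bias_correction=False)[0])   # close to 0.2
```

The same divide-by-(1 − β^t) step is what optimizers such as Adam apply to their moment estimates, which is where this idea reappears in the upcoming videos.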