diff --git a/posts/2014-07-Understanding-Convolutions/index.html b/posts/2014-07-Understanding-Convolutions/index.html
index b00cf6e..f468aeb 100644
--- a/posts/2014-07-Understanding-Convolutions/index.html
+++ b/posts/2014-07-Understanding-Convolutions/index.html
@@ -99,7 +99,7 @@

Understanding Convolutions

If we just wanted to understand convolutional neural networks, it might suffice to roughly understand convolutions. But the aim of this series is to bring us to the frontier of convolutional neural networks and explore new options. To do that, we’re going to need to understand convolutions very deeply.

Thankfully, with a few examples, convolution becomes quite a straightforward idea.

Lessons from a Dropped Ball

-

Imagine we drop a ball from some height onto the ground, where it only has one dimension of motion. How likely is it that a ball will go a distance \(c\) if you drop it and then drop it again from above the point at which it landed?

+

Imagine we drop a ball from some height onto the ground. The ball starts at an \(x\) coordinate \(x_0\). When it hits the ground, it rolls and comes to rest at position \(x_1\). Then we drop it again from above \(x_1\), and it ends up at position \(x_2\). How likely is it that the ball travels a total distance \(c\), i.e. that \(x_2 - x_0 = c\)?
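To make the question concrete, here is a small Monte Carlo sketch. The normal distributions, their parameters, and the value of \(c\) are assumptions chosen purely for illustration; the argument that follows does not depend on them.

```python
# A minimal simulation sketch of the two-drop experiment, assuming (only for
# concreteness) that each drop-and-roll distance is normally distributed.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

a = rng.normal(loc=0.0, scale=1.0, size=n)   # roll distance of the first drop
b = rng.normal(loc=0.0, scale=1.5, size=n)   # roll distance of the second drop
total = a + b                                # x2 - x0 for each trial

# Estimate the probability that the total distance lands near a given c.
c, eps = 1.0, 0.05
print(np.mean(np.abs(total - c) < eps))      # ~ P(c - eps < x2 - x0 < c + eps)
```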

Let’s break this down. After the first drop, it will land \(a\) units away from the starting point with probability \(f(a)\), where \(f\) is the probability distribution.

Now after this first drop, we pick the ball up and drop it from another height above the point where it first landed. The probability of the ball rolling \(b\) units away from the new starting point is \(g(b)\), where \(g\) may be a different probability distribution if it’s dropped from a different height.
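Putting these two pieces together as a quick sketch: if we pretend the roll distances are discrete and pick some hypothetical \(f\) and \(g\) (the numbers below are made up; only the summing pattern matters), the probability of a total distance \(c\) is obtained by summing \(f(a) \cdot g(b)\) over every pair with \(a + b = c\).

```python
# A minimal sketch with discrete distances and made-up distributions f and g.
# The probability of a total distance c sums over every way the two rolls can
# add up to c: sum over a of f(a) * g(c - a).
f = {-1: 0.25, 0: 0.50, 1: 0.25}   # hypothetical distribution for the first roll
g = {-1: 0.10, 0: 0.30, 1: 0.60}   # hypothetical distribution for the second roll

def prob_total(c):
    return sum(p_a * g.get(c - a, 0.0) for a, p_a in f.items())

print(prob_total(1))                               # probability that x2 - x0 = 1
print(sum(prob_total(c) for c in range(-2, 3)))    # sanity check: totals 1.0
```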