Introduction
In this page, I give a brief explanation of the Gaussian mixture model and the EM algorithm. (I also show how to use the implementation of the EM algorithm that OpenCV provides.)

Gaussian Mixture
Suppose the probability density of an observed data point $x$ is given by a superposition of $K$ Gaussian densities,

$$p(x) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k),$$

where $\pi_k$ is the mixing coefficient of the $k$-th component and $\mathcal{N}(x \mid \mu_k, \Sigma_k)$ denotes a Gaussian density with mean $\mu_k$ and covariance matrix $\Sigma_k$.
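As a concrete illustration (not part of the original derivation), the mixture density can be evaluated numerically. The following Python sketch assumes NumPy and SciPy are available; the parameter values are made up.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gmm_density(x, pis, mus, covs):
    """Evaluate p(x) = sum_k pi_k * N(x | mu_k, Sigma_k)."""
    return sum(pi * multivariate_normal.pdf(x, mean=mu, cov=cov)
               for pi, mu, cov in zip(pis, mus, covs))

# Example: a two-component mixture in 2-D (illustrative numbers only).
pis  = [0.4, 0.6]
mus  = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
covs = [np.eye(2), 2.0 * np.eye(2)]
print(gmm_density(np.array([1.0, 1.0]), pis, mus, covs))
```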
When we have a data set consisting of $N$ observations $X = \{x_1, \dots, x_N\}$ drawn independently from this distribution, the likelihood of the data set is

$$p(X) = \prod_{n=1}^{N} p(x_n).$$

By substituting the Gaussian mixture representation into the equation, we obtain

$$p(X \mid \pi, \mu, \Sigma) = \prod_{n=1}^{N} \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x_n \mid \mu_k, \Sigma_k).$$

An elegant and powerful method to determine the unknown parameters $\pi_k$, $\mu_k$, and $\Sigma_k$ on the right-hand side is the EM (Expectation-Maximization) algorithm.
EM Algorithm
To make the notation simple, we introduce the following variables:

$$\pi = \{\pi_1, \dots, \pi_K\}, \quad \mu = \{\mu_1, \dots, \mu_K\}, \quad \Sigma = \{\Sigma_1, \dots, \Sigma_K\}.$$

Then, all we have to do is to maximize the likelihood $p(X \mid \pi, \mu, \Sigma)$ with respect to these parameters. For simplicity, we maximize the log of the likelihood function, given by

$$\ln p(X \mid \pi, \mu, \Sigma) = \sum_{n=1}^{N} \ln \left\{ \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x_n \mid \mu_k, \Sigma_k) \right\}.$$
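As a practical aside (not from the original text), the log likelihood is usually evaluated with the log-sum-exp trick to avoid numerical underflow. A minimal NumPy/SciPy sketch:

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.special import logsumexp

def gmm_log_likelihood(X, pis, mus, covs):
    """ln p(X) = sum_n ln sum_k pi_k N(x_n | mu_k, Sigma_k), computed stably."""
    # log_p[n, k] = ln pi_k + ln N(x_n | mu_k, Sigma_k)
    log_p = np.column_stack([
        np.log(pi) + multivariate_normal.logpdf(X, mean=mu, cov=cov)
        for pi, mu, cov in zip(pis, mus, covs)
    ])
    return logsumexp(log_p, axis=1).sum()
```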
Before going any further, we introduce a constraint on the mixing coefficients $\pi_k$. The probability density $p(x)$ must be normalized:

$$\int p(x) \, dx = 1.$$

If we integrate both sides of the mixture equation with respect to $x$, we obtain

$$\sum_{k=1}^{K} \pi_k = 1.$$

It should be noticed that the individual Gaussian components are normalized, $\int \mathcal{N}(x \mid \mu_k, \Sigma_k) \, dx = 1$, which is why the integral reduces to the sum of the mixing coefficients. Using a Lagrange multiplier $\lambda$, the quantity to maximize, taking the constraint into account, is written as

$$\ln p(X \mid \pi, \mu, \Sigma) + \lambda \left( \sum_{k=1}^{K} \pi_k - 1 \right).$$
The rest of our work is setting the derivatives of this quantity with respect to each of the parameters to zero. This yields the three results below (a NumPy sketch of the corresponding updates follows the list).
- The derivative with respect to $\mu_k$ yields the following equation:
$$\mu_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) \, x_n,$$
where
$$\gamma(z_{nk}) = \frac{\pi_k \, \mathcal{N}(x_n \mid \mu_k, \Sigma_k)}{\sum_{j=1}^{K} \pi_j \, \mathcal{N}(x_n \mid \mu_j, \Sigma_j)}, \qquad N_k = \sum_{n=1}^{N} \gamma(z_{nk}).$$
- The derivative with respect to $\Sigma_k$ yields the following equation:
$$\Sigma_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) \, (x_n - \mu_k)(x_n - \mu_k)^T,$$
where $T$ indicates transposition.
- The derivative with respect to $\pi_k$ yields the following equation:
$$\pi_k = \frac{N_k}{N}.$$
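In code, the quantity $\gamma(z_{nk})$ becomes the E step and the three closed-form results become the M step. The following NumPy sketch is my own illustration rather than code from the original post; the names e_step, m_step, and resp are assumptions for this example.

```python
import numpy as np
from scipy.stats import multivariate_normal

def e_step(X, pis, mus, covs):
    """E step: responsibilities gamma(z_nk), returned with shape (N, K)."""
    # Unnormalized terms: pi_k * N(x_n | mu_k, Sigma_k)
    resp = np.column_stack([
        pi * multivariate_normal.pdf(X, mean=mu, cov=cov)
        for pi, mu, cov in zip(pis, mus, covs)
    ])
    return resp / resp.sum(axis=1, keepdims=True)

def m_step(X, resp):
    """M step: re-estimate pi_k, mu_k, Sigma_k from the responsibilities."""
    N, K = resp.shape
    Nk = resp.sum(axis=0)                 # N_k = sum_n gamma(z_nk)
    pis = Nk / N                          # pi_k = N_k / N
    mus = (resp.T @ X) / Nk[:, None]      # mu_k = (1/N_k) sum_n gamma(z_nk) x_n
    covs = []
    for k in range(K):
        d = X - mus[k]                    # (x_n - mu_k), one row per sample
        covs.append((resp[:, k, None] * d).T @ d / Nk[k])
    return pis, mus, covs
```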
The EM algorithm therefore proceeds as follows (a complete driver loop is sketched after this list):
- Set initial values to $\mu_k$, $\Sigma_k$, and $\pi_k$.
- Calculate the responsibilities $\gamma(z_{nk})$ using the current parameter values (E step).
- Using $\gamma(z_{nk})$, calculate the three parameters $\mu_k$, $\Sigma_k$, and $\pi_k$ (M step).
- Calculate the log likelihood $\ln p(X \mid \pi, \mu, \Sigma)$.
- Until the parameters or the log likelihood converge, repeat the loop from step 2 to step 4.
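Putting the steps together, a minimal driver might look like the following. This is only a sketch under my own assumptions: it reuses the e_step and m_step functions from the previous snippet, the initialization and the tolerance are arbitrary choices, and the loop stops when the log likelihood stops improving.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.special import logsumexp

def fit_gmm(X, K, n_iter=200, tol=1e-6, seed=0):
    """Run EM until the log likelihood converges (uses e_step/m_step above)."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    # Step 1: initial values (random data points as means, shared covariance).
    mus = X[rng.choice(N, K, replace=False)]
    covs = [np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)]
    pis = np.full(K, 1.0 / K)
    prev_ll = -np.inf
    for _ in range(n_iter):
        resp = e_step(X, pis, mus, covs)            # Step 2: E step
        pis, mus, covs = m_step(X, resp)            # Step 3: M step
        # Step 4: log likelihood, evaluated with log-sum-exp for stability.
        log_p = np.column_stack([
            np.log(pi) + multivariate_normal.logpdf(X, mean=mu, cov=cov)
            for pi, mu, cov in zip(pis, mus, covs)
        ])
        ll = logsumexp(log_p, axis=1).sum()
        if ll - prev_ll < tol:                      # Step 5: convergence check
            break
        prev_ll = ll
    return pis, mus, covs, ll
```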
The weight (mixing coefficient) of each component is given by

$$\pi_k = \frac{N_k}{N},$$

where

$$N_k = \sum_{n=1}^{N} \gamma(z_{nk}).$$
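The introduction mentions OpenCV's EM implementation. Below is a minimal usage sketch under my assumption of the cv2.ml.EM API shipped with OpenCV 3.x/4.x (an older post may have targeted the legacy CvEM interface instead); the data and the cluster count are made up for illustration.

```python
import numpy as np
import cv2

# Synthetic 2-D data from two clusters (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)),
               rng.normal(4.0, 1.5, (200, 2))]).astype(np.float32)

em = cv2.ml.EM_create()
em.setClustersNumber(2)
em.setCovarianceMatrixType(cv2.ml.EM_COV_MAT_GENERIC)
em.setTermCriteria((cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_COUNT, 100, 1e-6))

# trainEM runs the EM loop on the samples; it returns per-sample log
# likelihoods, hard cluster labels, and the posterior probabilities
# (the responsibilities gamma(z_nk)).
retval, log_likelihoods, labels, probs = em.trainEM(X)

print("weights (pi_k):", em.getWeights().ravel())
print("means   (mu_k):", em.getMeans())
```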