[1] M. A. T. Figueiredo and A. K. Jain, "Unsupervised learning of finite mixture models," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 3, pp. 381-396, March 2002.
Install this Python package:

```bash
pip install gmm-mml
```
This implementation is a port of the original authors' MATLAB code, with small modifications, and it is built as a sklearn wrapper. The dependencies are:

- numpy
- scipy
- sklearn
- matplotlib (optional)
The following points were generated using three bivariate Gaussian distributions. The clustering algorithm correctly converges to those distributions:

```python
from gmm_mml import GmmMml

unsupervised = GmmMml(plots=True)
unsupervised.fit(X)
```
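For a self-contained run, the data matrix `X` above can be synthesized with NumPy. The means, covariances, and sample counts below are illustrative assumptions, not the values used for the original figure:

```python
import numpy as np
from gmm_mml import GmmMml

# Illustrative parameters for three bivariate Gaussians (assumed values)
rng = np.random.RandomState(0)
X = np.vstack([
    rng.multivariate_normal(mean=[0, 0], cov=[[1.0, 0.3], [0.3, 1.0]], size=300),
    rng.multivariate_normal(mean=[5, 5], cov=[[1.0, -0.2], [-0.2, 0.5]], size=300),
    rng.multivariate_normal(mean=[0, 6], cov=[[0.5, 0.0], [0.0, 1.5]], size=300),
])

# Fit the mixture; plots=False keeps the run headless
unsupervised = GmmMml(plots=False)
unsupervised.fit(X)
```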
It is also possible to visualize this process with `GmmMml(plots=True, live_2d_plot=False)`:
Available sklearn methods:
- `.fit()` - fit the finite mixture model
- `.fit_transform()` - fit and return the inputs' posterior probabilities
- `.transform()` - return the inputs' posterior probabilities
- `.predict()` - return the inputs' cluster assignments
- `.predict_proba()` - same as `.transform()`
- `.sample()` - sample new data from the fitted mixture model
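As a quick sketch of how these methods fit together, assuming `unsupervised` was fitted on `X` as above (the return shapes and the exact `.sample()` signature are assumptions based on sklearn conventions):

```python
labels = unsupervised.predict(X)                 # hard cluster assignment per sample
posteriors = unsupervised.transform(X)           # posterior probability of each component per sample
same_posteriors = unsupervised.predict_proba(X)  # alias for .transform(), per the list above
new_points = unsupervised.sample()               # draw new data from the fitted mixture
                                                 # (may accept an n_samples argument, sklearn-style)

print(labels[:5], posteriors.shape)
```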
The folders ./example_scipts and ./tutorials contain examples of how to use the code.
Jupyter notebooks: 2d_Example, 1d_Example
TODO:

- Refactoring
- Docs
- Support other covariance types (right now only 'full' is supported, i.e., each component has its own general covariance matrix)