Documentation (English)

Gaussian Mixture Model

Probabilistic soft clustering with per-component Gaussian distributions

A Gaussian Mixture Model (GMM) models the data as a mixture of Gaussian distributions. Unlike K-Means, it produces soft assignments: for each point, the probability that it belongs to each component.

When to use:

  • When cluster membership probabilities (uncertainty) are needed
  • Ellipsoidal clusters of varying size and orientation
  • Density estimation alongside clustering

Input: Tabular data with the feature columns defined during training

Output: Most likely cluster label and component probabilities per row
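As a minimal sketch of the input/output contract above, assuming a scikit-learn-style `GaussianMixture` (the platform's actual API may differ): `predict` returns the most likely cluster label per row, while `predict_proba` returns the soft assignment probabilities.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Toy tabular data: two well-separated 2-D blobs of 100 rows each
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 2)),
               rng.normal(6.0, 1.0, size=(100, 2))])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

labels = gmm.predict(X)       # most likely cluster label per row
proba = gmm.predict_proba(X)  # one membership probability per component
```

Each row of `proba` sums to 1, which is the "uncertainty" output that K-Means cannot provide.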

Model Settings (set during training, used at inference)

N Components (default: 1) Number of mixture components (clusters). Tune during training using BIC or AIC.
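The BIC-based tuning mentioned above can be sketched as follows (again assuming scikit-learn's `GaussianMixture`; `bic` could be swapped for `aic`): fit one candidate model per component count and keep the count with the lowest score.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Toy data with two clearly separated clusters
X = np.vstack([rng.normal(0.0, 1.0, size=(150, 2)),
               rng.normal(7.0, 1.0, size=(150, 2))])

# Fit a candidate model per component count; lower BIC is better
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 6)}
best_k = min(bics, key=bics.get)
```

BIC penalizes parameter count more heavily than AIC, so it tends to prefer fewer components on small datasets.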

Covariance Type (default: full) Shape of cluster covariance:

  • full — each component has its own full covariance matrix (most flexible)
  • tied — all components share one covariance matrix
  • diag — diagonal covariance per component
  • spherical — single variance per component
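The four covariance types trade flexibility against parameter count, which the shape of the fitted covariance array makes concrete. A sketch assuming scikit-learn's `covariances_` attribute, with 2 components on 3 features:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # 200 rows, 3 features

# The shape of covariances_ reflects how much structure each type can model
shapes = {}
for cov in ("full", "tied", "diag", "spherical"):
    gm = GaussianMixture(n_components=2, covariance_type=cov,
                         random_state=0).fit(X)
    shapes[cov] = np.asarray(gm.covariances_).shape
```

`full` stores one 3×3 matrix per component, `tied` a single shared 3×3 matrix, `diag` one variance per feature per component, and `spherical` a single variance per component.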

Tol (default: 0.001) EM convergence threshold: training stops once the improvement of the log-likelihood lower bound falls below this value.

Max Iter (default: 100) Maximum EM iterations.
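How Tol and Max Iter interact can be sketched with scikit-learn's fitted-model attributes (`converged_`, `n_iter_`; the platform may surface these differently or not at all): EM stops at whichever comes first, convergence under `tol` or the `max_iter` cap.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(150, 2)),
               rng.normal(7.0, 1.0, size=(150, 2))])

# EM runs until the lower-bound gain drops below tol, or max_iter is hit
gm = GaussianMixture(n_components=2, tol=1e-3, max_iter=100,
                     random_state=0).fit(X)
```

If a fit reports non-convergence, raising Max Iter or loosening Tol are the usual remedies.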

Inference Settings

No dedicated inference-time settings. The trained Gaussian parameters determine soft cluster assignments.

