Divergence-Based Gaussian Mixture Learning
- Research topic/area
- Artificial Intelligence
- Type of thesis
- Bachelor / Master
- Start time
- -
- Application deadline
- 30.06.2028
- Duration of the thesis
- 4 months (BSc) / 6 months (MSc)
Description
Distances and divergences are crucial for understanding and working with Gaussian Mixture Models (GMMs). They quantify how similar or different two GMMs are, enabling tasks such as clustering, model comparison, and anomaly detection. Unlike simple metrics, divergences such as the Kullback-Leibler (KL) divergence or the Wasserstein distance capture the structure of probability distributions, accounting for differences in both means and covariances. These measures are essential for optimizing GMM parameters, evaluating convergence, and performing model selection. Accurate distance computations also support applications in signal processing, computer vision, and machine learning, where nuanced distinctions between data distributions are vital for performance and interpretability.
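To make the topic concrete: the KL divergence between two Gaussian mixtures has no closed form, so it is typically estimated by Monte Carlo sampling. The following is a minimal 1-D sketch of that idea (the mixture parameters are illustrative, not taken from the project):

```python
# Monte Carlo estimate of KL(p || q) between two 1-D Gaussian mixtures.
# A minimal sketch; all parameter values below are made up for illustration.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)

def gmm_logpdf(x, weights, means, stds):
    """Log-density of a 1-D Gaussian mixture evaluated at the points x."""
    x = np.asarray(x, dtype=float)[:, None]                # shape (n, 1)
    log_comp = (np.log(weights)
                - 0.5 * np.log(2 * np.pi) - np.log(stds)
                - 0.5 * ((x - means) / stds) ** 2)         # shape (n, k)
    return logsumexp(log_comp, axis=1)                     # stable log-sum

def gmm_sample(n, weights, means, stds):
    """Draw n samples: pick a component, then sample that Gaussian."""
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comp], stds[comp])

def mc_kl(p, q, n=100_000):
    """KL(p || q) ~ E_p[log p(x) - log q(x)], estimated by sampling from p,
    since the KL divergence between mixtures has no closed form."""
    x = gmm_sample(n, *p)
    return np.mean(gmm_logpdf(x, *p) - gmm_logpdf(x, *q))

# Each mixture is (weights, means, stds); values are arbitrary examples.
p = (np.array([0.5, 0.5]), np.array([-2.0, 2.0]), np.array([1.0, 1.0]))
q = (np.array([0.7, 0.3]), np.array([-1.5, 2.5]), np.array([1.2, 0.8]))
print(mc_kl(p, q))  # nonnegative; near zero only if the mixtures nearly coincide
```

A thesis in this area could, for example, compare such sampling-based estimators against closed-form bounds or against Wasserstein-type distances between mixtures.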
- Requirements for students
-
- There are no hard constraints, but the more programming and mathematics you know, the more fun you can have with the project.
- Faculty departments
-
- Engineering sciences
Informatics
Supervision
- Title, first name, last name
- Ali Darijani
- Organizational unit
- Computer Science (IAR/IES)
- Email address
- ali.darijani@iosb.fraunhofer.de
- Link to personal homepage/personal page
- Website
Application via email
- Application documents
-
Email address for application
Please send the application documents listed above by email to ali.darijani@iosb.fraunhofer.de