Hellinger divergence in information theoretic novelty detection
In this work, the novelty detection framework proposed by M. Filippone and G. Sanguinetti is considered; it is particularly useful when only few training samples are available. The framework is restricted to Gaussian mixture models and builds on information theory, applying the Kullback-Leibler divergence. Two variations of the framework are presented, one applying the symmetric Hellinger divergence and one based on a statistical likelihood approach.
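For orientation, the two divergences named in the abstract are commonly defined as follows for probability densities $p$ and $q$; these are the standard textbook definitions, not formulas taken from the thesis itself:

```latex
% Kullback-Leibler divergence (asymmetric) between densities p and q
D_{\mathrm{KL}}(p \,\|\, q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,\mathrm{d}x

% Squared Hellinger distance (symmetric, bounded in [0,1])
H^2(p, q) = \frac{1}{2}\int \left(\sqrt{p(x)} - \sqrt{q(x)}\right)^2 \mathrm{d}x
```

Unlike the Kullback-Leibler divergence, the Hellinger distance is symmetric in its arguments and bounded, which motivates its use as an alternative in the framework described above.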
| Author: | Paul Stürmer |
|---|---|
| URN: | urn:nbn:de:bsz:mit1-opus-46323 |
| Document Type: | Master's Thesis |
| Language: | English |
| Date of Publication (online): | 2014/11/24 |
| Publishing Institution: | Hochschule Mittweida |
| Release Date: | 2014/11/24 |
| GND Keyword: | Wahrscheinlichkeitsverteilung (probability distribution) |
| Institutes: | 03 Mathematics / Natural Sciences / Computer Science |
| DDC classes: | 510 Mathematics |
| Open Access: | Within the university |