Mathematical Institute of the Serbian Academy of Sciences and Arts
Title: Measure of Similarity between GMMs by Embedding of the Parameter Space That Preserves KL Divergence
Journal: Mathematics
Volume: 9
Issue: 9
First page: Art. no. 957
Issue Date: 1-May-2021
Rank: ~M21a
ISSN: 2227-7390
DOI: 10.3390/math9090957
Abstract:
In this work, we deliver a novel measure of similarity between Gaussian mixture models (GMMs) based on neighborhood preserving embedding (NPE) of the parameter space, which projects the components of the GMMs, assumed to lie close to a lower-dimensional manifold. In this way, we obtain a transformation from the original high-dimensional parameter space into a much lower-dimensional one. Computing the distance between two GMMs is thus reduced to computing (with the corresponding mixture weights taken into account) the distance between two sets of lower-dimensional Euclidean vectors. A much better trade-off between recognition accuracy and computational complexity is achieved in comparison to measures that evaluate distances between Gaussian components in the original parameter space. The proposed measure is therefore far more efficient in machine learning tasks operating on large data sets, where the required overall number of Gaussian components is always large. Experiments on both artificial and real-world data show a much better trade-off between recognition accuracy and computational complexity for the proposed measure than for all baseline measures of similarity between GMMs tested in this paper.
Keywords: Dimensionality reduction; Gaussian mixture models; KL-divergence; Similarity measures
Publisher: MDPI
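The core computational idea of the abstract, comparing two GMMs via weighted distances between their embedded components, can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes the components have already been mapped to low-dimensional Euclidean vectors by some KL-preserving embedding (here taken as given), and it aggregates with a simple symmetric nearest-neighbour matching, whereas the paper's exact aggregation scheme may differ.

```python
import numpy as np

def gmm_embedding_distance(weights_a, emb_a, weights_b, emb_b):
    """Illustrative distance between two GMMs whose components have already
    been embedded into a low-dimensional Euclidean space.

    weights_a, weights_b : 1-D arrays of mixture weights (sum to 1)
    emb_a, emb_b         : 2-D arrays, one embedded component per row
    """
    # Pairwise Euclidean distances between embedded components of the two mixtures
    d = np.linalg.norm(emb_a[:, None, :] - emb_b[None, :, :], axis=-1)
    # Each component contributes its distance to the closest component of the
    # other mixture, weighted by its own mixture weight; symmetrize the result.
    forward = np.sum(weights_a * d.min(axis=1))
    backward = np.sum(weights_b * d.min(axis=0))
    return 0.5 * (forward + backward)
```

Because the heavy lifting (the embedding) is done once per component, each GMM-to-GMM comparison costs only a small pairwise-distance computation in the reduced space, which is the source of the efficiency gain claimed in the abstract.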