Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models

Details

Resource 1 (download): BIB_513DF4E21898.P001.pdf (604.21 KB)
State: Public
Version: author's version
Serval ID
serval:BIB_513DF4E21898
Type
Conference proceedings (part): original contribution to the scientific literature, published on the occasion of a scientific conference, in a proceedings volume or in a special issue of a recognized journal (conference proceedings).
Collection
Publications
Title
Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models
Conference title
ICASSP 2012, IEEE International Conference On Acoustics, Speech And Signal Processing
Authors
Durrieu J.L., Thiran J.P., Kelly Finnian P.
Address
Kyoto, Japan, March 25-30, 2012
ISBN
978-1-4673-0046-9
Editorial status
Published
Publication date
2012
Volume
2012
Series
IEEE International Conference On Acoustics, Speech And Signal Processing
Pages
4833-4836
Language
English
Abstract
Many speech technology systems rely on Gaussian Mixture Models (GMMs). The need for a comparison between two GMMs arises in applications such as speaker verification, model selection or parameter estimation. For this purpose, the Kullback-Leibler (KL) divergence is often used. However, since there is no closed-form expression to compute it, it can only be approximated. We propose lower and upper bounds for the KL divergence, which lead to a new approximation and interesting insights into previously proposed approximations. An application to the comparison of speaker models also shows how such approximations can be used to validate assumptions on the models.
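As a point of reference (not taken from the paper itself), the quantity being bounded is the KL divergence D_KL(f || g) between two GMM densities f and g, which has no closed form and is typically estimated by Monte Carlo sampling from f. The sketch below illustrates that baseline estimator; all parameter values and function names are hypothetical, and the paper's specific lower and upper bounds are not reproduced here.

```python
# Illustrative sketch: Monte Carlo estimate of D_KL(f || g) between two GMMs.
# This is the sampling-based baseline against which closed-form approximations
# and bounds (such as those proposed in the paper) are usually compared.
import numpy as np
from scipy.stats import multivariate_normal


def gmm_logpdf(x, weights, means, covs):
    """Log-density of a GMM evaluated at the rows of x (shape (n, d))."""
    comp = np.stack([
        np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
        for w, m, c in zip(weights, means, covs)
    ])                              # shape (K, n): per-component log weights * densities
    m = comp.max(axis=0)            # log-sum-exp over components for stability
    return m + np.log(np.exp(comp - m).sum(axis=0))


def kl_monte_carlo(f, g, n_samples=50_000, rng=None):
    """Monte Carlo estimate of KL(f || g); f and g are (weights, means, covs) tuples."""
    rng = np.random.default_rng(rng)
    weights, means, covs = f
    # Draw samples from f: pick component counts, then sample each Gaussian component.
    idx = rng.choice(len(weights), size=n_samples, p=weights)
    counts = np.bincount(idx, minlength=len(weights))
    x = np.vstack([
        rng.multivariate_normal(means[k], covs[k], size=counts[k])
        for k in range(len(weights))
    ])
    # KL(f || g) = E_f[log f(x) - log g(x)], estimated by the sample mean.
    return np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g))


if __name__ == "__main__":
    # Two toy 2-D GMMs with hypothetical parameters, for illustration only.
    f = ([0.6, 0.4], [np.zeros(2), 2.0 * np.ones(2)], [np.eye(2), 0.5 * np.eye(2)])
    g = ([0.5, 0.5], [np.ones(2), -np.ones(2)], [np.eye(2), np.eye(2)])
    print("MC estimate of KL(f || g):", kl_monte_carlo(f, g, rng=0))
```

The Monte Carlo estimate is unbiased but requires many samples and gives no guarantee on either side of the true value; deterministic approximations and bounds of the kind studied in the paper trade that sampling cost for a closed-form computation.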
Keywords
Gaussian Mixture Model (GMM), Kullback-Leibler, Divergence, speaker comparison, speech processing
Record created
06/01/2014 22:07
Record last modified
20/08/2019 15:06