Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models

Details

Resource 1 (download): BIB_513DF4E21898.P001.pdf (604.21 KB)
State: Public
Version: author
Serval ID
serval:BIB_513DF4E21898
Type
Inproceedings: an article in a conference proceedings.
Collection
Publications
Title
Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models
Title of the conference
ICASSP 2012, IEEE International Conference on Acoustics, Speech and Signal Processing
Author(s)
Durrieu J.L., Thiran J.P., Kelly F.
Address
Kyoto, Japan, March 25-30, 2012
ISBN
978-1-4673-0046-9
Publication state
Published
Issued date
2012
Volume
2012
Series
IEEE International Conference on Acoustics, Speech and Signal Processing
Pages
4833-4836
Language
English
Abstract
Many speech technology systems rely on Gaussian Mixture Models (GMMs). The need for a comparison between two GMMs arises in applications such as speaker verification, model selection or parameter estimation. For this purpose, the Kullback-Leibler (KL) divergence is often used. However, since there is no closed-form expression to compute it, it can only be approximated. We propose lower and upper bounds for the KL divergence, which lead to a new approximation and interesting insights into previously proposed approximations. An application to the comparison of speaker models also shows how such approximations can be used to validate assumptions on the models.
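
As the abstract notes, the KL divergence KL(f || g) between two GMMs f and g has no closed-form expression. A common baseline, distinct from the bounds proposed in the paper, is the Monte Carlo estimate KL(f || g) ≈ (1/N) Σ_i [log f(x_i) - log g(x_i)] with samples x_i drawn from f. The sketch below illustrates that baseline in Python with NumPy/SciPy; all GMM parameters and function names are illustrative assumptions, not taken from the paper.

import numpy as np
from scipy.stats import multivariate_normal

def sample_gmm(weights, means, covs, n, rng):
    # Draw n samples: pick a component per sample, then sample that Gaussian.
    comps = rng.choice(len(weights), size=n, p=weights)
    return np.array([rng.multivariate_normal(means[k], covs[k]) for k in comps])

def gmm_logpdf(x, weights, means, covs):
    # Log-density of the mixture at each row of x, via log-sum-exp over components.
    log_terms = np.stack([
        np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
        for w, m, c in zip(weights, means, covs)
    ])
    shift = log_terms.max(axis=0)
    return shift + np.log(np.exp(log_terms - shift).sum(axis=0))

def kl_gmm_monte_carlo(f, g, n=50_000, seed=0):
    # Monte Carlo estimate of KL(f || g): average of log f(x) - log g(x), x ~ f.
    rng = np.random.default_rng(seed)
    x = sample_gmm(*f, n, rng)
    return float(np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g)))

if __name__ == "__main__":
    # Two toy 2-D GMMs with made-up parameters: (weights, means, covariances).
    f = ([0.6, 0.4],
         [np.zeros(2), np.array([3.0, 0.0])],
         [np.eye(2), 0.5 * np.eye(2)])
    g = ([0.5, 0.5],
         [np.array([0.5, 0.0]), np.array([2.5, 0.5])],
         [np.eye(2), np.eye(2)])
    print("Monte Carlo KL(f || g) ~", round(kl_gmm_monte_carlo(f, g), 4))

The estimate is consistent but its sampling variance is one motivation for the kind of deterministic bounds and approximations the paper studies.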
Keywords
Gaussian Mixture Model (GMM), Kullback-Leibler, Divergence, speaker comparison, speech processing
Create date
06/01/2014 22:07
Last modification date
20/08/2019 15:06