Arkaplan Veri Süresinin Konuşmacı Doğrulama Performansına Etkisi (The Effect of Background Data Duration on Speaker Verification Performance)
Gaussian mixture models with a universal background model (GMM-UBM) and vector quantization with a universal background model (VQ-UBM) are two well-known classifiers used for speaker verification. Generally, the UBM is trained on many hours of speech from a large pool of different speakers. In this study, we analyze the effect of the duration of the data used to train the UBM on text-independent speaker verification performance using the GMM-UBM and VQ-UBM modeling techniques. Experiments carried out on the NIST 2002 speaker recognition evaluation (SRE) corpus show that the duration of the background data used to train the UBM has a small impact on recognition performance for both GMM-UBM and VQ-UBM classifiers.
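The abstract describes the standard GMM-UBM pipeline: a background model is trained on speech pooled from many speakers, a target speaker model is derived from it by adaptation, and a trial is scored by comparing a test utterance against both models. The sketch below is a minimal illustration of that idea under stated assumptions, not the paper's actual setup: it assumes scikit-learn's GaussianMixture as the GMM implementation, uses synthetic vectors in place of real speech features, applies mean-only MAP adaptation with an assumed relevance factor of 16, and does not reproduce the NIST 2002 protocol or the VQ-UBM variant.

```python
# Minimal GMM-UBM sketch: train a UBM, derive a speaker model by mean-only MAP
# adaptation, and score trials with an average log-likelihood ratio.
# Illustrative only: synthetic vectors stand in for real speech features, and the
# mixture size / relevance factor below are assumptions, not the paper's settings.
import copy
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
DIM, N_MIX, RELEVANCE = 12, 32, 16.0        # feature dim, UBM size, MAP relevance factor

# 1) Train the UBM on pooled "background" data from many speakers. The question the
#    paper studies (how much background data is needed?) amounts to how many feature
#    vectors go into this fit.
background = rng.normal(size=(20000, DIM))
ubm = GaussianMixture(n_components=N_MIX, covariance_type="diag",
                      max_iter=50, random_state=0).fit(background)

def map_adapt_means(ubm, feats, relevance=RELEVANCE):
    """Mean-only MAP adaptation of the UBM toward a speaker's enrollment features."""
    post = ubm.predict_proba(feats)               # frame-level component posteriors
    n_k = post.sum(axis=0) + 1e-10                # soft occupation counts per component
    e_k = (post.T @ feats) / n_k[:, None]         # first-order statistics
    alpha = (n_k / (n_k + relevance))[:, None]    # data-dependent adaptation coefficients
    speaker = copy.deepcopy(ubm)                  # keep UBM weights/covariances, shift means
    speaker.means_ = alpha * e_k + (1.0 - alpha) * ubm.means_
    return speaker

def llr_score(speaker, ubm, feats):
    """Average per-frame log-likelihood ratio: speaker model vs. UBM."""
    return float(np.mean(speaker.score_samples(feats) - ubm.score_samples(feats)))

# 2) Enroll a synthetic target speaker and score one target and one impostor trial.
speaker_model = map_adapt_means(ubm, rng.normal(loc=0.5, size=(3000, DIM)))
print("target trial  :", llr_score(speaker_model, ubm, rng.normal(loc=0.5, size=(1000, DIM))))
print("impostor trial:", llr_score(speaker_model, ubm, rng.normal(loc=0.0, size=(1000, DIM))))
```

A real system would threshold this score to accept or reject a trial; the VQ-UBM classifier mentioned in the abstract follows the same background-then-speaker-model structure but uses codebooks in place of Gaussian mixtures.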
Main Authors: | Cemal HANİLÇİ, Figen ERTAŞ |
---|---|
Format: | Article |
Language: | English |
Published: | Bursa Uludag University, 2013-04-01 |
Series: | Uludağ University Journal of The Faculty of Engineering |
Subjects: | Speaker verification; Gaussian mixture model; Vector Quantization; Universal background model |
Online Access: | http://mmfdergi.uludag.edu.tr/article/view/5000082467 |
_version_ | 1797917928583069696 |
---|---|
author | Cemal HANİLÇİ; Figen ERTAŞ |
author_facet | Cemal HANİLÇİ; Figen ERTAŞ |
author_sort | Cemal HANİLÇİ |
collection | DOAJ |
description | Gaussian mixture models with a universal background model (GMM-UBM) and vector quantization with a universal background model (VQ-UBM) are two well-known classifiers used for speaker verification. Generally, the UBM is trained on many hours of speech from a large pool of different speakers. In this study, we analyze the effect of the duration of the data used to train the UBM on text-independent speaker verification performance using the GMM-UBM and VQ-UBM modeling techniques. Experiments carried out on the NIST 2002 speaker recognition evaluation (SRE) corpus show that the duration of the background data used to train the UBM has a small impact on recognition performance for both GMM-UBM and VQ-UBM classifiers. |
first_indexed | 2024-04-10T13:21:22Z |
format | Article |
id | doaj.art-68578b92c0cb4b14bd13ceb7156aa38c |
institution | Directory Open Access Journal |
issn | 2148-4147; 2148-4155 |
language | English |
last_indexed | 2024-04-10T13:21:22Z |
publishDate | 2013-04-01 |
publisher | Bursa Uludag University |
record_format | Article |
series | Uludağ University Journal of The Faculty of Engineering |
spelling | doaj.art-68578b92c0cb4b14bd13ceb7156aa38c (indexed 2023-02-15T16:12:02Z); Uludağ University Journal of The Faculty of Engineering, vol. 18, no. 1, pp. 111-119, 2013-04-01; DOI: 10.17482/uujfe.97355; internal id 5000077171 |
title | Arkaplan Veri Süresinin Konuşmacı Doğrulama Performansına Etkisi |
title_sort | arkaplan veri suresinin konusmaci dogrulama performansina etkisi |
topic | Speaker verification, Gaussian mixture model, Vector Quantization, Universal background model |
url | http://mmfdergi.uludag.edu.tr/article/view/5000082467 |
work_keys_str_mv | AT cemalhanilci arkaplanverisuresininkonusmacıdogrulamaperformansınaetkisi AT figenertas arkaplanverisuresininkonusmacıdogrulamaperformansınaetkisi |