Conference paper

PLCHOT Oldřich, MATĚJKA Pavel, SILNOVA Anna, NOVOTNÝ Ondřej, DIEZ Sánchez Mireia, ROHDIN Johan A., GLEMBEK Ondřej, BRÜMMER Niko, SWART Albert du Preez, PRIETO Jesús J., GARCIA Perera Leibny Paola, BUERA Luis, KENNY Patrick, ALAM Jahangir and BHATTACHARYA Gautam. Analysis and Description of ABC Submission to NIST SRE 2016. In: Proceedings of Interspeech 2017. Stockholm: International Speech Communication Association, 2017, pp. 1348-1352. ISSN 1990-9772. Available from: http://www.isca-speech.org/archive/Interspeech_2017/pdfs/1498.PDF
Publication language: English
Original title: Analysis and Description of ABC Submission to NIST SRE 2016
Title (cs): Analýza a popis ABC systému pro NIST SRE 2016
Pages: 1348-1352
Proceedings: Proceedings of Interspeech 2017
Conference: Interspeech 2017
Place: Stockholm, SE
Year: 2017
URL: http://www.isca-speech.org/archive/Interspeech_2017/pdfs/1498.PDF
Journal: Proceedings of Interspeech, Vol. 2017, No. 08, FR
ISSN: 1990-9772
DOI: 10.21437/Interspeech.2017-1498
Publisher: International Speech Communication Association
URL: http://www.fit.vutbr.cz/research/groups/speech/publi/2017/plchot_interspeech2017_IS171498.pdf [PDF]
Keywords
speaker recognition, i-vector, DNN, fusion
Annotation
This article provides an analysis and description of the ABC submission to NIST SRE 2016. We present the various systems of the ABC team that are designed to cope with dataset mismatch and non-English data. We present and compare several fusion and calibration strategies, and we uncover and discuss the problems brought by SRE16.
Abstract
We present a condensed description and analysis of the joint submission for NIST SRE 2016 by Agnitio, BUT and CRIM (ABC). We concentrate on challenges that arose during development and we analyze the results obtained on the evaluation data and on our development sets. We show that testing on the mismatched, non-English and short-duration data introduced in NIST SRE 2016 is a difficult problem for current state-of-the-art systems. Testing on this data brought back the issue of score normalization and it also revealed that the bottleneck features (BN), which are superior when used for telephone English, are lacking in performance against standard acoustic features like Mel Frequency Cepstral Coefficients (MFCCs). We offer ABC's insights, findings and suggestions for building a robust system suitable for mismatched, non-English and relatively noisy data such as those in NIST SRE 2016.
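
The abstract notes that the mismatched, non-English SRE16 data "brought back the issue of score normalization". As a purely illustrative sketch (not the authors' actual recipe, which is described in the paper), the snippet below shows adaptive symmetric score normalization (adaptive s-norm), a standard technique in this family; the function name, cohort handling, and the top_n value are assumptions made for the example.

```python
# Illustrative sketch of adaptive s-norm; names and cohort size are assumptions,
# not taken from the paper.
import numpy as np

def adaptive_snorm(raw_score, enroll_cohort_scores, test_cohort_scores, top_n=200):
    """Normalize a raw trial score using the top-N most competitive cohort scores
    computed against the enrollment and test sides of the trial."""
    e = np.sort(enroll_cohort_scores)[-top_n:]   # top-N cohort scores vs. enrollment
    t = np.sort(test_cohort_scores)[-top_n:]     # top-N cohort scores vs. test segment
    z = (raw_score - e.mean()) / e.std()         # z-norm term (enrollment side)
    s = (raw_score - t.mean()) / t.std()         # t-norm term (test side)
    return 0.5 * (z + s)                         # symmetric average of both terms
```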
BibTeX:
@INPROCEEDINGS{Plchot2017_Interspeech,
   author = {Old{\v{r}}ich Plchot and Pavel Mat{\v{e}}jka and
        Anna Silnova and Ond{\v{r}}ej Novotn{\'{y}} and
        Mireia Diez S{\'{a}}nchez and Johan A. Rohdin and
        Ond{\v{r}}ej Glembek and Niko Br{\"{u}}mmer and
        Albert du Preez Swart and Jes{\'{u}}s J. Prieto and
        Leibny Paola Garcia Perera and Luis Buera and
        Patrick Kenny and Jahangir Alam and Gautam Bhattacharya},
   title = {Analysis and Description of ABC Submission to NIST SRE 2016},
   pages = {1348--1352},
   booktitle = {Proceedings of Interspeech 2017},
   journal = {Proceedings of Interspeech},
   volume = {2017},
   number = {08},
   year = {2017},
   location = {Stockholm, SE},
   publisher = {International Speech Communication Association},
   ISSN = {1990-9772},
   doi = {10.21437/Interspeech.2017-1498},
   language = {english},
   url = {http://www.fit.vutbr.cz/research/view_pub.php.en?id=11581}
}
