Conference paper

BENEŠ Karel, KESIRAJU Santosh and BURGET Lukáš. i-Vectors in Language Modeling: An Efficient Way of Domain Adaptation for Feed-Forward Models. In: Proceedings of Interspeech 2018. Hyderabad: International Speech Communication Association, 2018, pp. 3383-3387. ISSN 1990-9770. Available from: https://www.isca-speech.org/archive/Interspeech_2018/abstracts/1070.html
Publication language: English
Original title: i-Vectors in Language Modeling: An Efficient Way of Domain Adaptation for Feed-Forward Models
Title (cs): i-vektory pro jazykové modelování: efektivní způsob doménové adaptace s dopřednými modely
Pages: 3383-3387
Proceedings: Proceedings of Interspeech 2018
Conference: Interspeech 2018
Place: Hyderabad, IN
Year: 2018
URL: https://www.isca-speech.org/archive/Interspeech_2018/abstracts/1070.html
Journal: Proceedings of Interspeech - on line, Vol. 2018, No. 9, BAIXAS, FR
ISSN: 1990-9770
DOI: 10.21437/Interspeech.2018-1070
Publisher: International Speech Communication Association
URL: http://www.fit.vutbr.cz/research/groups/speech/publi/2018/benes_interspeech2018_1070.pdf [PDF]
Keywords
language modeling, feed-forward models, subspace multinomial model, domain adaptation
Annotation
We show an effective way of adding context information to shallow neural language models. We propose to use a Subspace Multinomial Model (SMM) for context modeling and add the extracted i-vectors in a computationally efficient way. By adding this information, we shrink the perplexity gap between a shallow feed-forward network and an LSTM on the Wikitext-2 corpus from 65 to 31 points (for a neural 5-gram model). Furthermore, we show that SMM i-vectors are suitable for domain adaptation: even a very small amount of adaptation data (e.g. the final 5% of a Wikipedia article) brings a substantial improvement. Our proposed changes are compatible with most optimization techniques used for shallow feed-forward LMs.
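To illustrate the mechanism the annotation describes, here is a minimal sketch (not the authors' code) of a feed-forward n-gram LM whose hidden layer sees the (n-1) word embeddings concatenated with a fixed-size context i-vector. All dimensions and the random i-vector are toy assumptions; in the paper the i-vector would be extracted from an SMM fitted to the document.

```python
import numpy as np

# Hedged sketch: toy feed-forward 5-gram LM with an i-vector appended to
# the concatenated history embeddings. Sizes are illustrative only.
rng = np.random.default_rng(0)

V, E, D, H = 50, 8, 4, 16        # vocab, embedding dim, i-vector dim, hidden dim
n = 5                            # 5-gram model: 4 history words

emb = rng.normal(size=(V, E))                 # word embedding table
W_h = rng.normal(size=((n - 1) * E + D, H))   # hidden weights (history + i-vector)
W_o = rng.normal(size=(H, V))                 # output projection

def predict_next(history_ids, ivector):
    """Return a probability distribution over the next word."""
    # Concatenate flattened history embeddings with the context i-vector.
    x = np.concatenate([emb[history_ids].ravel(), ivector])
    h = np.tanh(x @ W_h)
    logits = h @ W_o
    p = np.exp(logits - logits.max())         # stable softmax
    return p / p.sum()

# A document-level context vector (random here; an SMM i-vector in the paper).
iv = rng.normal(size=D)
probs = predict_next(np.array([3, 17, 5, 9]), iv)
print(probs.shape)
```

Because the i-vector only enlarges the input layer, the extra cost is one additional D-by-H block of the hidden weight matrix, which is why the annotation calls the addition computationally efficient.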
BibTeX:
@INPROCEEDINGS{Benes2018Interspeech,
   author = {Karel Bene{\v{s}} and Santosh Kesiraju and Luk{\'{a}}{\v{s}}
	Burget},
   title = {i-Vectors in Language Modeling: An Efficient Way of Domain
	Adaptation for Feed-Forward Models},
   pages = {3383--3387},
   booktitle = {Proceedings of Interspeech 2018},
   journal = {Proceedings of Interspeech - on line},
   volume = {2018},
   number = {9},
   year = {2018},
   location = {Hyderabad, IN},
   publisher = {International Speech Communication Association},
   ISSN = {1990-9770},
   doi = {10.21437/Interspeech.2018-1070},
   language = {english},
   url = {http://www.fit.vutbr.cz/research/view_pub.php?id=11842}
}
