MARC View
LDR 02118nam u2200421 4500
001 000000422422
005 20190215170120
008 181129s2018 |||||||||||||||||c||eng d
020 ▼a 9780438178175
035 ▼a (MiAaPQ)AAI10828817
035 ▼a (MiAaPQ)washington:18951
040 ▼a MiAaPQ ▼c MiAaPQ ▼d 247004
0820 ▼a 004
1001 ▼a Jaech, Aaron.
24510 ▼a Low-Rank RNN Adaptation for Context-Aware Language Modeling.
260 ▼a [S.l.] : ▼b University of Washington, ▼c 2018.
260 1 ▼a Ann Arbor : ▼b ProQuest Dissertations & Theses, ▼c 2018.
300 ▼a 124 p.
500 ▼a Source: Dissertation Abstracts International, Volume: 79-12(E), Section: B.
500 ▼a Adviser: Mari Ostendorf.
5021 ▼a Thesis (Ph.D.)--University of Washington, 2018.
520 ▼a A long-standing weakness of statistical language models is that their performance drastically degrades if they are used on data that varies even slightly from the data on which they were trained. In practice, applications require the use of adaptation.
520 ▼a The current standard approach to recurrent neural network language model adaptation is to apply a simple linear shift to the recurrent and/or output layer bias vector. Although this is helpful, it does not go far enough. This thesis introduces a more expressive approach that uses the context to generate a low-rank transformation of the recurrent layer weights.
520 ▼a In our experiments on several different datasets and multiple types of context, the increased adaptation of the recurrent layer is always helpful, as measured by perplexity, the standard metric for evaluating language models. We also demonstrate impact on downstream applications.
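The abstract's contrast between bias-only adaptation and low-rank adaptation of the recurrent weights can be illustrated with a minimal NumPy sketch. This is a reconstruction in the spirit of the thesis title, not its exact parameterization; all names and dimensions (Z_L, Z_R, V_bias, step, rank) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
h_dim, x_dim, c_dim, rank = 8, 5, 3, 2

# Base recurrent weights, shared across all contexts.
W = rng.normal(size=(h_dim, h_dim + x_dim)) * 0.1
b = np.zeros(h_dim)

# Standard approach: the context vector only shifts the bias.
V_bias = rng.normal(size=(h_dim, c_dim)) * 0.1

# Low-rank approach: the context generates a rank-`rank` additive
# update to the full recurrent weight matrix via tensors Z_L, Z_R.
Z_L = rng.normal(size=(c_dim, h_dim, rank)) * 0.1
Z_R = rng.normal(size=(c_dim, rank, h_dim + x_dim)) * 0.1

def step(h, x, c, low_rank=True):
    """One vanilla-RNN step with context-adapted weights."""
    W_c = W
    if low_rank:
        L = np.tensordot(c, Z_L, axes=1)   # (h_dim, rank)
        R = np.tensordot(c, Z_R, axes=1)   # (rank, h_dim + x_dim)
        W_c = W + L @ R                    # context-dependent weight matrix
    b_c = b + V_bias @ c                   # bias shift (the standard approach)
    return np.tanh(W_c @ np.concatenate([h, x]) + b_c)

h = np.zeros(h_dim)
x = rng.normal(size=x_dim)          # current input embedding
c = rng.normal(size=c_dim)          # context embedding (topic, speaker, ...)
print(step(h, x, c))
```

The rank-r update lets the context reshape the full recurrent dynamics at a parameter cost of roughly c_dim * rank * (2 * h_dim + x_dim), versus only c_dim * h_dim for the bias shift alone.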
590 ▼a School code: 0250.
650 4 ▼a Computer science.
650 4 ▼a Statistics.
690 ▼a 0984
690 ▼a 0463
71020 ▼a University of Washington. ▼b Electrical Engineering.
7730 ▼t Dissertation Abstracts International ▼g 79-12B(E).
790 ▼a 0250
791 ▼a Ph.D.
792 ▼a 2018
793 ▼a English
85640 ▼u http://www.riss.kr/pdu/ddodLink.do?id=T14999220 ▼n KERIS ▼z The full text of this material is provided by the Korea Education and Research Information Service (KERIS).
980 ▼a 201812 ▼f 2019
990 ▼a ***1012033