Material type | Thesis/dissertation |
---|---|
Title/Author | Low-Rank RNN Adaptation for Context-Aware Language Modeling. |
Personal author | Jaech, Aaron. |
Corporate author | University of Washington. Electrical Engineering. |
Publication | [S.l.]: University of Washington, 2018. |
Publication | Ann Arbor: ProQuest Dissertations & Theses, 2018. |
Physical description | 124 p. |
Source record | Dissertation Abstracts International 79-12B(E). |
ISBN | 9780438178175 |
Dissertation note | Thesis (Ph.D.)--University of Washington, 2018. |
General note | Source: Dissertation Abstracts International, Volume: 79-12(E), Section: B. Adviser: Mari Ostendorf. |
Abstract | A long-standing weakness of statistical language models is that their performance drastically degrades if they are used on data that varies even slightly from the data on which they were trained. In practice, applications require the use of adaptation … |
Abstract | The current standard approach to recurrent neural network language model adaptation is to apply a simple linear shift to the recurrent and/or output layer bias vector. Although this is helpful, it does not go far enough. This thesis introduces a … |
Abstract | In our experiments on several different datasets and multiple types of context, the increased adaptation of the recurrent layer is always helpful, as measured by perplexity, the standard metric for evaluating language models. We also demonstrate impact … |
Subject | Computer science. Statistics. |
Language | English |
Link | The full text of this material is provided by KERIS (Korea Education and Research Information Service). |
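The abstract contrasts the standard adaptation method (a context-dependent shift of a bias vector) with adapting the recurrent weight matrix itself through a low-rank, context-dependent update. A minimal NumPy sketch of that general idea follows; it is a generic illustration under assumed shapes, not the thesis's exact parameterization, and all variable names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 8, 3, 2   # hidden size, context-embedding size, adaptation rank

W = rng.normal(size=(d, d))          # shared base recurrent weight matrix
A = rng.normal(size=(k, d * r))      # maps context embedding -> left factor
B = rng.normal(size=(k, r * d))      # maps context embedding -> right factor

def adapted_weights(c):
    """Context-dependent, rank-r update of the recurrent weights:
    W' = W + L @ R, where L (d x r) and R (r x d) are linear in c."""
    L = (c @ A).reshape(d, r)
    R = (c @ B).reshape(r, d)
    return W + L @ R

c = rng.normal(size=(k,))            # a context embedding (e.g. topic, user)
W_adapted = adapted_weights(c)

# The update touches the full d x d matrix but its rank is at most r,
# so the per-context cost is O(d * r) parameters generated from c.
print(np.linalg.matrix_rank(W_adapted - W))
```

Compared with shifting only a bias vector (d extra values per context), this lets the context reshape the recurrent transformation itself while keeping the number of context-generated parameters small via the rank constraint.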