Daegu Haany University Hyangsan Library

Low-Rank RNN Adaptation for Context-Aware Language Modeling

Details
Material Type: Thesis (Dissertation)
Title/Author: Low-Rank RNN Adaptation for Context-Aware Language Modeling.
Personal Author: Jaech, Aaron.
Corporate Author: University of Washington. Electrical Engineering.
Publication: [S.l.]: University of Washington, 2018.
Publication: Ann Arbor: ProQuest Dissertations & Theses, 2018.
Physical Description: 124 p.
Source Record: Dissertation Abstracts International 79-12B(E).
ISBN: 9780438178175
Dissertation Note: Thesis (Ph.D.)--University of Washington, 2018.
General Note: Source: Dissertation Abstracts International, Volume: 79-12(E), Section: B.
Adviser: Mari Ostendorf.
Abstract: A long-standing weakness of statistical language models is that their performance drastically degrades if they are used on data that varies even slightly from the data on which they were trained. In practice, applications require the use of adaptation …
Abstract: The current standard approach to recurrent neural network language model adaptation is to apply a simple linear shift to the recurrent and/or output layer bias vector. Although this is helpful, it does not go far enough. This thesis introduces a …
Abstract: In our experiments on several different datasets and multiple types of context, the increased adaptation of the recurrent layer is always helpful, as measured by perplexity, the standard for evaluating language models. We also demonstrate impact …
Subject: Computer science.
Statistics.
Language: English
Link: URL: The full text of this material is provided by the Korea Education and Research Information Service (KERIS).
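
The abstract above contrasts the standard approach to RNN language model adaptation (a context-dependent shift of the bias vector) with a richer adaptation of the recurrent layer itself. As a rough illustration of that idea, and not the implementation described in the thesis, the sketch below adapts a vanilla RNN cell with both a context-dependent bias shift and a low-rank, context-dependent update to the recurrent weight matrix; all class names, dimensions, and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch (not the thesis code): a vanilla RNN cell whose recurrent
# weights and bias are modulated by a context embedding.
#
#   W_eff = W_hh + U @ diag(P c) @ V   (rank-r, context-dependent update)
#   b_eff = b + B c                    (the simpler bias-shift baseline)
#
# c is the context embedding; P, B, U, V are learned. Names/dimensions are assumptions.
import torch
import torch.nn as nn


class LowRankAdaptedRNNCell(nn.Module):
    def __init__(self, input_dim, hidden_dim, context_dim, rank):
        super().__init__()
        self.W_ih = nn.Linear(input_dim, hidden_dim, bias=False)
        self.W_hh = nn.Parameter(torch.randn(hidden_dim, hidden_dim) * 0.01)
        self.bias = nn.Parameter(torch.zeros(hidden_dim))

        # Baseline adaptation: context -> additive shift of the recurrent bias.
        self.bias_shift = nn.Linear(context_dim, hidden_dim, bias=False)

        # Low-rank adaptation: context -> diagonal scaling between two fixed
        # low-rank factors U (hidden_dim x rank) and V (rank x hidden_dim).
        self.U = nn.Parameter(torch.randn(hidden_dim, rank) * 0.01)
        self.V = nn.Parameter(torch.randn(rank, hidden_dim) * 0.01)
        self.context_to_rank = nn.Linear(context_dim, rank, bias=False)

    def forward(self, x, h, context):
        # Per-example low-rank update to the recurrent weight matrix.
        c = self.context_to_rank(context)                  # (batch, rank)
        delta = torch.einsum("hr,br,rk->bhk", self.U, c, self.V)
        W_eff = self.W_hh.unsqueeze(0) + delta             # (batch, hidden, hidden)

        # Context-dependent bias shift (the standard, weaker adaptation).
        b = self.bias + self.bias_shift(context)           # (batch, hidden)

        recurrent = torch.einsum("bhk,bk->bh", W_eff, h)   # W_eff @ h per example
        return torch.tanh(self.W_ih(x) + recurrent + b)


if __name__ == "__main__":
    cell = LowRankAdaptedRNNCell(input_dim=32, hidden_dim=64, context_dim=8, rank=4)
    x = torch.randn(5, 32)        # batch of 5 word embeddings
    h = torch.zeros(5, 64)        # previous hidden state
    ctx = torch.randn(5, 8)       # per-example context embeddings
    print(cell(x, h, ctx).shape)  # torch.Size([5, 64])
```

The rank controls how strongly the context can reshape the recurrent dynamics while keeping the number of added parameters small relative to the full hidden_dim x hidden_dim recurrent matrix, which is the general motivation for low-rank adaptation described in the abstract.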
