Hyangsan Library, Daegu Haany University

Learning Transferable Knowledge Through Embedding Spaces

Detailed Information
Material Type: Thesis/Dissertation
Title/Author: Learning Transferable Knowledge Through Embedding Spaces.
Personal Author: Rostami, Mohammad.
Corporate Author: University of Pennsylvania. Electrical and Systems Engineering.
Publication: [S.l.]: University of Pennsylvania, 2019.
Publication: Ann Arbor: ProQuest Dissertations & Theses, 2019.
Physical Description: 257 p.
Source Record: Dissertations Abstracts International 81-05B.
Dissertation Abstracts International
ISBN: 9781088366875
Dissertation Note: Thesis (Ph.D.)--University of Pennsylvania, 2019.
General Note: Source: Dissertations Abstracts International, Volume: 81-05, Section: B.
Advisor: Eaton, Eric R.
Restrictions on Use: This item is not available from ProQuest Dissertations & Theses. This item must not be sold to any third party vendors.
Abstract: The unprecedented processing demand posed by the explosion of big data challenges researchers to design efficient and adaptive machine learning algorithms that do not require persistent retraining and avoid learning redundant information. Inspired by the learning techniques of intelligent biological agents, identifying transferable knowledge across learning problems has been a significant research focus for improving machine learning algorithms. In this thesis, we address the challenges of knowledge transfer through embedding spaces that capture and store hierarchical knowledge.

In the first part of the thesis, we focus on the problem of cross-domain knowledge transfer. We first address zero-shot image classification, where the goal is to identify images from unseen classes using semantic descriptions of these classes. We train two coupled dictionaries that align the visual and semantic domains via an intermediate embedding space. We then extend this idea by training deep networks that match the data distributions of two visual domains in a shared cross-domain embedding space. Our approach addresses both semi-supervised and unsupervised domain adaptation settings.

In the second part of the thesis, we investigate the problem of cross-task knowledge transfer. Here, the goal is to identify the relations and similarities among multiple machine learning tasks to improve performance across the tasks. We first address the problem of zero-shot learning in a lifelong machine learning setting, where the goal is to learn tasks with no data using high-level task descriptions. Our idea is to relate high-level task descriptors to the optimal task parameters through an embedding space. We then develop a method to overcome catastrophic forgetting in the continual learning setting of deep neural networks by enforcing the tasks to share the same distribution in the embedding space. We further demonstrate that our model can address the challenges of domain adaptation in the continual learning setting.

Finally, in the third part of the thesis, we consider the problem of cross-agent knowledge transfer. We demonstrate that multiple lifelong machine learning agents can collaborate to increase individual performance by sharing learned knowledge through a shared embedding space, without sharing private data.

We demonstrate that, despite major differences, the problems in the above learning scenarios can all be tackled by learning an intermediate embedding space that allows knowledge to be transferred effectively.
Subject: Computer science.
Electrical engineering.
Language: English
Link: URL: The full text of this item is provided by the Korea Education and Research Information Service (KERIS).
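
The methods summarized in the abstract above share one mechanism: data from different domains, tasks, or agents are mapped into a shared embedding space, and a distribution-matching penalty forces their embeddings to align so that knowledge learned in one setting transfers to another. The following is a minimal PyTorch sketch of that idea, not the author's implementation: the encoder and classifier architectures, the toy source/target data, the weight lambda_match, and the use of a sliced Wasserstein loss as the matching term are all illustrative assumptions (the thesis may use different objectives).

import torch
import torch.nn as nn


class Encoder(nn.Module):
    """Maps raw features into a shared embedding space."""
    def __init__(self, in_dim: int, emb_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, emb_dim))

    def forward(self, x):
        return self.net(x)


def sliced_wasserstein(a: torch.Tensor, b: torch.Tensor, n_proj: int = 50) -> torch.Tensor:
    """Approximate the distance between two empirical embedding distributions by
    comparing sorted 1-D projections along random directions (equal batch sizes assumed)."""
    dirs = torch.randn(a.size(1), n_proj, device=a.device)
    dirs = dirs / dirs.norm(dim=0, keepdim=True)
    proj_a = torch.sort(a @ dirs, dim=0).values
    proj_b = torch.sort(b @ dirs, dim=0).values
    return ((proj_a - proj_b) ** 2).mean()


# Toy stand-ins for a labeled source domain and an unlabeled, shifted target domain.
src_x, src_y = torch.randn(256, 20), torch.randint(0, 3, (256,))
tgt_x = torch.randn(256, 20) + 1.0

encoder, classifier = Encoder(20), nn.Linear(32, 3)
opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3)
lambda_match = 0.5  # illustrative weight for the distribution-matching term

for step in range(200):
    z_src, z_tgt = encoder(src_x), encoder(tgt_x)
    loss = nn.functional.cross_entropy(classifier(z_src), src_y)   # supervised loss on source labels
    loss = loss + lambda_match * sliced_wasserstein(z_src, z_tgt)  # align the two domains in the embedding
    opt.zero_grad()
    loss.backward()
    opt.step()

The same pattern, with different encoders and matching objectives, would underlie the cross-domain, cross-task, and cross-agent settings the abstract describes; in the continual learning case, for example, the second distribution would come from stored or generated samples of earlier tasks rather than from a second domain.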
