LDR00000nam u2200205 4500
001 000000432460
005 20200224130944
008200131s2019 ||||||||||||||||| ||eng d
020 ▼a 9781088301753
035 ▼a (MiAaPQ)AAI13895340
040 ▼a MiAaPQ ▼c MiAaPQ ▼d 247004
0820 ▼a 400
1001 ▼a Foster, James Michael.
24510 ▼a Integrating Analogical Similarity and Error-driven Learning for Relational Concept Acquisition.
260 ▼a [S.l.]: ▼b University of Colorado at Boulder, ▼c 2019.
260 1 ▼a Ann Arbor: ▼b ProQuest Dissertations & Theses, ▼c 2019.
300 ▼a 140 p.
500 ▼a Source: Dissertations Abstracts International, Volume: 81-04, Section: A.
500 ▼a Advisor: Jones, Matt.
5021 ▼a Thesis (Ph.D.)--University of Colorado at Boulder, 2019.
506 ▼a This item must not be sold to any third party vendors.
520 ▼a How can people and machines learn new relational concepts, such as the concept of a 'fork' in chess, or the grammar of sentences? The approach taken in this work is to develop a theoretical and computational framework for how such concepts are learned and applied. The framework integrates established principles of cognition (analogy and error-driven learning) and explores their computational power and empirical validity. The first chapter of this thesis presents computational models and simulation results in the domain of two-player adversarial games. These models demonstrate how a synthesis of analogy and reinforcement learning (RL) provides a framework for constructing abstract relational concepts and evaluating their usefulness. The second chapter describes experiments with human participants that qualitatively test the model predictions. These experiments demonstrate how reward feedback and frequency affect the reinforcement of relational concepts. The third chapter explores how the framework can be extended to model grammar learning in language processing. This model extension demonstrates the viability of tensor-based representations (holographic reduced representations, HRRs) as the interface between an analogical meaning system and a recurrent neural network (RNN) sequencing system in the domain of sentence production. Together, these models offer solutions to the three-decade-old challenge of unifying the flexibility and fluidity of deep learning with the expressive power of compositional representations.
590 ▼a School code: 0051.
650 4 ▼a Artificial intelligence.
650 4 ▼a Cognitive psychology.
650 4 ▼a Language.
690 ▼a 0800
690 ▼a 0633
690 ▼a 0679
71020 ▼a University of Colorado at Boulder. ▼b Psychology.
7730 ▼t Dissertations Abstracts International ▼g 81-04A.
790 ▼a 0051
791 ▼a Ph.D.
792 ▼a 2019
793 ▼a English
85640 ▼u http://www.riss.kr/pdu/ddodLink.do?id=T15491594 ▼n KERIS ▼z The full text of this material is provided by the Korea Education and Research Information Service (KERIS).
980 ▼a 202002 ▼f 2020
990 ▼a ***1008102
991 ▼a E-BOOK