MARC View
LDR 00000nam u2200205 4500
001 000000435345
005 20200228093219
008 200131s2019 ||||||||||||||||| ||eng d
020 ▼a 9781085783392
035 ▼a (MiAaPQ)AAI13880665
040 ▼a MiAaPQ ▼c MiAaPQ ▼d 247004
082 0 ▼a 001
100 1 ▼a Song, Linfeng.
245 10 ▼a Tackling Graphical NLP Problems with Graph Recurrent Networks.
260 ▼a [S.l.]: ▼b University of Rochester, ▼c 2019.
260 1 ▼a Ann Arbor: ▼b ProQuest Dissertations & Theses, ▼c 2019.
300 ▼a 149 p.
500 ▼a Source: Dissertations Abstracts International, Volume: 81-04, Section: B.
500 ▼a Advisor: Gildea, Daniel.
502 1 ▼a Thesis (Ph.D.)--University of Rochester, 2019.
506 ▼a This item must not be sold to any third party vendors.
520 ▼a How to properly model graphs is a long-standing and important problem in natural language processing, where several popular types of graphs are knowledge graphs, semantic graphs, and dependency graphs. Compared with other data structures, such as sequences and trees, graphs are generally more powerful in representing complex correlations among entities. For example, a knowledge graph stores real-world entities (such as "Barack_Obama" and "U.S.") and their relations (such as "live_in" and "lead_by"). Properly encoding a knowledge graph benefits user applications, such as question answering and knowledge discovery. Modeling graphs is also very challenging, largely because graphs usually contain massive and cyclic relations. For instance, a tree with n nodes has n - 1 edges (relations), while a complete graph with n nodes can have O(n²) edges (relations). Recent years have witnessed the success of deep learning, especially RNN-based models, on many NLP problems, including machine translation (Cho et al., 2014) and question answering (Shen et al., 2017). Besides, RNNs and their variations have been extensively studied on several graph problems and have shown preliminary successes. Despite these successes, RNN-based models suffer from several major drawbacks. First, they can only consume sequential data, so linearization is required to serialize input graphs, resulting in the loss of important structural information. In particular, graph nodes that are originally close can end up very far apart after linearization, which makes it hard for RNNs to model their relation. Second, the serialization results are usually very long, so it takes a long time for RNNs to encode them. In this thesis, we propose a novel graph neural network, named the graph recurrent network (GRN). GRN maintains a hidden state for each graph node, and it relies on an iterative message-passing framework to update these hidden states in parallel. Within each iteration, neighboring nodes exchange information with each other, so that they absorb more global knowledge. Unlike RNNs, which require absolute orders (such as a left-to-right order) for execution, our GRN only requires relative neighboring information, making it very general and flexible for a variety of data structures. We study our GRN model on four very different tasks, including machine reading comprehension, relation extraction, and machine translation. Some tasks (such as machine translation) require generating sequences, while others require only one decision (classification). Some take undirected graphs without edge labels, while others take directed graphs with edge labels. To accommodate these important differences, we gradually enhance our GRN model, for example by further considering edge labels and adding an RNN decoder. Carefully designed experiments show the effectiveness of GRN on all these tasks.
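The iterative message-passing update described in the abstract can be illustrated in a few lines. Below is a minimal, hypothetical sketch, not the thesis implementation (the actual GRN uses gated, LSTM-style state updates; here a plain tanh cell and invented names such as grn_sketch, W_self, and W_neigh stand in for illustration): every node keeps a hidden state, and at each step all nodes aggregate their neighbors' states and update in parallel.

    # Minimal sketch of GRN-style message passing (illustrative only; not the
    # thesis implementation, which uses gated state updates). Every node keeps
    # a hidden state; at each step, all nodes aggregate their neighbors'
    # states and update in parallel, so information travels one hop per step.
    import numpy as np

    def grn_sketch(adj, features, num_steps=4, hidden_dim=16, seed=0):
        """adj: (n, n) 0/1 adjacency matrix; features: (n, d) node inputs."""
        rng = np.random.default_rng(seed)
        _, d = features.shape
        # Hypothetical parameters: one projection for a node's own state and
        # one for the aggregated neighbor message.
        W_in = rng.normal(scale=0.1, size=(d, hidden_dim))
        W_self = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
        W_neigh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))

        h = np.tanh(features @ W_in)      # initial hidden states, shape (n, hidden_dim)
        for _ in range(num_steps):        # one round of message passing per step
            msg = adj @ h                 # sum of each node's neighboring states
            h = np.tanh(h @ W_self + msg @ W_neigh)  # parallel update of all nodes
        return h

    # Toy graph: a 4-node cycle with random input features.
    adj = np.array([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=float)
    feats = np.random.default_rng(1).normal(size=(4, 8))
    print(grn_sketch(adj, feats).shape)   # -> (4, 16)

After num_steps iterations, each node's state reflects its num_steps-hop neighborhood, which is the "absorbing more global knowledge" behavior the abstract describes; unlike an RNN over a linearization, no node ordering is needed, only relative neighboring information.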
590 ▼a School code: 0188.
650 4 ▼a Computer science.
650 4 ▼a Artificial intelligence.
690 ▼a 0984
690 ▼a 0800
710 20 ▼a University of Rochester. ▼b Hajim School of Engineering and Applied Sciences.
773 0 ▼t Dissertations Abstracts International ▼g 81-04B.
773 ▼t Dissertations Abstracts International
790 ▼a 0188
791 ▼a Ph.D.
792 ▼a 2019
793 ▼a English
856 40 ▼u http://www.riss.kr/pdu/ddodLink.do?id=T15491161 ▼n KERIS ▼z The full text of this material is provided by the Korea Education and Research Information Service (KERIS).
980 ▼a 202002 ▼f 2020
990 ▼a ***1816162
991 ▼a E-BOOK