Academic Journal

S2‐Net: Machine reading comprehension with SRU‐based self‐matching networks.

Bibliographic Details
Title: S2‐Net: Machine reading comprehension with SRU‐based self‐matching networks.
Authors: Park, Cheoneum, Lee, Changki, Hong, Lynn, Hwang, Yigyu, Yoo, Taejoon, Jang, Jaeyong, Hong, Yunki, Bae, Kyung‐Hoon, Kim, Hyun‐Ki
Source: ETRI Journal; Jun 2019, Vol. 41, Issue 3, p371-382, 12p
Subject Terms: READING comprehension, RECURRENT neural networks, COMPREHENSION testing, MACHINING, SHORT-term memory, MOSQUITO nets
Abstract: Machine reading comprehension is the task of understanding a given context and finding the correct answer within that context. A simple recurrent unit (SRU) is a model that solves the vanishing gradient problem of a recurrent neural network (RNN) with neural gates, as a gated recurrent unit (GRU) and long short‐term memory (LSTM) do; in addition, it removes the previous hidden state from the gate computations, which makes it faster than GRU and LSTM. A self‐matching network, as used in R‐Net, can have an effect similar to coreference resolution because it gathers context information of similar meaning by computing attention weights over its own RNN sequence. In this paper, we construct a dataset for Korean machine reading comprehension and propose an S2‐Net model that adds a self‐matching layer to a multilayer SRU encoder. The experimental results show that the proposed S2‐Net model achieves 68.82% EM and 81.25% F1 with a single model and 70.81% EM and 82.48% F1 with an ensemble on the Korean machine reading comprehension test dataset, and 71.30% EM and 80.37% F1 (single) and 73.29% EM and 81.54% F1 (ensemble) on the SQuAD dev dataset. [ABSTRACT FROM AUTHOR]
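For readers unfamiliar with the two building blocks named in the abstract, the sketch below (not taken from the paper) illustrates the standard SRU recurrence of Lei et al. and a plain dot-product self-matching attention pass in NumPy. The function and parameter names (sru_layer, self_matching, Wx, Wf, Wr) are illustrative assumptions, and the paper's actual R-Net-style gated attention differs in detail; this is a minimal sketch of the general technique only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sru_layer(X, Wx, Wf, bf, Wr, br):
    """One SRU layer over a sequence X of shape (T, d).
    The forget gate f_t and reset gate r_t depend only on x_t,
    not on the previous hidden state, so all matrix products can
    be computed for every time step at once; only the elementwise
    state update below is sequential (input and hidden dims assumed
    equal for the highway connection)."""
    T, d = X.shape
    Xt = X @ Wx                  # candidate inputs, (T, d)
    F = sigmoid(X @ Wf + bf)     # forget gates, (T, d)
    R = sigmoid(X @ Wr + br)     # reset gates, (T, d)
    c = np.zeros(d)
    H = np.zeros((T, d))
    for t in range(T):
        c = F[t] * c + (1.0 - F[t]) * Xt[t]             # internal state
        H[t] = R[t] * np.tanh(c) + (1.0 - R[t]) * X[t]  # highway output
    return H

def self_matching(H):
    """Dot-product self-attention over the RNN outputs H (T, d):
    each position attends to the whole sequence, so tokens with
    similar meaning pool their context (a coreference-like effect)."""
    scores = H @ H.T                                         # (T, T)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)            # softmax rows
    return weights @ H                                       # (T, d)

# Toy usage: encode 5 tokens of dimension 8, then self-match.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))
Wx, Wf, Wr = (rng.standard_normal((8, 8)) * 0.1 for _ in range(3))
H = sru_layer(X, Wx, Wf, np.zeros(8), Wr, np.zeros(8))
M = self_matching(H)
print(M.shape)  # (5, 8)
```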
Database: Supplemental Index
Description
ISSN: 1225-6463
DOI: 10.4218/etrij.2017-0279