On Extended Long Short-Term Memory and Dependent Bidirectional Recurrent Neural Network

02/27/2018
by Yuanhang Su, et al.

In this work, we investigate the memory capability of recurrent neural networks (RNNs), where this capability is defined as a function that maps an element in a sequence to the current output. We first analyze the system function of an RNN cell and provide analytical results for three RNNs: the simple recurrent neural network (SRN), the long short-term memory (LSTM), and the gated recurrent unit (GRU). Based on this analysis, we propose a new design that extends the memory length of a cell, called the extended long short-term memory (ELSTM). Next, we present a dependent bidirectional recurrent neural network (DBRNN) for the sequence-in-sequence-out (SISO) problem, which is more robust to previous erroneous predictions. Extensive experiments on different language tasks demonstrate the superiority of the proposed ELSTM and DBRNN solutions.
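As a concrete illustration of this memory notion, the sketch below implements one step of a standard LSTM cell in NumPy, together with a hypothetical ELSTM-style variant that adds a trainable per-step scaling vector s_t to the cell update. The function names (lstm_step, elstm_like_step), the gate ordering, and the placement of the scaling factor are assumptions made purely for illustration; the paper's exact ELSTM formulation should be consulted for the actual design.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    # One step of a standard LSTM cell.
    # W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias,
    # stacked in the (assumed) order [input, forget, candidate, output].
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    g = np.tanh(z[2*H:3*H])      # cell candidate
    o = sigmoid(z[3*H:4*H])      # output gate
    c = f * c_prev + i * g       # cell state: the carried "memory"
    h = o * np.tanh(c)           # hidden state / current output
    return h, c

def elstm_like_step(x, h_prev, c_prev, W, U, b, s_t):
    # Hypothetical ELSTM-style step: identical to lstm_step except that a
    # trainable per-step scaling vector s_t rescales the new input
    # contribution, one plausible way to modulate how long an input can
    # influence the output. This placement of s_t is an assumption, not
    # the paper's exact formulation.
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])
    f = sigmoid(z[H:2*H])
    g = np.tanh(z[2*H:3*H])
    o = sigmoid(z[3*H:4*H])
    c = f * c_prev + s_t * (i * g)
    h = o * np.tanh(c)
    return h, c

# Example: one step with input size 2 and hidden size 3, random weights.
rng = np.random.default_rng(0)
D, H = 2, 3
W = rng.standard_normal((4 * H, D))
U = rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
h, c = lstm_step(rng.standard_normal(D), np.zeros(H), np.zeros(H), W, U, b)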
