Working Memory Connections for LSTM

08/31/2021
by Federico Landi, et al.

Recurrent Neural Networks with Long Short-Term Memory (LSTM) make use of gating mechanisms to mitigate exploding and vanishing gradients when learning long-term dependencies. For this reason, LSTMs and other gated RNNs are widely adopted, being the de facto standard for many sequence modeling tasks. Although the memory cell inside the LSTM contains essential information, it is not allowed to influence the gating mechanism directly. In this work, we improve the gate potential by including information coming from the internal cell state. The proposed modification, named Working Memory Connection, consists of adding a learnable nonlinear projection of the cell content into the network gates. This modification can fit into the classical LSTM gates without any assumption on the underlying task, and it is particularly effective when dealing with longer sequences. Previous research efforts in this direction, which date back to the early 2000s, could not bring a consistent improvement over the vanilla LSTM. In this paper, we identify a key issue with those earlier connections that heavily limits their effectiveness, hence preventing a successful integration of the knowledge coming from the internal cell state. We show through extensive experimental evaluation that Working Memory Connections consistently improve the performance of LSTMs on a variety of tasks. Numerical results suggest that the cell state contains useful information that is worth including in the gate structure.
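To make the idea concrete, the snippet below is a minimal, illustrative PyTorch sketch of an LSTM cell whose gates additionally receive a learnable nonlinear (tanh) projection of the previous cell state. The class name WorkingMemoryLSTMCell, the choice of tanh as the nonlinearity, and the decision to feed the projection to the input, forget, and output gates (leaving the candidate untouched) are assumptions made for illustration; this is not the authors' reference implementation.

import torch
import torch.nn as nn

class WorkingMemoryLSTMCell(nn.Module):
    """LSTM cell whose gates also see a learnable nonlinear projection
    of the previous cell state (illustrative sketch, not the paper's code)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Standard LSTM affine maps for the input, forget, output gates and the candidate.
        self.x2gates = nn.Linear(input_size, 4 * hidden_size)
        self.h2gates = nn.Linear(hidden_size, 4 * hidden_size, bias=False)
        # Assumed form of the Working Memory Connection: a learnable projection
        # of the cell content, squashed by tanh, added to the three gate pre-activations.
        self.c2gates = nn.Linear(hidden_size, 3 * hidden_size, bias=False)

    def forward(self, x, state):
        h_prev, c_prev = state
        gates = self.x2gates(x) + self.h2gates(h_prev)
        i, f, o, g = gates.chunk(4, dim=-1)
        # Nonlinear projection of the cell content, injected into the gates.
        wm = torch.tanh(self.c2gates(c_prev))
        wm_i, wm_f, wm_o = wm.chunk(3, dim=-1)
        i = torch.sigmoid(i + wm_i)
        f = torch.sigmoid(f + wm_f)
        g = torch.tanh(g)
        c = f * c_prev + i * g
        # The output gate could alternatively read the updated cell state c;
        # here it reads the projection of the previous one for simplicity.
        o = torch.sigmoid(o + wm_o)
        h = o * torch.tanh(c)
        return h, (h, c)

In a vanilla LSTM the c2gates projection would simply be absent and the gate pre-activations would depend only on the current input and the previous hidden state; that single additive term is the only structural change the sketch makes.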
