Do RNN and LSTM have Long Memory?

06/06/2020
by Jingyu Zhao, et al.

The LSTM network was proposed to overcome the difficulty of learning long-term dependence, and it has achieved significant advances in applications. With its successes and drawbacks in mind, this paper raises the question: do RNN and LSTM have long memory? We answer it partially by proving, from a statistical perspective, that RNN and LSTM do not have long memory. A new definition of long memory networks is further introduced, which requires the gradient to decay hyperbolically. To verify our theory, we convert RNN and LSTM into long memory networks with a minimal modification, and their superiority is illustrated in modeling the long-term dependence of various datasets.
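The distinction the abstract draws can be sketched numerically. A minimal illustration (not from the paper's code; the rate `r` and exponent `d` below are illustrative values, not results from the paper): a vanilla RNN's gradient contribution from an input t steps back shrinks roughly geometrically, while the long-memory definition requires a hyperbolic, i.e. power-law, decay.

```python
def exponential_decay(t, r=0.9):
    """Geometric gradient decay typical of RNN/LSTM (short memory).

    r is an illustrative contraction rate with |r| < 1.
    """
    return r ** t

def hyperbolic_decay(t, d=0.5):
    """Power-law decay required by the paper's long-memory definition.

    d is an illustrative decay exponent with d > 0.
    """
    return t ** (-d)

# At long lags the exponential term is vanishingly small, while the
# hyperbolic term still carries non-negligible weight.
for t in (10, 100, 1000):
    print(t, exponential_decay(t), hyperbolic_decay(t))
```

The gap between the two regimes widens quickly with the lag, which is why, under this definition, geometrically decaying gradients cannot capture long-term dependence.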


Related research

- 04/18/2019 · Language Modeling through Long Term Memory Network
- 05/06/2016 · LSTM with Working Memory
- 02/24/2017 · Analyzing and Exploiting NARX Recurrent Neural Networks for Long-Term Dependencies
- 02/14/2014 · A Clockwork RNN
- 06/16/2021 · On the long-term learning ability of LSTM LMs
- 12/11/2018 · Learning What to Remember: Long-term Episodic Memory Networks for Learning from Streaming Data
- 10/27/2021 · End-to-end LSTM based estimation of volcano event epicenter localization
