On Training Recurrent Networks with Truncated Backpropagation Through Time in Speech Recognition

by Hao Tang, et al.

Recurrent neural networks have been the dominant models for many speech and language processing tasks. However, little is understood about the behavior of recurrent networks and the class of functions they can realize. Moreover, the heuristics used during training complicate analysis. In this paper, we study the ability of recurrent networks to learn long-term dependencies in the context of speech recognition. We consider two decoding approaches, online and batch decoding, and show the classes of functions to which they correspond. We then draw a connection between batch decoding and a popular training approach for recurrent networks, truncated backpropagation through time. Changing the decoding approach restricts the amount of past history a recurrent network can use for prediction, which allows us to analyze its ability to remember. Empirically, we exploit long-term dependencies in subphonetic states, phonemes, and words, and show how design decisions, such as the decoding approach, lookahead, context frames, and consecutive prediction, characterize the behavior of recurrent networks. Finally, we draw a connection between Markov processes and vanishing gradients. These results have implications for studying long-term dependencies in speech data and how such properties are learned by recurrent networks.
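To make the training heuristic concrete: truncated backpropagation through time (TBPTT) runs the forward pass over the full sequence but only backpropagates the gradient through the last k time steps. The sketch below, which is not from the paper, illustrates this on a hypothetical scalar RNN h_t = tanh(w·h_{t-1} + u·x_t), computing the gradient of the final hidden state with respect to the recurrent weight w while stopping the backward chain after k steps.

```python
import numpy as np

def rnn_forward(w, u, xs, h0=0.0):
    """Scalar RNN h_t = tanh(w*h_{t-1} + u*x_t); returns all hidden states."""
    hs = [h0]
    for x in xs:
        hs.append(np.tanh(w * hs[-1] + u * x))
    return hs

def grad_w_truncated(w, u, xs, k, h0=0.0):
    """Gradient of the final hidden state h_T w.r.t. w, backpropagating
    through at most the last k time steps (truncated BPTT).
    With k = len(xs) this recovers full BPTT."""
    hs = rnn_forward(w, u, xs, h0)
    T = len(xs)
    g = 0.0
    dh = 1.0  # d h_T / d h_t, accumulated while walking backwards
    for t in range(T, max(T - k, 0), -1):
        pre = w * hs[t - 1] + u * xs[t - 1]
        local = 1.0 - np.tanh(pre) ** 2  # tanh'(pre)
        g += dh * local * hs[t - 1]      # direct dependence of h_t on w
        dh *= local * w                  # chain the gradient into h_{t-1}
    return g
```

Setting k to the full sequence length matches the exact gradient (verifiable against a finite-difference check), while small k discards the contribution of distant time steps, which is exactly the mechanism that limits how much past history the trained network is encouraged to remember.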




