Gated recurrent units viewed through the lens of continuous time dynamical systems
Gated recurrent units (GRUs) are specialized memory elements for building recurrent neural networks. Despite their incredible success in natural language, speech, and video processing, little is understood about the specific dynamics representable in a GRU network, or about the constraints these dynamics impose when generalizing from a trained task. As a result, it is difficult to know a priori how well a GRU network will perform on a given task. Using a continuous time analysis, we gain intuition into the inner workings of GRU networks. We restrict our presentation to low dimensions to allow for comprehensive visualization. We find a surprisingly rich repertoire of dynamical features, including stable limit cycles (nonlinear oscillations), multi-stable dynamics with various topologies, and homoclinic orbits. We contextualize the usefulness of these different kinds of dynamics and experimentally verify their existence.
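The abstract leaves the continuous time analysis itself implicit; the following sketch is our assumption of what such an analysis looks like, built on the standard GRU equations (Cho et al., 2014) read as a forward-Euler step of size one. The discrete update is

\begin{align}
  z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z), \\
  r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r), \\
  \tilde{h}_t &= \tanh\!\big(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\big), \\
  h_t &= z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t,
\end{align}

where $\sigma$ is the logistic sigmoid, $\odot$ denotes the elementwise product, and $W$, $U$, $b$ are learned parameters. Rewriting the last line as $h_t = h_{t-1} + (1 - z_t) \odot (\tilde{h}_t - h_{t-1})$ and treating it as one Euler step suggests the continuous-time system

\begin{equation}
  \dot{h} = \big(1 - z(h, x)\big) \odot \big(\tilde{h}(h, x) - h\big),
\end{equation}

whose fixed points, limit cycles, and homoclinic orbits can be read off a phase portrait when the hidden state is restricted to one or two dimensions, as in the low-dimensional presentation above.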