Influence Paths for Characterizing Subject-Verb Number Agreement in LSTM Language Models

by Kaiji Lu et al.

LSTM-based recurrent neural networks are the state-of-the-art for many natural language processing (NLP) tasks. Despite their performance, it is unclear whether, or how, LSTMs learn structural features of natural language such as subject-verb number agreement in English. Without this understanding, the generality of LSTMs' performance on this task and their suitability for related tasks remain uncertain, and errors cannot be properly attributed to a lack of structural capability, training data omissions, or other faults. We introduce *influence paths*, a causal account of structural properties as carried by paths across the gates and neurons of a recurrent neural network. The approach refines a coarse notion of influence (the subject's grammatical number influences the grammatical number of the subsequent verb) into a set of gate- or neuron-level paths. This set localizes and segments the concept (e.g., subject-verb agreement), its constituent elements (e.g., the subject), and related or interfering elements (e.g., attractors). We exemplify the methodology on a widely studied multi-layer LSTM language model, demonstrating how it accounts for subject-verb number agreement. The results offer both a finer and a more complete view of an LSTM's handling of this structural aspect of English than prior results based on diagnostic classifiers and ablation.
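The paper's full influence-path machinery is not reproduced here, but the core idea — tracing how a subject's grammatical number flows through specific LSTM gates toward the verb prediction — can be illustrated with a minimal sketch. Everything below is an assumption for illustration: a toy single-unit LSTM cell with hand-set scalar weights, a ±1 encoding of the subject's number, and zero-encoded intervening tokens; none of it reflects the authors' actual model or code.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W):
    """One step of a single-unit LSTM cell (scalar weights, no biases)."""
    i = sigmoid(W['wi_x'] * x + W['wi_h'] * h)    # input gate
    f = sigmoid(W['wf_x'] * x + W['wf_h'] * h)    # forget gate
    g = math.tanh(W['wg_x'] * x + W['wg_h'] * h)  # candidate cell value
    o = sigmoid(W['wo_x'] * x + W['wo_h'] * h)    # output gate
    c = f * c + i * g
    return o * math.tanh(c), c

def number_influence(W, steps=3):
    """Difference in the final hidden state when the subject's number
    (encoded here, by assumption, as +1 singular / -1 plural) is flipped
    at the first time step; intervening tokens are encoded as 0."""
    outs = []
    for x0 in (+1.0, -1.0):
        h, c = lstm_step(x0, 0.0, 0.0, W)
        for _ in range(steps - 1):
            h, c = lstm_step(0.0, h, c, W)
        outs.append(h)
    return outs[0] - outs[1]

# Illustrative weights: only the input-to-candidate path (wg_x) carries
# the number signal into the cell state.
W = {'wi_x': 0.0, 'wi_h': 0.0, 'wf_x': 0.0, 'wf_h': 0.0,
     'wg_x': 2.0, 'wg_h': 0.0, 'wo_x': 0.0, 'wo_h': 0.0}
```

With these weights, `number_influence(W)` is positive, and setting `wg_x = 0.0` severs the only path carrying the number signal, driving the influence to exactly zero — a toy analogue of localizing an agreement concept to specific gate-level paths rather than to the network as a whole.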




Related papers:

- Under the Hood: Using Diagnostic Classifiers to Investigate and Improve how Language Models Track Agreement Information
- LSTMs Exploit Linguistic Attributes of Data
- Can LSTM Learn to Capture Agreement? The Case of Basque
- Abstracting Influence Paths for Explaining (Contextualization of) BERT Models
- Structural Supervision Improves Learning of Non-Local Grammatical Dependencies
- Can RNNs trained on harder subject-verb agreement instances still perform well on easier ones?
- Can RNNs learn Recursive Nested Subject-Verb Agreements?
