When classifying grammatical role, BERT doesn't care about word order... except when it matters

by Isabel Papadimitriou et al.

Because meaning can often be inferred from lexical semantics alone, word order is often a redundant cue in natural language. For example, the words chopped, chef, and onion are more likely to convey "The chef chopped the onion" than "The onion chopped the chef." Recent work has shown large language models to be surprisingly word order invariant, but it has largely considered natural, prototypical inputs, where compositional meaning mostly matches lexical expectations. To overcome this confound, we probe grammatical role representation in English BERT and GPT-2 on instances where lexical expectations are not sufficient and word order knowledge is necessary for correct classification. Such non-prototypical instances are naturally occurring English sentences with inanimate subjects or animate objects, or sentences where we systematically swap the arguments to make sentences like "The onion chopped the chef." We find that, while early-layer embeddings are largely lexical, word order is in fact crucial in defining the later-layer representations of words in semantically non-prototypical positions. Our experiments isolate the effect of word order on the contextualization process, and highlight how models use context in the uncommon, but critical, instances where it matters.
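The argument-swapping manipulation described above can be sketched in a few lines. The function name `swap_arguments` and its (start, end) span format are illustrative assumptions, not the authors' code; in practice the argument spans would come from a dependency parse.

```python
def swap_arguments(tokens, subj_span, obj_span):
    """Return a new token list with the subject and object arguments swapped.

    subj_span and obj_span are (start, end) index pairs into `tokens`,
    end-exclusive; subj_span is assumed to precede obj_span.
    (Hypothetical helper, not the authors' implementation.)
    """
    s0, s1 = subj_span
    o0, o1 = obj_span
    subj = tokens[s0:s1]
    obj = tokens[o0:o1]
    # Splice the object into the subject position and vice versa,
    # leaving everything between and around the arguments untouched.
    return tokens[:s0] + obj + tokens[s1:o0] + subj + tokens[o1:]


tokens = ["The", "chef", "chopped", "the", "onion"]
swapped = swap_arguments(tokens, (1, 2), (4, 5))
# swapped == ["The", "onion", "chopped", "the", "chef"]
```

Swapping only the argument heads, rather than shuffling the whole sentence, keeps the lexical content identical while reversing the grammatical roles, which is what lets the probe separate word-order knowledge from lexical expectations.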


