BigGreen at SemEval-2021 Task 1: Lexical Complexity Prediction with Assembly Models

04/19/2021
by   Aadil Islam, et al.

This paper describes a system submitted by team BigGreen to LCP 2021 for predicting the lexical complexity of English words in a given context. We assemble a feature engineering-based model with a deep neural network model founded on BERT. While BERT itself performs competitively, our feature engineering-based model helps in extreme cases, e.g., separating instances of easy and neutral difficulty. Our handcrafted features comprise a breadth of lexical, semantic, syntactic, and novel phonological measures. Visualizations of BERT attention maps offer insight into potential features that Transformer models may learn when fine-tuned for lexical complexity prediction. Our ensembled predictions score reasonably well for the single word subtask, and we demonstrate how they can be harnessed to perform well on the multi-word expression subtask too.
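The ensembling idea described above can be sketched as a simple blend of the two models' complexity scores. The function name, the example scores, and the 50/50 weighting below are all illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def ensemble_predictions(feat_preds, bert_preds, weight=0.5):
    """Blend complexity scores from a handcrafted-feature model and a
    BERT-based model via a weighted average. `weight` is the share
    given to the feature-based model (a placeholder value here)."""
    feat_preds = np.asarray(feat_preds, dtype=float)
    bert_preds = np.asarray(bert_preds, dtype=float)
    return weight * feat_preds + (1.0 - weight) * bert_preds

# Hypothetical complexity scores in [0, 1] for three target words
feat = [0.12, 0.55, 0.83]   # feature engineering-based model
bert = [0.20, 0.50, 0.90]   # fine-tuned BERT model
blended = ensemble_predictions(feat, bert, weight=0.5)
print(blended)  # [0.16  0.525 0.865]
```

In practice the blending weight would be tuned on validation data so that the feature-based model contributes most at the easy end of the complexity scale, where the abstract notes it helps.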

