Much of the recent discourse within the NLP research community has been ...
Over the past decade, machine learning has revolutionized computers' abi...
As language models grow ever larger, the need for large-scale high-quali...
ROOTS is a 1.6TB multilingual text corpus developed for the training of ...
Two of the most fundamental challenges in Natural Language Understanding...
Transformer-based language models are known to display anisotropic behav...
The recent emergence and adoption of Machine Learning technology, and sp...
Both scientific progress and individual researcher careers depend on the...
Much of recent progress in NLU was shown to be due to models' learning d...
A key part of the NLP ethics movement is responsible use of data, but ex...
Alongside huge volumes of research on deep learning models in NLP in the...
A myriad of explainability methods have been proposed in recent years, b...
The NLP community is currently investing a lot more research and resources i...
Multiple studies have shown that BERT is remarkably robust to pruning, y...
Peer review is our best tool for judging the quality of conference submi...
Much of the recent success in NLP is due to the large Transformer-based ...
Transformer-based models are now widely used in NLP, but we still do not...
We present NarrativeTime, a new timeline-based annotation scheme for tem...
BERT-based architectures currently give state-of-the-art performance on ...
In this paper, we present a method for adversarial decomposition of text...