Latent Universal Task-Specific BERT

05/16/2019
by Alon Rozental et al.

This paper describes a language representation model that combines the Bidirectional Encoder Representations from Transformers (BERT) learning mechanism described in Devlin et al. (2018) with a generalization of the Universal Transformer model described in Dehghani et al. (2018). We further improve this model by adding, for each training example, a latent variable that represents the persona and topics of interest of the writer. We also describe a simple method to improve the usefulness of our language representation for solving problems in a specific domain, at the expense of its ability to generalize to other fields. Finally, we release a pre-trained language representation model for social texts that was trained on 100 million tweets.
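The abstract does not specify how the latent writer variable enters the network. One common way to condition a transformer on such a variable is to add a learned per-persona embedding to the token embeddings before the encoder layers. The sketch below illustrates that idea only; the names (`persona_table`, `add_persona`) and dimensions are illustrative assumptions, not the authors' implementation, which would also need a mechanism to infer or marginalize the latent variable during training.

```python
# Illustrative sketch (not the paper's code): conditioning BERT-style token
# embeddings on a discrete latent "persona" by adding a learned embedding.
import numpy as np

rng = np.random.default_rng(0)

n_personas, hidden, seq_len = 4, 8, 5
# Hypothetical learned table: one vector per latent persona/topic profile.
persona_table = rng.normal(size=(n_personas, hidden))
# Stand-in for the output of BERT's token embedding layer.
token_embeddings = rng.normal(size=(seq_len, hidden))

def add_persona(token_emb, persona_id):
    """Broadcast one persona vector over every token position."""
    return token_emb + persona_table[persona_id]

out = add_persona(token_embeddings, persona_id=2)
print(out.shape)  # (5, 8)
```

In a real model the persona embedding would be trained jointly with the rest of the network, and at inference time the latent value would be inferred from context or summed out, since the writer's persona is not observed.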


Related research

- 03/05/2020: What the [MASK]? Making Sense of Language-Specific BERT Models
- 02/24/2021: Re-Evaluating GermEval17 Using German Pre-Trained Language Models
- 07/29/2019: Machine Translation Evaluation with BERT Regressor
- 02/27/2020: A Primer in BERTology: What we know about how BERT works
- 11/24/2020: Tackling Domain-Specific Winograd Schemas with Knowledge-Based Reasoning and Machine Learning
- 01/31/2019: Multi-Task Deep Neural Networks for Natural Language Understanding
- 02/13/2022: ET-BERT: A Contextualized Datagram Representation with Pre-training Transformers for Encrypted Traffic Classification
