Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP

05/29/2020
by Rob van der Goot, et al.

Transfer learning, particularly approaches that combine multi-task learning with pre-trained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing tremendously in recent years. In this paper we present MaChAmp, a toolkit for easy fine-tuning of BERT-like models in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of NLP tasks in a uniform toolkit, from text classification to sequence labeling and dependency parsing.
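
MaChAmp is driven by JSON configuration files that declare datasets and the tasks trained on them. As an illustrative sketch only (the key names and task types below follow the conventions of the MaChAmp repository but may differ between versions, so the exact schema should be checked against its documentation), a configuration that jointly trains POS tagging and dependency parsing on one treebank might be generated like this:

```python
import json
import os

# Illustrative MaChAmp-style dataset configuration. Key names
# ("train_data_path", "word_idx", "tasks", "task_type", "column_idx")
# are assumptions based on the toolkit's documented conventions,
# not a guaranteed schema.
config = {
    "UD_English-EWT": {
        "train_data_path": "data/en_ewt-ud-train.conllu",
        "validation_data_path": "data/en_ewt-ud-dev.conllu",
        "word_idx": 1,  # CoNLL-U column holding the word form
        "tasks": {
            # sequence labeling over the UPOS column
            "upos": {"task_type": "seq", "column_idx": 3},
            # dependency parsing over the head/relation columns
            "dependency": {"task_type": "dependency", "column_idx": 6},
        },
    }
}

os.makedirs("configs", exist_ok=True)
with open("configs/upos-deps.json", "w") as f:
    json.dump(config, f, indent=4)
```

Training would then be launched with the toolkit's training script, pointing it at this file (the exact flag name, e.g. `--dataset_config`, is version-dependent and given here as an assumption).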

