AfroLM: A Self-Active Learning-based Multilingual Pretrained Language Model for 23 African Languages

by Bonaventure F. P. Dossou, et al.

In recent years, multilingual pretrained language models have gained prominence for their remarkable performance on numerous downstream Natural Language Processing (NLP) tasks. However, pretraining these large multilingual models requires large amounts of training data, which is not available for African languages. Active learning is a semi-supervised learning algorithm in which a model consistently and dynamically learns to identify the most beneficial samples to train on, in order to achieve better optimization and performance on downstream tasks. Furthermore, active learning effectively and practically addresses real-world data scarcity. Despite these benefits, active learning has received little attention in NLP, and especially in multilingual language model pretraining. In this paper, we present AfroLM, a multilingual language model pretrained from scratch on 23 African languages (the largest effort to date) using our novel self-active learning framework. Although pretrained on a dataset significantly (14x) smaller than those of existing baselines, AfroLM outperforms many multilingual pretrained language models (AfriBERTa, XLM-R-base, mBERT) on various downstream NLP tasks (NER, text classification, and sentiment analysis). Additional out-of-domain sentiment analysis experiments show that AfroLM generalizes well across domains. We release the source code and the datasets used in our framework at
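The core active-learning idea described above — a model repeatedly identifying the unlabeled samples it would benefit most from training on — can be illustrated with a minimal uncertainty-sampling sketch. Note that the function names and the toy model below are purely illustrative assumptions, not AfroLM's actual self-active learning framework:

```python
import math

def entropy(probs):
    # Shannon entropy of a predicted class distribution (higher = more uncertain)
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_most_informative(unlabeled, predict, k):
    # Rank unlabeled samples by predictive entropy and return the top-k,
    # i.e. the samples the current model is least certain about.
    scored = sorted(unlabeled, key=lambda x: entropy(predict(x)), reverse=True)
    return scored[:k]

# Toy stand-in for a model: confident on even inputs,
# near-uniform (uncertain) on odd inputs.
def toy_predict(x):
    return [0.9, 0.1] if x % 2 == 0 else [0.55, 0.45]

pool = list(range(10))
batch = select_most_informative(pool, toy_predict, k=3)
print(batch)  # → [1, 3, 5]: the uncertain (odd) samples are selected first
```

In a real pretraining loop, the selected batch would be added to the training set and the model retrained, with selection and training alternating until the labeling or compute budget is exhausted.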


