Towards Robust Low-Resource Fine-Tuning with Multi-View Compressed Representations

by Linlin Liu, et al.
Alibaba Group / Nanyang Technological University

Due to their huge number of parameters, fine-tuned pretrained language models (PLMs) are prone to overfitting in low-resource scenarios. In this work, we present a novel method that operates on the hidden representations of a PLM to reduce overfitting. During fine-tuning, our method inserts random autoencoders between the hidden layers of a PLM, which transform activations from the previous layers into a multi-view compressed representation before feeding it into the upper layers. The autoencoders are plugged out after fine-tuning, so our method adds no extra parameters and incurs no additional computation cost at inference time. Our method demonstrates promising performance improvements across a wide range of sequence- and token-level low-resource NLP tasks.
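The mechanism described in the abstract can be illustrated with a minimal NumPy sketch. All names, dimensions, and the random-selection scheme below are assumptions for illustration only (the paper's actual architecture and training procedure are not detailed here): each autoencoder squeezes activations through a bottleneck to produce one compressed "view", a random one is applied between layers during fine-tuning, and all of them are bypassed at inference.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 16      # hypothetical PLM hidden size (assumption)
BOTTLENECK = 4   # compressed bottleneck dimension (assumption)


class Autoencoder:
    """Compresses activations through a low-dimensional bottleneck and
    reconstructs them, yielding one regularized 'view' of the input."""

    def __init__(self, hidden, bottleneck):
        self.enc = rng.standard_normal((hidden, bottleneck)) / np.sqrt(hidden)
        self.dec = rng.standard_normal((bottleneck, hidden)) / np.sqrt(bottleneck)

    def __call__(self, h):
        # encode -> nonlinearity -> decode; output shape matches the input,
        # so the upper PLM layers are unaffected by the insertion
        return np.tanh(h @ self.enc) @ self.dec


def between_layers(h, autoencoders, training):
    """During fine-tuning, pass activations through one randomly chosen
    autoencoder (a random view); at inference the autoencoders are
    plugged out and activations flow through unchanged."""
    if training:
        ae = autoencoders[rng.integers(len(autoencoders))]
        return ae(h)
    return h


# Several autoencoders give multiple compressed views of the same layer.
aes = [Autoencoder(HIDDEN, BOTTLENECK) for _ in range(3)]
h = rng.standard_normal((2, 5, HIDDEN))  # (batch, seq_len, hidden)

out_train = between_layers(h, aes, training=True)
out_infer = between_layers(h, aes, training=False)
```

Because the autoencoder preserves the activation shape, it can be dropped between any two layers during fine-tuning and removed afterwards without changing the model's parameter count or inference cost.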


