AUBER: Automated BERT Regularization

09/30/2020
by Hyun Dong Lee, et al.

How can we effectively regularize BERT? Although BERT proves its effectiveness in various downstream natural language processing tasks, it often overfits when there are only a small number of training instances. A promising direction for regularizing BERT is to prune its attention heads according to a proxy score for head importance. However, heuristic-based methods are usually suboptimal, since they predetermine the order in which attention heads are pruned. To overcome this limitation, we propose AUBER, an effective regularization method that leverages reinforcement learning to automatically prune attention heads from BERT. Instead of depending on heuristics or rule-based policies, AUBER learns a pruning policy that determines which attention heads should or should not be pruned for regularization. Experimental results show that AUBER outperforms existing pruning methods by achieving up to 10% better accuracy. We further demonstrate the effectiveness of our design choices for AUBER.
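AUBER's contribution is the learned pruning policy; the head-removal operation itself is a standard capability of BERT implementations. Below is a minimal sketch, assuming the Hugging Face transformers library, of how attention heads are pruned from a BERT classifier. The specific head indices here are arbitrary placeholders for illustration; AUBER would instead select them with its reinforcement-learning policy.

    # Minimal sketch of attention-head pruning on BERT (not AUBER itself).
    # Assumes the Hugging Face `transformers` package is installed.
    from transformers import BertForSequenceClassification

    model = BertForSequenceClassification.from_pretrained("bert-base-uncased")

    # Map each encoder layer index to the attention heads to remove.
    # bert-base has 12 layers x 12 heads; these choices are hypothetical.
    heads_to_prune = {
        0: [2, 7],   # prune heads 2 and 7 in layer 0
        5: [0],      # prune head 0 in layer 5
        11: [3, 9],  # prune heads 3 and 9 in the last layer
    }
    model.prune_heads(heads_to_prune)

    # The pruned model has fewer parameters; on small training sets this
    # capacity reduction is the regularization effect AUBER aims for.
    print(model.num_parameters())

A policy-based method such as AUBER would evaluate the fine-tuned model after each candidate pruning action and use the resulting validation performance as a reward signal, rather than committing to a fixed proxy-score ordering up front.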


Related research:

06/21/2022  An Automatic and Efficient BERT Pruning for Edge AI Systems
With the yearning for deep learning democratization, there are increasin...

11/20/2018  Structured Pruning for Efficient ConvNets via Incremental Regularization
Parameter pruning is a promising approach for CNN compression and accele...

06/30/2022  The Topological BERT: Transforming Attention into Topology for Natural Language Processing
In recent years, the introduction of the Transformer models sparked a re...

02/19/2020  Compressing BERT: Studying the Effects of Weight Pruning on Transfer Learning
Universal feature extractors, such as BERT for natural language processi...

10/12/2022  GMP*: Well-Tuned Global Magnitude Pruning Can Outperform Most BERT-Pruning Methods
We revisit the performance of the classic gradual magnitude pruning (GMP...

05/14/2021  BERT Busters: Outlier LayerNorm Dimensions that Disrupt BERT
Multiple studies have shown that BERT is remarkably robust to pruning, y...
