A Dynamic Head Importance Computation Mechanism for Neural Machine Translation

08/03/2021
by Akshay Goindani, et al.

Multiple parallel attention mechanisms that use multiple attention heads facilitate greater performance of the Transformer model for various applications, e.g., Neural Machine Translation (NMT) and text classification. In the multi-head attention mechanism, different heads attend to different parts of the input. However, multiple heads might attend to the same part of the input, making those heads redundant and leaving model resources under-utilized. One approach to avoid this is to prune the least important heads based on an importance score. In this work, we focus on designing a Dynamic Head Importance Computation Mechanism (DHICM) to dynamically calculate the importance of a head with respect to the input. Our insight is to add an attention layer on top of multi-head attention and use the outputs of the multi-head attention, along with the input, to compute an importance score for each head. We also add an extra loss term that prevents the model from assigning the same score to all heads, which helps identify the more important heads and improves performance. We analyzed the performance of DHICM for NMT on different language pairs. Experiments on several datasets show that DHICM outperforms the traditional Transformer-based approach by a large margin, especially when little training data is available.
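The abstract does not give the exact formulation, but the idea of scoring each head conditioned on the input and reweighting the heads can be sketched roughly as follows. This is a minimal, hypothetical PyTorch sketch: the class name DynamicHeadImportance, the projection layers, and the entropy-based uniformity penalty are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a dynamic head-importance layer (names and details are
# illustrative, not taken from the paper). It scores each attention head using
# the layer input and the heads' outputs, reweights the heads accordingly, and
# returns a penalty that discourages giving every head the same score.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicHeadImportance(nn.Module):
    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        # Query comes from the layer input, keys from the per-head outputs.
        self.q_proj = nn.Linear(d_model, self.head_dim)
        self.k_proj = nn.Linear(self.head_dim, self.head_dim)

    def forward(self, x: torch.Tensor, head_outputs: torch.Tensor):
        """
        x:            (batch, seq_len, d_model)            layer input
        head_outputs: (batch, seq_len, num_heads, head_dim) per-head outputs
        Returns the head-reweighted output (batch, seq_len, d_model) and a
        uniformity penalty that can be added to the training loss.
        """
        q = self.q_proj(x).unsqueeze(2)                     # (B, T, 1, Dh)
        k = self.k_proj(head_outputs)                       # (B, T, H, Dh)
        scores = (q * k).sum(-1) / self.head_dim ** 0.5     # (B, T, H)
        importance = F.softmax(scores, dim=-1)              # per-head weights
        weighted = head_outputs * importance.unsqueeze(-1)  # scale each head
        out = weighted.flatten(-2)                          # (B, T, d_model)
        # The entropy of the importance distribution is maximal when all heads
        # receive equal weight; penalizing it pushes the scores apart. This is
        # one plausible choice for the extra loss, not necessarily the paper's.
        penalty = -(importance * (importance + 1e-9).log()).sum(-1).mean()
        return out, penalty


if __name__ == "__main__":
    B, T, H, Dh = 2, 5, 8, 64
    layer = DynamicHeadImportance(d_model=H * Dh, num_heads=H)
    x = torch.randn(B, T, H * Dh)
    heads = torch.randn(B, T, H, Dh)
    out, penalty = layer(x, heads)
    print(out.shape, penalty.item())
```

In training, the penalty would be added to the translation loss with a small weight so that head scores stay input-dependent rather than collapsing to a uniform distribution.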

Related research

02/24/2020
Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation
Transformer-based models have brought a radical change to neural machine...

04/28/2020
Scheduled DropHead: A Regularization Method for Transformer Models
In this paper, we introduce DropHead, a structured dropout method specif...

12/22/2020
Multi-Head Self-Attention with Role-Guided Masks
The state of the art in learning meaningful semantic representations of ...

05/25/2019
Are Sixteen Heads Really Better than One?
Attention is a powerful and ubiquitous mechanism for allowing neural mod...

08/10/2021
Differentiable Subset Pruning of Transformer Heads
Multi-head attention, a collection of several attention mechanisms that ...

03/09/2019
Combining 3D Morphable Models: A Large scale Face-and-Head Model
Three-dimensional Morphable Models (3DMMs) are powerful statistical tool...

09/21/2020
Alleviating the Inequality of Attention Heads for Neural Machine Translation
Recent studies show that the attention heads in Transformer are not equa...
