HDAM: Heuristic Difference Attention Module for Convolutional Neural Networks

02/19/2022
by   Yu Xue, et al.

The attention mechanism is one of the most important forms of prior knowledge for enhancing convolutional neural networks. Most attention mechanisms are bound to the convolutional layer and use local or global contextual information to recalibrate the input; this is a popular design strategy for attention. Global contextual information helps the network consider the overall distribution, while local contextual information is more general. The contextual information makes the network pay attention to the mean or maximum value of a particular receptive field. Unlike most attention mechanisms, this article proposes a novel heuristic difference attention module, HDAM. HDAM's input recalibration is based on the difference between the local and global contextual information instead of the mean and maximum values. At the same time, to give different layers a more suitable local receptive field size and to increase the flexibility of the local receptive field design, we use a genetic algorithm to heuristically produce local receptive fields. First, HDAM extracts the mean values of the global and local receptive fields as the corresponding contextual information. Then the difference between the global and local contextual information is calculated. Finally, HDAM uses this difference to recalibrate the input. In addition, we use the heuristic ability of the genetic algorithm to search for the local receptive field size of each layer. Our experiments on CIFAR-10 and CIFAR-100 show that HDAM can use fewer parameters than other attention mechanisms while achieving higher accuracy. We implement HDAM with the Python library PyTorch, and the code and models will be publicly available.
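The recalibration steps described above (mean-pool the global and local receptive fields, take their difference, and rescale the input) can be sketched as a small PyTorch module. This is a minimal illustration based only on the abstract; the layer layout, the sigmoid gating, and the per-layer kernel size argument are assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HDAMSketch(nn.Module):
    """Hypothetical sketch of a difference-based attention module.

    Recalibrates the input using the difference between global and
    local contextual information (channel-wise means), as outlined
    in the abstract. The exact architecture is an assumption.
    """

    def __init__(self, local_kernel_size: int = 3):
        super().__init__()
        # In the paper, this size would be chosen per layer by a
        # genetic algorithm; here it is a plain hyperparameter.
        self.local_kernel_size = local_kernel_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Global contextual information: mean over the whole feature map.
        global_ctx = x.mean(dim=(2, 3), keepdim=True)  # (B, C, 1, 1)

        # Local contextual information: mean over a local receptive field.
        pad = self.local_kernel_size // 2
        local_ctx = F.avg_pool2d(
            x, self.local_kernel_size, stride=1, padding=pad
        )  # (B, C, H, W)

        # Difference between global and local context drives recalibration.
        diff = global_ctx - local_ctx
        attn = torch.sigmoid(diff)  # gating choice is an assumption
        return x * attn
```

A usage example: `HDAMSketch(local_kernel_size=3)(torch.randn(2, 8, 16, 16))` returns a tensor of the same shape, with each position scaled by how its local context deviates from the global mean.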


