Calibrated BatchNorm: Improving Robustness Against Noisy Weights in Neural Networks

07/07/2020
by Li-Huang Tsai, et al.

Analog computing hardware has received growing attention in recent years as a way to accelerate neural network computation. However, analog accelerators often suffer from undesirable intrinsic noise caused by their physical components, making it challenging for neural networks to achieve the same performance as on digital hardware. We posit that the performance drop of noisy neural networks stems from distribution shifts in the network activations. In this paper, we propose recalculating the statistics of the batch normalization layers to calibrate these biased distributions during the inference phase. Without needing to know the attributes of the noise beforehand, our approach aligns the activation distributions under the variational noise inherent in analog environments. To validate our assumptions, we conduct quantitative experiments and apply our method to several computer vision tasks, including classification, object detection, and semantic segmentation. The results demonstrate its effectiveness in achieving noise-agnostic robust networks and advance the development of analog computing devices for neural networks.
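The core idea above can be sketched in a few lines of NumPy: after weight noise is injected, the batch normalization statistics are simply re-estimated from forward passes through the noisy network, with no knowledge of the noise itself. This is a minimal illustrative sketch, not the paper's implementation; the layer sizes, noise scale, and variable names are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer: a linear map followed by batch normalization.
W = rng.normal(size=(16, 8))           # clean weights
gamma, beta = np.ones(8), np.zeros(8)  # BN affine parameters

def bn_forward(h, mu, var, eps=1e-5):
    """Standard batch-norm inference using fixed statistics."""
    return gamma * (h - mu) / np.sqrt(var + eps) + beta

# Analog hardware perturbs the weights with noise whose attributes
# we pretend not to know (0.3 is an arbitrary scale for this demo).
W_noisy = W + rng.normal(scale=0.3, size=W.shape)

# Calibration: run a few unlabeled batches through the *noisy*
# network and re-estimate the BN mean/variance from its activations.
# No other parameters are touched, and the noise is never measured.
x_calib = rng.normal(size=(1024, 16))
h_noisy = x_calib @ W_noisy
mu_calib, var_calib = h_noisy.mean(axis=0), h_noisy.var(axis=0)

# With the recalibrated statistics, the post-BN activations are
# re-centred and re-scaled despite the perturbed weights.
out = bn_forward(h_noisy, mu_calib, var_calib)
print(round(float(out.mean()), 6), round(float(out.std()), 3))
```

Because the statistics are estimated directly from the noisy activations, the post-BN outputs return to approximately zero mean and unit variance, which is the distribution alignment the abstract describes.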


Related research

- Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation (01/14/2020)
- Dynamic Precision Analog Computing for Neural Networks (02/12/2021)
- Training Recurrent Neural Networks against Noisy Computations during Inference (07/17/2018)
- The general aspects of noise in analogue hardware deep neural networks (03/12/2021)
- On the Noise Stability and Robustness of Adversarially Trained Networks on NVM Crossbars (09/19/2021)
- Patch-aware Batch Normalization for Improving Cross-domain Robustness (04/06/2023)
- Unified Normalization for Accelerating and Stabilizing Transformers (08/02/2022)
