Mish: A Self Regularized Non-Monotonic Neural Activation Function

08/23/2019
by Diganta Misra, et al.

The concept of non-linearity in a Neural Network is introduced by an activation function, which plays an integral role in the training and performance evaluation of the network. Over the years of theoretical research, many activation functions have been proposed; however, only a few are widely used across most applications, including ReLU (Rectified Linear Unit), TanH (Tan Hyperbolic), Sigmoid, Leaky ReLU and Swish. In this work, a novel neural activation function called Mish is proposed. The experiments show that Mish tends to work better than both ReLU and Swish, along with other standard activation functions, in many deep networks across challenging datasets. For instance, in Squeeze Excite Net-18 for CIFAR 100 classification, the network with Mish had an increase in Top-1 test accuracy of 0.494% and 1.671% as compared to the same network with Swish and ReLU respectively. Its similarity to Swish, along with the boost in performance and its simplicity of implementation, makes it easier for researchers and developers to use Mish in their Neural Network Models.
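
Mish itself is defined in the paper as f(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x)). Below is a minimal sketch of the activation as a drop-in module, assuming PyTorch as the framework (the abstract does not prescribe one):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Mish(nn.Module):
    """Mish activation: f(x) = x * tanh(softplus(x))."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # softplus(x) = ln(1 + exp(x)); wrapping it in tanh gives the
        # smooth, non-monotonic curve described in the paper.
        return x * torch.tanh(F.softplus(x))


# Usage sketch: place Mish where ReLU would normally sit.
if __name__ == "__main__":
    head = nn.Sequential(nn.Linear(128, 64), Mish(), nn.Linear(64, 100))
    out = head(torch.randn(8, 128))
    print(out.shape)  # torch.Size([8, 100])
```

Recent PyTorch releases also ship a built-in torch.nn.Mish, so a hand-rolled module like the one above is mainly useful on older versions or for experimenting with variants.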

Related research

11/08/2021 - SMU: smooth activation function for deep networks using smoothing maximum technique
Deep learning researchers have a keen interest in proposing two new nove...

10/16/2017 - Searching for Activation Functions
The choice of activation functions in deep networks has a significant ef...

08/18/2023 - Capacity Bounds for Hyperbolic Neural Network Representations of Latent Tree Structures
We study the representation capacity of deep hyperbolic neural networks ...

02/02/2023 - Physics Informed Piecewise Linear Neural Networks for Process Optimization
Constructing first-principles models is usually a challenging and time-c...

10/17/2022 - Nish: A Novel Negative Stimulated Hybrid Activation Function
An activation function has a significant impact on the efficiency and ro...

08/21/2021 - SERF: Towards better training of deep neural networks using log-Softplus ERror activation Function
Activation functions play a pivotal role in determining the training dyn...

05/05/2015 - Empirical Evaluation of Rectified Activations in Convolutional Network
In this paper we investigate the performance of different types of recti...
