How important are activation functions in regression and classification? A survey, performance comparison, and future directions

by Ameya D. Jagtap, et al.

Inspired by biological neurons, activation functions play an essential role in the learning process of any artificial neural network, and such networks are commonly used in many real-world problems. Various activation functions have been proposed in the literature for both classification and regression tasks. In this work, we survey activation functions employed in the past as well as the current state of the art. In particular, we trace developments in activation functions over the years and discuss the advantages as well as the disadvantages or limitations of each. We cover classical (fixed) activation functions, including rectifier units, and adaptive activation functions. In addition to a taxonomy of activation functions based on characterization, we also present a taxonomy based on applications. To this end, we systematically compare various fixed and adaptive activation functions on classification datasets such as MNIST, CIFAR-10, and CIFAR-100. In recent years, a physics-informed machine learning framework has emerged for solving problems in scientific computing; we therefore also discuss the requirements on activation functions used within this framework. Furthermore, we compare different fixed and adaptive activation functions using various machine learning libraries such as TensorFlow, PyTorch, and JAX.
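To make the fixed-versus-adaptive distinction concrete, the following minimal NumPy sketch contrasts a classical tanh activation with an adaptive variant that multiplies the input by a trainable slope parameter before applying the nonlinearity. The function names, the scaling form `n * a * x`, and the specific values of `a` and `n` are illustrative assumptions, not an exact reproduction of any method from the survey; in practice the parameter `a` would be optimized jointly with the network weights.

```python
import numpy as np

def tanh_fixed(x):
    """Classical (fixed) tanh activation: no trainable parameters."""
    return np.tanh(x)

def tanh_adaptive(x, a, n=10.0):
    """Adaptive tanh activation (illustrative form).

    A trainable slope parameter `a`, scaled by a fixed factor `n`,
    reshapes the activation during training. Here `a` is passed in
    explicitly; in a real network it would be a learnable parameter
    updated by gradient descent alongside the weights.
    """
    return np.tanh(n * a * x)

x = np.linspace(-1.0, 1.0, 5)
print(tanh_fixed(x))
# With a = 1/n the adaptive form reduces to the fixed tanh.
print(tanh_adaptive(x, a=0.1, n=10.0))
```

Because the adaptive form recovers the fixed activation for a particular setting of its parameter, it can only enlarge the hypothesis space, which is one intuition for why adaptive activations can accelerate training in the experiments surveyed above.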


A survey on modern trainable activation functions

In the literature, there is a strong interest to identify and define act...

Unification of popular artificial neural network activation functions

We present a unified representation of the most popular neural network a...

Learning Specialized Activation Functions for Physics-informed Neural Networks

Physics-informed neural networks (PINNs) are known to suffer from optimi...

A survey on recently proposed activation functions for Deep Learning

Artificial neural networks (ANN), typically referred to as neural networ...

Physical Activation Functions (PAFs): An Approach for More Efficient Induction of Physics into Physics-Informed Neural Networks (PINNs)

In recent years, the gap between Deep Learning (DL) methods and analytic...

Performance Analysis of Open Source Machine Learning Frameworks for Various Parameters in Single-Threaded and Multi-Threaded Modes

The basic features of some of the most versatile and popular open source...

Formalising the Use of the Activation Function in Neural Inference

We investigate how activation functions can be used to describe neural f...
