ASU-CNN: An Efficient Deep Architecture for Image Classification and Feature Visualizations

05/28/2023
by Jamshaid Ul Rahman, et al.

Activation functions play a decisive role in determining the capacity of Deep Neural Networks, as they enable neural networks to capture the inherent nonlinearities present in the data fed to them. Prior research on activation functions primarily focused on the utility of monotonic or non-oscillatory functions, until the Growing Cosine Unit broke this taboo for a number of applications. In this paper, a Convolutional Neural Network model named ASU-CNN is proposed, which utilizes the recently designed activation function ASU across its layers. The effect of this non-monotonic and oscillatory function is inspected through feature map visualizations from different convolutional layers. The proposed network is optimized with Adam using a fine-tuned learning rate. The network achieved promising results on both training and testing data for the classification of CIFAR-10. The experimental results affirm the computational feasibility and efficacy of the proposed model for performing tasks related to the field of computer vision.
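The abstract describes a standard image-classification pipeline: a CNN whose layers use the oscillatory ASU activation, trained on CIFAR-10 with Adam and a tuned learning rate, with feature maps read off intermediate convolutional layers. The sketch below illustrates that setup in tf.keras; the `asu` function used here is a stand-in of the assumed form x·sin(x), and the layer sizes, learning rate, and epoch count are illustrative assumptions rather than the paper's actual architecture or hyperparameters.

```python
import tensorflow as tf

# Stand-in for the ASU activation: an oscillatory, non-monotonic form x*sin(x)
# is assumed here; the exact definition is given in the ASU paper.
def asu(x):
    return x * tf.sin(x)

def build_asu_cnn(num_classes=10):
    # Illustrative layer sizes; the paper's actual ASU-CNN architecture may differ.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(32, 32, 3)),
        tf.keras.layers.Conv2D(32, 3, padding="same", activation=asu),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, padding="same", activation=asu),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation=asu),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

# CIFAR-10 classification trained with Adam and a hand-tuned learning rate.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = build_asu_cnn()
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=10, validation_data=(x_test, y_test))

# Feature-map visualization: read the activations of the first convolutional
# layer for a test image; each channel can then be plotted as a 2D map.
feature_extractor = tf.keras.Model(inputs=model.inputs,
                                   outputs=model.layers[0].output)
feature_maps = feature_extractor(x_test[:1])  # shape (1, 32, 32, 32)
```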
