Efficient Design of Neural Networks with Random Weights

08/24/2020
by Ajay M. Patrikar, et al.

Single-layer feedforward networks with random weights are known for their fast, non-iterative training algorithms and are successful in a variety of classification and regression problems. A major drawback of these networks is that they require a large number of hidden units. In this paper, we propose a technique that substantially reduces the number of hidden units without significantly affecting network accuracy. We introduce the concept of primary and secondary hidden units. The weights of the primary hidden units are chosen randomly, while the secondary hidden units are derived from pairwise combinations of the primary hidden units. Using this technique, we show that the number of hidden units can be reduced by at least one order of magnitude. We experimentally show that this technique leads to a significant drop in computation at inference time while having only a minor impact on network accuracy. A much larger reduction in computation is possible if slightly lower accuracy is acceptable.
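
The abstract does not spell out how a secondary unit combines a pair of primary units, so the sketch below is only illustrative: it assumes each secondary unit is the elementwise product of two primary-unit activations, uses a tanh activation, and fits the output weights with regularized least squares, as is standard for random-weight networks. The function names, the regularization constant, and the combination rule are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_random_weight_net(X, Y, n_primary=20):
    """Fit a random-weight network with primary and secondary hidden units.

    X: (n_samples, n_features), Y: (n_samples, n_outputs).
    Only the output weights are learned; hidden weights stay random.
    """
    n_features = X.shape[1]

    # Primary hidden units: random input weights and biases (no iterative training).
    W = rng.standard_normal((n_features, n_primary))
    b = rng.standard_normal(n_primary)
    H_primary = np.tanh(X @ W + b)                      # (n_samples, n_primary)

    # Secondary hidden units: one per pair of primary units.
    # Assumption: the pairwise combination is an elementwise product of activations.
    i, j = np.triu_indices(n_primary, k=1)
    H_secondary = H_primary[:, i] * H_primary[:, j]     # (n_samples, n_pairs)

    H = np.hstack([H_primary, H_secondary])

    # Output weights via regularized least squares (closed form, non-iterative).
    beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(H.shape[1]), H.T @ Y)
    return W, b, beta

def predict(X, W, b, beta):
    H_primary = np.tanh(X @ W + b)
    i, j = np.triu_indices(W.shape[1], k=1)
    H = np.hstack([H_primary, H_primary[:, i] * H_primary[:, j]])
    return H @ beta

# Example usage on random data (shapes only; not the paper's experiments):
X = rng.standard_normal((100, 8))
Y = rng.standard_normal((100, 1))
W, b, beta = train_random_weight_net(X, Y, n_primary=20)
Y_hat = predict(X, W, b, beta)
```

Under these assumptions, only the small set of primary units carries random weights, while the secondary units are computed from them at negligible extra cost, which is consistent with the paper's stated goal of cutting the hidden-unit count and inference-time computation.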


Related research

- Multi-Activation Hidden Units for Neural Networks with Random Weights (09/06/2020)
  Single layer feedforward networks with random weights are successful in ...

- Universal Approximation of Markov Kernels by Shallow Stochastic Feedforward Networks (03/24/2015)
  We establish upper bounds for the minimal number of hidden units for whi...

- A closer look at the approximation capabilities of neural networks (02/16/2020)
  The universal approximation theorem, in one of its most general versions...

- A Constructive Algorithm for Feedforward Neural Networks for Medical Diagnostic Reasoning (09/23/2010)
  This research is to search for alternatives to the resolution of complex...

- Learning Combinations of Sigmoids Through Gradient Estimation (08/22/2017)
  We develop a new approach to learn the parameters of regression models w...

- Dissecting Pruned Neural Networks (06/29/2019)
  Pruning is a standard technique for removing unnecessary structure from ...

- Pattern Classification using Simplified Neural Networks (09/25/2010)
  In recent years, many neural network models have been proposed for patte...
