Goldilocks Neural Networks
We introduce the new "Goldilocks" class of activation functions, which deform the input signal non-linearly only when the input lies in an appropriate local range. This small, local deformation makes it easier to understand how and why the signal is transformed through the layers. Numerical results on the CIFAR-10 and CIFAR-100 datasets show that Goldilocks networks perform comparably to SELU and ReLU networks, while making the deformation of the data through the layers tractable.
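As an illustration of the idea, the sketch below shows a hypothetical locally-deforming activation: it behaves as the identity far from zero and applies a small non-linear bump only when the input is in a narrow range. The function `f(x) = x + a * x * exp(-x^2 / (2 s^2))` and the parameters `a`, `s` are illustrative assumptions, not the activation actually proposed in the paper.

```python
import numpy as np

def goldilocks_like(x, a=0.5, s=1.0):
    """Hypothetical locally-deforming activation (not the paper's function).

    Near x = 0 a Gaussian-windowed term bends the signal; far from zero
    the window vanishes and the activation reduces to the identity.
    """
    return x + a * x * np.exp(-x**2 / (2 * s**2))

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
y = goldilocks_like(x)
# Far from zero the output is (numerically) the input itself:
print(np.allclose(y[[0, -1]], x[[0, -1]], atol=1e-6))  # → True
```

Because the deformation is confined to a known input range, one could in principle track layer by layer which components of the signal were actually transformed, which is the kind of tractability the abstract refers to.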