Finding the Needle in the Haystack with Convolutions: on the benefits of architectural bias

06/16/2019
by Stéphane d'Ascoli et al.

Despite the phenomenal success of deep neural networks in a broad range of learning tasks, there is a lack of theory to understand how they work. In particular, Convolutional Neural Networks (CNNs) are known to perform much better than Fully-Connected Networks (FCNs) on spatially structured data: the architecture of CNNs encodes prior knowledge about the data, such as its translation invariance. The aim of this work is to understand this fact through the lens of dynamics in the loss landscape. We introduce a method that maps a CNN to its equivalent FCN (denoted eFCN). Such an embedding enables the comparison of CNN and FCN training dynamics directly in FCN space. We use this method to test a new training protocol, which consists in training a CNN, embedding it into FCN space at a certain 'switch time' t_w, then resuming the training in FCN space. We observe that for all switch times the deviation from the CNN subspace is small, and that the final performance reached by the eFCN is higher than that reachable by a standard FCN. More surprisingly, for some intermediate switch times the eFCN even outperforms the CNN it stemmed from. The practical interest of our protocol is limited by the very large size of the highly sparse eFCN. However, it offers an interesting insight into the persistence of the architectural bias under stochastic gradient dynamics, even in the presence of a huge number of additional degrees of freedom. It shows the existence of rare basins in FCN space associated with very good generalization, which can be accessed thanks to the CNN prior and are otherwise missed.
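The abstract does not spell out the CNN-to-eFCN mapping, but the underlying idea is standard: a convolution is a linear map, so its weights can be scattered into a large, sparse fully connected matrix acting on the flattened input. Below is a minimal NumPy sketch of such an embedding for a single convolutional layer (stride 1, no padding, no bias); the function name conv_to_dense and all shapes are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.signal import correlate2d

def conv_to_dense(kernel, in_shape):
    """Embed a conv layer (stride 1, no padding, no bias) as an
    equivalent dense weight matrix on the flattened input.

    kernel:   (C_out, C_in, k, k) convolution weights
    in_shape: (C_in, H, W) input feature-map shape
    Returns W of shape (C_out*H_out*W_out, C_in*H*W) such that
    W @ x.ravel() equals the convolution output, flattened.
    """
    c_out, c_in, k, _ = kernel.shape
    _, h, w = in_shape
    h_out, w_out = h - k + 1, w - k + 1
    W = np.zeros((c_out * h_out * w_out, c_in * h * w))
    for o in range(c_out):
        for i in range(h_out):
            for j in range(w_out):
                row = (o * h_out + i) * w_out + j  # flattened output index
                for c in range(c_in):
                    for di in range(k):
                        for dj in range(k):
                            # flattened index of input pixel (c, i+di, j+dj)
                            col = (c * h + i + di) * w + (j + dj)
                            W[row, col] = kernel[o, c, di, dj]
    return W

# Sanity check: the dense matrix reproduces the convolution
# (conv layers compute cross-correlation, hence correlate2d).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 8, 8))          # (C_in, H, W) input
kern = rng.normal(size=(4, 3, 3, 3))    # (C_out, C_in, k, k) weights
W = conv_to_dense(kern, x.shape)
y_ref = np.stack([sum(correlate2d(x[c], kern[o, c], mode="valid")
                      for c in range(3)) for o in range(4)])
assert np.allclose((W @ x.ravel()).reshape(y_ref.shape), y_ref)
```

Note that W has only C_out*H_out*W_out*C_in*k*k nonzero entries out of its full size, which is why the resulting eFCN is described as very large and highly sparse. In the protocol above, one would presumably build such a matrix for every layer at the switch time t_w and then let SGD update all entries, including the structural zeros.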
