One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers

06/06/2019
by Ari S. Morcos, et al.

The success of lottery ticket initializations (Frankle and Carbin, 2019) suggests that small, sparsified networks can be trained so long as the network is initialized appropriately. Unfortunately, finding these "winning ticket" initializations is computationally expensive. One potential solution is to reuse the same winning tickets across a variety of datasets and optimizers. However, it remains unclear whether winning ticket initializations generalize in this way. Here, we attempt to answer this question by generating winning tickets for one training configuration (optimizer and dataset) and evaluating their performance on another configuration. Perhaps surprisingly, we found that, within the natural images domain, winning ticket initializations generalized across a variety of datasets, including Fashion MNIST, SVHN, CIFAR-10/100, ImageNet, and Places365, often achieving performance close to that of winning tickets generated on the same dataset. Moreover, winning tickets generated using larger datasets consistently transferred better than those generated using smaller datasets. We also found that winning ticket initializations generalize across optimizers with high performance. These results suggest that winning ticket initializations contain inductive biases generic to neural networks more broadly, which improve training across many settings and provide hope for the development of better initialization methods.
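The transfer procedure the abstract describes can be illustrated with a toy sketch: extract a sparsity mask by magnitude pruning weights trained on one configuration, then apply that mask to the initialization used for a different configuration. Everything below is a hypothetical pure-Python illustration (the paper's actual experiments use full networks with iterative magnitude pruning and rewinding to initialization); `magnitude_prune_mask` and `transfer_ticket` are names invented for this sketch.

```python
# Hypothetical sketch of winning-ticket extraction and cross-configuration
# transfer, on a flat list of weights instead of a real network.

def magnitude_prune_mask(weights, sparsity):
    """Return a binary mask keeping the largest-magnitude (1 - sparsity) fraction."""
    n_keep = round(len(weights) * (1.0 - sparsity))
    # Rank indices by weight magnitude; the top n_keep survive pruning.
    ranked = sorted(range(len(weights)), key=lambda i: abs(weights[i]), reverse=True)
    kept = set(ranked[:n_keep])
    return [1 if i in kept else 0 for i in range(len(weights))]

def transfer_ticket(mask, init_weights):
    """Apply a winning-ticket mask to an initialization from another configuration."""
    return [w * m for w, m in zip(init_weights, mask)]

# Weights "trained" on configuration A (made-up values for illustration):
trained_a = [0.9, -0.05, 0.6, 0.01, -0.7, 0.02]
mask = magnitude_prune_mask(trained_a, sparsity=0.5)

# Reuse the same mask with the initialization for configuration B:
init_b = [0.3, -0.2, 0.1, 0.4, -0.5, 0.25]
sparse_init_b = transfer_ticket(mask, init_b)
print(mask)           # which connections the ticket keeps
print(sparse_init_b)  # the sparse initialization handed to training on B
```

The key design point mirrored here is that only the mask (and the initialization) is reused across configurations, not the trained weights themselves.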

