Eliminating all bad Local Minima from Loss Landscapes without even adding an Extra Unit

01/12/2019
by Jascha Sohl-Dickstein, et al.

Recent work has noted that all bad local minima can be removed from neural network loss landscapes by adding a single unit with a particular parameterization. We show that the core technique from these papers can be used to remove all bad local minima from any loss landscape, so long as the global minimum has a loss of zero. The procedure requires neither the addition of auxiliary units nor that the loss be associated with a neural network. The mechanism is that every bad local minimum is converted into a bad (non-local) minimum at infinity with respect to the auxiliary parameters.
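To make the mechanism concrete, below is a minimal numerical sketch, not the paper's own construction: it assumes an augmented loss of the form L(theta) * (1 - a*exp(b))^2 + lambda*a^2, an invented toy base loss, and arbitrary hyperparameters (lambda, learning rate, step count). For this particular form one can check that no point with L(theta) > 0 is stationary, so gradient descent started at a bad local minimum of L drives the auxiliary parameter b upward (toward infinity) and the augmented objective toward zero, while theta itself stays put.

```python
import numpy as np

LAM = 1e-2  # weight on the auxiliary regularizer (an assumed value)

def base_loss(theta):
    # Toy 1-D loss with a global minimum of exactly zero at theta = 2
    # and a bad local minimum near theta = -0.97 with value ~0.9.
    return (theta - 2.0) ** 2 * ((theta + 1.0) ** 2 + 0.1)

def augmented_loss(theta, a, b):
    # Illustrative augmentation with two auxiliary scalars (a, b): whenever
    # L(theta) > 0, no finite (theta, a, b) is stationary, so descent
    # instead pushes b toward infinity.
    return base_loss(theta) * (1.0 - a * np.exp(b)) ** 2 + LAM * a ** 2

def grad(theta, a, b, eps=1e-6):
    # Central-difference gradient of the augmented loss in (theta, a, b).
    p = np.array([theta, a, b], dtype=float)
    g = np.zeros(3)
    for i in range(3):
        hi, lo = p.copy(), p.copy()
        hi[i] += eps
        lo[i] -= eps
        g[i] = (augmented_loss(*hi) - augmented_loss(*lo)) / (2 * eps)
    return g

# Start exactly at the bad local minimum of the base loss, with a = b = 0.
theta, a, b = -0.9659, 0.0, 0.0
print("augmented gradient at the bad minimum:", grad(theta, a, b))
# theta is stationary for the base loss, but the a-component of the augmented
# gradient is nonzero, so this point is not a local minimum of the new loss.

# Gradient descent on the augmented loss: theta barely moves and L(theta)
# stays ~0.9, yet the augmented loss keeps shrinking as b drifts upward --
# the bad minimum has been pushed out to infinity in the auxiliary parameters.
lr = 0.05
for _ in range(5000):
    g = grad(theta, a, b)
    theta, a, b = theta - lr * g[0], a - lr * g[1], b - lr * g[2]
print(f"L(theta)={base_loss(theta):.3f}  "
      f"augmented={augmented_loss(theta, a, b):.4f}  b={b:.2f}")
```

As the abstract states, the minima reached this way remain bad: the base loss at theta is unchanged, and only the augmented objective decreases as b escapes toward infinity.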
