Simple2Complex: Global Optimization by Gradient Descent

05/02/2016
by Ming Li, et al.

A method named simple2complex for modeling and training deep neural networks is proposed. Simple2complex trains a deep network by smoothly adding more and more layers to an initially shallow network, so that the network effectively grows as learning proceeds. Compared with end-to-end training, simple2complex is less likely to become trapped in a local minimum, giving it a capacity for global optimization. CIFAR-10 is used to verify the advantage of simple2complex.
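To make the growing procedure concrete, here is a minimal PyTorch sketch of one way such layer-by-layer growth could look. The architecture, the residual blocks, the near-zero initialization of new layers (so that inserting a layer barely perturbs the current function), and the growth schedule are all illustrative assumptions, not the authors' exact method.

import torch
import torch.nn as nn

class GrowingNet(nn.Module):
    """A network that starts shallow and gains layers during training."""
    def __init__(self, width=64, num_classes=10):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(3, width, 3, padding=1), nn.ReLU())
        self.blocks = nn.ModuleList()  # grows over the course of training
        self.head = nn.Linear(width, num_classes)
        self.width = width

    def add_block(self):
        # New residual block whose last conv is zero-initialized, so adding
        # it leaves the network's output (nearly) unchanged: "smooth" growth.
        block = nn.Sequential(
            nn.Conv2d(self.width, self.width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(self.width, self.width, 3, padding=1))
        nn.init.zeros_(block[-1].weight)
        nn.init.zeros_(block[-1].bias)
        self.blocks.append(block)

    def forward(self, x):
        x = self.stem(x)
        for block in self.blocks:
            x = x + block(x)        # residual form keeps growth smooth
        x = x.mean(dim=(2, 3))      # global average pooling
        return self.head(x)

model = GrowingNet()
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Dummy CIFAR-10-shaped batch; in practice, iterate over the real dataset.
x = torch.randn(32, 3, 32, 32)
y = torch.randint(0, 10, (32,))

for step in range(300):
    if step > 0 and step % 100 == 0:   # assumed growth schedule
        model.add_block()
        # Rebuild the optimizer so the new block's parameters are trained.
        opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

The residual form plus zero initialization means each newly added block starts as (approximately) an identity map, so the loss surface the optimizer sees changes gradually rather than abruptly; this is one plausible reading of "smoothly adding layers," not a confirmed detail of the paper.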
