Structured second-order methods via natural gradient descent

07/22/2021
by Wu Lin et al.

In this paper, we propose new structured second-order and structured adaptive-gradient methods obtained by performing natural-gradient descent on structured parameter spaces. Natural-gradient descent is an attractive approach for designing new algorithms in many settings, such as gradient-free, adaptive-gradient, and second-order methods. Our structured methods not only enjoy structural invariance but also admit a simple expression. Finally, we test the efficiency of the proposed methods on both deterministic non-convex problems and deep learning problems.
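To make the core idea concrete, below is a minimal sketch of plain (unstructured) natural-gradient descent, not the structured methods proposed in the paper: fitting the mean of a Gaussian with known covariance. For this model, the Fisher information with respect to the mean is the inverse covariance, so the natural gradient is the Euclidean gradient preconditioned by the covariance. The toy problem and all names are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of natural-gradient descent (NGD); illustrative only,
# NOT the structured method of the paper. For N(mu, Sigma) with Sigma
# fixed, the Fisher information w.r.t. mu is Sigma^{-1}, so the natural
# gradient is Sigma @ grad (inverse Fisher times Euclidean gradient).

rng = np.random.default_rng(0)

# Synthetic Gaussian data with a known covariance
Sigma = np.array([[2.0, 0.8], [0.8, 1.0]])
mu_true = np.array([3.0, -1.0])
X = rng.multivariate_normal(mu_true, Sigma, size=500)

Sigma_inv = np.linalg.inv(Sigma)
mu = np.zeros(2)   # initial estimate of the mean
lr = 0.1

for step in range(50):
    # Euclidean gradient of the average negative log-likelihood w.r.t. mu
    grad = Sigma_inv @ (mu - X.mean(axis=0))
    # Precondition by the inverse Fisher (= Sigma here): natural gradient
    nat_grad = Sigma @ grad
    mu -= lr * nat_grad

print("estimated mean:", mu)  # converges to the sample mean of X
```

The structured methods in the paper go further by imposing structure (e.g., low-rank, as in the companion work on local parameterizations listed below) on the preconditioner, so that the natural-gradient update remains tractable in high dimensions.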


Related research

10/18/2018
First-order and second-order variants of the gradient descent: a unified framework
In this paper, we provide an overview of first-order and second-order va...

02/15/2021
Tractable structured natural gradient descent using local parameterizations
Natural-gradient descent on structured parameter spaces (e.g., low-rank ...

01/16/2013
Revisiting Natural Gradient for Deep Networks
We evaluate natural gradient, an algorithm originally proposed in Amari ...

05/08/2023
ASDL: A Unified Interface for Gradient Preconditioning in PyTorch
Gradient preconditioning is a key technique to integrate the second-orde...

05/08/2012
The Natural Gradient by Analogy to Signal Whitening, and Recipes and Tricks for its Use
The natural gradient allows for more efficient gradient descent by remov...

09/07/2012
Learning Model-Based Sparsity via Projected Gradient Descent
Several convex formulation methods have been proposed previously for sta...

05/24/2021
2nd-order Updates with 1st-order Complexity
It has long been a goal to efficiently compute and use second order info...
