Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods

03/14/2017
by Tianxiao Sun, et al.

We study the smooth structure of convex functions by generalizing the powerful concept of self-concordance, introduced by Nesterov and Nemirovskii in the early 1990s, to a broader class of convex functions, which we call generalized self-concordant functions. This notion allows us to develop a unified framework for designing Newton-type methods to solve convex optimization problems. The proposed theory provides a mathematical tool for analyzing both the local and global convergence of Newton-type methods without imposing unverifiable assumptions, as long as the underlying functionals fall into our class of generalized self-concordant functions. First, we introduce the class of generalized self-concordant functions, which covers standard self-concordant functions as a special case. Next, we establish several properties and key estimates of this function class, which can be used to design numerical methods. Then, we apply this theory to develop several Newton-type methods for solving a class of smooth convex optimization problems involving generalized self-concordant functions. We provide an explicit step-size for the damped-step Newton-type scheme that guarantees global convergence without performing any globalization strategy. We also prove local quadratic convergence of this method and of its full-step variant without requiring Lipschitz continuity of the objective Hessian. We then extend our results to develop proximal Newton-type methods for a class of composite convex minimization problems involving generalized self-concordant functions, again achieving both global and local convergence without additional assumptions. Finally, we verify our theoretical results on several numerical examples and compare them with existing methods.
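To make the damped-step idea concrete, the sketch below applies a damped Newton iteration with the explicit step-size 1/(1 + λ_k), where λ_k is the Newton decrement — the classical choice for standard self-concordant functions — to a logistic-regression loss, a textbook example of a generalized self-concordant function. This is an illustrative sketch, not the paper's exact scheme or step-size formula; all function and variable names here are my own.

```python
import numpy as np

def logistic_loss(x, A, y):
    """f(x) = sum_i log(1 + exp(-y_i * a_i^T x)); generalized self-concordant."""
    return np.sum(np.log1p(np.exp(-y * (A @ x))))

def damped_newton(A, y, x0, tol=1e-8, max_iter=100):
    """Damped Newton with explicit step 1/(1 + lambda_k).

    lambda_k is the Newton decrement sqrt(grad^T H^{-1} grad). For standard
    self-concordant functions this step guarantees monotone decrease with no
    line search; the paper derives analogous explicit steps for the
    generalized class.
    """
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        z = y * (A @ x)
        s = 1.0 / (1.0 + np.exp(z))          # sigma(-z_i)
        grad = -A.T @ (y * s)                # gradient of the logistic loss
        W = s * (1.0 - s)                    # per-sample curvature weights
        H = A.T @ (A * W[:, None])           # Hessian
        d = np.linalg.solve(H, grad)         # Newton direction
        lam = np.sqrt(grad @ d)              # Newton decrement lambda_k
        if lam < tol:
            break                            # local norm of the step is tiny
        x -= d / (1.0 + lam)                 # damped step, no globalization
    return x
```

Near the solution λ_k shrinks, so the step-size 1/(1 + λ_k) approaches 1 and the iteration transitions automatically into the full-step (quadratically convergent) regime — the behavior the abstract describes.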


Related research

08/28/2023 · Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method
We study the composite convex optimization problems with a Quasi-Self-Co...

07/01/2016 · Randomized block proximal damped Newton method for composite self-concordant minimization
In this paper we consider the composite self-concordant (CSC) minimizati...

09/04/2023 · Self-concordant Smoothing for Convex Composite Optimization
We introduce the notion of self-concordant smoothing for minimizing the ...

03/26/2021 · Second order semi-smooth Proximal Newton methods in Hilbert spaces
We develop a globalized Proximal Newton method for composite and possibl...

02/04/2015 · Composite convex minimization involving self-concordant-like cost functions
The self-concordant-like property of a smooth convex function is a new a...

08/11/2022 · Super-Universal Regularized Newton Method
We analyze the performance of a variant of Newton method with quadratic ...

05/13/2014 · Scalable sparse covariance estimation via self-concordance
We consider the class of convex minimization problems, composed of a sel...
