An incremental descent method for multi-objective optimization

by   I. F. D. Oliveira, et al.

Current state-of-the-art multi-objective optimization solvers, by computing gradients of all m objective functions per iteration, produce after k iterations a measure of proximity to critical conditions that is upper-bounded by O(1/√(k)) when the objective functions have L-Lipschitz continuous gradients; i.e., they require O(m/ϵ^2) gradient and function evaluations to drive this measure of proximity to critical conditions below a target ϵ. We reduce this to O(1/ϵ^2) with a method that requires only a constant number of gradient and function evaluations per iteration, and we thus obtain, for the first time, a multi-objective descent-type method with a query complexity that is unaffected by increasing values of m. To this end, a new multi-objective descent direction is identified, which we name the central descent direction, and an incremental approach is proposed. Robustness properties of the central descent direction are established, measures of proximity to critical conditions are derived, and the incremental strategy for finding solutions to the multi-objective problem is shown to attain convergence properties unattained by previous methods. To the best of our knowledge, this is the first method to achieve this without additional a-priori information on the structure of the problem, such as that used by scalarization techniques, and without pre-known information on the regularity of the objective functions other than Lipschitz continuity of the gradients.
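The per-iteration saving described above can be illustrated with a minimal sketch of the generic incremental idea: instead of querying all m gradients at every step, query a single objective's gradient per iteration, so the per-iteration cost is independent of m. The cyclic update with a diminishing step size used below is a standard incremental-gradient placeholder, not the paper's central descent direction; the function name and toy objectives are hypothetical.

```python
import numpy as np

def incremental_descent(grads, x0, iters=1000, step=0.1):
    """Illustrative incremental loop: one gradient query per iteration.

    grads : list of m callables, grads[i](x) returning the gradient of f_i at x
    x0    : starting point
    """
    x = np.asarray(x0, dtype=float)
    m = len(grads)
    for k in range(iters):
        g = grads[k % m](x)                    # a single gradient query, cost independent of m
        x = x - (step / np.sqrt(k + 1)) * g    # diminishing step size
    return x

# Toy bi-objective example: f1(x) = ||x - a||^2 / 2 and f2(x) = ||x - b||^2 / 2,
# whose Pareto set is the segment between a and b.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x_star = incremental_descent([lambda x: x - a, lambda x: x - b], x0=[5.0, 5.0])
```

On this toy problem the iterates drift toward the Pareto segment between a and b; the point of the sketch is only the query pattern, not the convergence guarantees of the paper's method.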




