Stochastic mirror descent method for linear ill-posed problems in Banach spaces

07/14/2022
by Qinian Jin, et al.

Consider linear ill-posed problems governed by the system A_i x = y_i for i = 1, ⋯, p, where each A_i is a bounded linear operator from a Banach space X to a Hilbert space Y_i. When p is huge, solving the problem by an iterative regularization method that uses the whole system at each iteration step can be very expensive, due to the large memory requirement and the excessive computational load per iteration. To solve such large-scale ill-posed systems efficiently, we develop in this paper a stochastic mirror descent method which uses only a small portion of the equations, randomly selected at each iteration step, and incorporates convex regularization terms into the algorithm design. Consequently, our method scales very well with the problem size and is able to capture features of the sought solutions. The convergence of the method depends crucially on the choice of step-sizes. We consider various rules for choosing the step-sizes and obtain convergence results under a priori early stopping rules. In particular, by incorporating the spirit of the discrepancy principle, we propose a rule for choosing the step-sizes that can efficiently suppress oscillations in the iterates and reduce the effect of semi-convergence. Furthermore, we establish an order-optimal convergence rate when the sought solution satisfies a benchmark source condition. Various numerical simulations are reported to test the performance of the method.
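
To make the iteration concrete, here is a minimal NumPy sketch of a stochastic mirror descent loop of the kind described above. It assumes an elastic-net type penalty R(x) = ||x||_1 + (1/(2β))||x||_2^2 (so the mirror step reduces to soft-thresholding) and a simple a priori constant step-size; the function name smd_sketch, the penalty choice, the step-size default, and the synthetic example are illustrative assumptions, and neither the paper's discrepancy-principle based step-size rule nor its early stopping criteria are reproduced here.

```python
import numpy as np


def smd_sketch(A_blocks, y_blocks, beta=1.0, step=None, n_iter=2000, seed=0):
    """Minimal sketch of a stochastic mirror descent loop for A_i x = y_i, i = 1, ..., p.

    Illustrative penalty (an assumption, not necessarily the paper's choice):
        R(x) = ||x||_1 + (1 / (2 * beta)) * ||x||_2^2,
    whose conjugate gradient is the soft-thresholding map
        grad R^*(xi) = beta * sign(xi) * max(|xi| - 1, 0).
    """
    rng = np.random.default_rng(seed)
    p = len(A_blocks)
    dim = A_blocks[0].shape[1]
    xi = np.zeros(dim)  # dual (mirror) variable xi_n
    x = np.zeros(dim)   # primal iterate x_n = grad R^*(xi_n)
    for _ in range(n_iter):
        i = rng.integers(p)                       # pick one equation block at random
        residual = A_blocks[i] @ x - y_blocks[i]  # A_i x_n - y_i
        g = A_blocks[i].T @ residual              # stochastic gradient A_i^*(A_i x_n - y_i)
        # a priori constant step-size; the paper's discrepancy-based rule is NOT implemented
        t = step if step is not None else 1.0 / (beta * np.linalg.norm(A_blocks[i], 2) ** 2)
        xi = xi - t * g                           # dual update
        x = beta * np.sign(xi) * np.maximum(np.abs(xi) - 1.0, 0.0)  # mirror step
    return x


# Tiny synthetic usage example (all sizes are hypothetical)
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = [rng.standard_normal((20, 100)) for _ in range(10)]  # p = 10 blocks
    x_true = np.zeros(100)
    x_true[:5] = 1.0                                         # sparse ground truth
    y = [Ai @ x_true for Ai in A]
    x_rec = smd_sketch(A, y, beta=5.0, n_iter=5000)
    print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```

For the quadratic mirror map R(x) = ½||x||_2^2 the mirror step is simply the identity, and the sketch reduces to a randomized Landweber/Kaczmarz-type iteration; the soft-thresholding variant above is one way to promote sparsity in the reconstruction.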

Related research

09/13/2022 - Dual gradient flow for solving linear ill-posed problems in Banach spaces
We consider determining the -minimizing solution of ill-posed problem A ...

06/15/2022 - Convergence rates of a dual gradient method for constrained linear ill-posed problems
In this paper we consider a dual gradient method for solving linear ill-...

01/20/2021 - Optimal-order convergence of Nesterov acceleration for linear ill-posed problems
We show that Nesterov acceleration is an optimal-order iterative regular...

11/26/2022 - Dual gradient method for ill-posed problems using multiple repeated measurement data
We consider determining -minimizing solutions of linear ill-posed proble...

06/06/2023 - A rational conjugate gradient method for linear ill-conditioned problems
We consider linear ill-conditioned operator equations in a Hilbert space...

11/11/2020 - Range-relaxed criteria for choosing the Lagrange multipliers in the Levenberg-Marquardt method
In this article we propose a novel strategy for choosing the Lagrange mu...

10/30/2021 - Convergence and Semi-convergence of a class of constrained block iterative methods
In this paper, we analyze the convergence of projected non-stationary bloc...
