Computational & Applied Math Seminar

Relating lp regularization and reweighted l1 regularization

  • Speaker: Hao Wang (ShanghaiTech University)

  • Time: Aug 24, 2022, 11:00-12:00

  • Location: Tencent Meeting ID 519-500-493

Abstract

The iteratively reweighted l1 algorithm is a widely used method for solving various regularization problems, which generally minimize a differentiable loss function combined with a convex or nonconvex regularizer to induce sparsity in the solution. However, the convergence and complexity of iteratively reweighted l1 algorithms are generally difficult to analyze, especially for non-Lipschitz differentiable regularizers such as lp norm regularization with 0 < p < 1. In this paper, we propose, analyze and test a reweighted l1 algorithm combined with an extrapolation technique, under the assumption that the proximal function of the perturbed objective satisfies the Kurdyka-Łojasiewicz (KL) property. Our method does not require Lipschitz differentiability of the regularizers, nor does it require the smoothing parameters in the weights to be bounded away from 0. We show that the iterates of the proposed algorithm converge to a single stationary point of the regularization problem, with local linear convergence when the KL exponent is at most 1/2 and local sublinear convergence when the KL exponent is greater than 1/2. We also provide results on computing the KL exponent and discuss the cases in which the KL exponent is at most 1/2. Numerical experiments show the efficiency of the proposed method.
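For readers unfamiliar with the method, the sketch below illustrates the general idea of an iteratively reweighted l1 step with extrapolation for an lp-regularized least-squares problem. It is not the speaker's algorithm; all names and parameter choices (A, b, lam, p, eps, beta) are illustrative assumptions, and the smoothing parameter eps is simply decreased geometrically rather than by the rule analyzed in the paper.

import numpy as np

def soft_threshold(z, t):
    """Componentwise soft-thresholding: the prox of a weighted l1 term."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def irl1_extrapolated(A, b, lam=0.1, p=0.5, eps=1e-3, beta=0.5, iters=200):
    """Sketch of iteratively reweighted l1 with extrapolation for
       min_x 0.5*||Ax - b||^2 + lam * sum_i |x_i|^p, 0 < p < 1."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the loss gradient
    x = np.zeros(n)
    x_prev = x.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)            # extrapolation step
        grad = A.T @ (A @ y - b)               # gradient of the smooth loss at y
        w = lam * p * (np.abs(x) + eps) ** (p - 1)   # reweighted l1 weights
        x_prev = x
        x = soft_threshold(y - grad / L, w / L)      # proximal gradient step on the weighted l1 model
        eps *= 0.9                             # smoothing parameter allowed to approach 0
    return x

The point of the sketch is that each iteration only solves a weighted l1 (soft-thresholding) subproblem, while the weights and the smoothing parameter are updated between iterations; the talk concerns the convergence analysis of such schemes when eps is not bounded away from 0.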
