Recent Events

Joint Seminars

On the global linear convergence of the ADM

Prof. Yin Wotao, Rice University
Thu, 2012-06-28 14:00 - 15:00
520 Pao Yue-Kong Library

The formulation min f(x) + g(y), subject to Ax + By = b, arises in many application areas such as PDE computation, signal processing, imaging and image processing, statistics, and machine learning, either naturally or after variable splitting. In many common problems, one of the two objective functions is strongly convex and has a Lipschitz continuous gradient. On this kind of problem, a very effective approach is the alternating direction method of multipliers (ADM), which solves a sequence of f/g-decoupled subproblems.

However, its effectiveness has not been matched by a provably fast rate of convergence; only sublinear rates such as O(1/k) and O(1/k^2) were recently established in the literature. This talk shows that global linear convergence can be guaranteed under the above strong convexity and Lipschitz gradient conditions, along with certain rank assumptions on A and B. In addition, global linear convergence is also established for inexact ADMs in which the subproblems are solved faster and less exactly in certain manners. As an application of these results, the very common regularization model min f(By-b) + g(y), with a strongly convex f having a Lipschitz continuous gradient and an arbitrary convex function g, can be solved by a modified ADM at a globally linear rate.
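To make the f/g-decoupled iteration concrete, here is a minimal sketch (not from the talk) of ADM on a toy instance of the template above: f(x) = ½||x - c||² (strongly convex with Lipschitz gradient), g(y) = λ||y||₁, with A = I, B = -I, b = 0 so the constraint is x = y. The function name, parameters, and the choice of penalty ρ are illustrative assumptions; both subproblems happen to have closed-form solutions here.

```python
import numpy as np

def admm_lasso_split(c, lam=0.1, rho=1.0, iters=500):
    """Sketch of ADM for: min_x,y 1/2||x-c||^2 + lam*||y||_1  s.t. x - y = 0.

    f(x) = 1/2||x-c||^2 is strongly convex with Lipschitz gradient;
    g(y) = lam*||y||_1 is convex. Uses the scaled-dual form of ADM.
    """
    n = c.size
    x = np.zeros(n)
    y = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    for _ in range(iters):
        # x-subproblem: argmin_x 1/2||x-c||^2 + rho/2 ||x - y + u||^2
        x = (c + rho * (y - u)) / (1.0 + rho)
        # y-subproblem: prox of the l1 norm, i.e. soft-thresholding
        v = x + u
        y = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual ascent step on the constraint residual x - y
        u = u + (x - y)
    return x, y
```

On this instance the minimizer is the soft-thresholding of c at level λ, so e.g. `admm_lasso_split(np.array([2.0, -0.05, 1.0]))` drives both x and y toward [1.9, 0, 0.9], and the iterates exhibit the linear convergence the talk analyzes in generality.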