A new analysis of two proximal forward-backward algorithms with and without errors
Many problems in science and engineering require, as part of their solution process, minimizing a composite function F = f + g, where f is smooth, g is possibly nonsmooth, and both are convex. The talk will discuss, in generalized settings, two proximal forward-backward algorithms for solving this problem. The first is FISTA, the popular accelerated method of Beck and Teboulle. We consider it in Hilbert spaces and allow error terms satisfying a certain decay rate. The notion of inexactness we discuss seems simpler than those considered in related works, yet, interestingly, very similar decay rates of the error terms yield very similar non-asymptotic convergence rates (in function values). Our derivation also sheds some light on the somewhat mysterious origin of some of the relevant parameters. In the second method, which is non-accelerated, the setting is closed and convex subsets of reflexive Banach spaces, and the proximal operation is based on a (strongly convex) Bregman divergence. Here, in contrast to previous works, the gradient of f need not be globally Lipschitz continuous. Under certain assumptions a non-asymptotic rate of convergence is established, as well as weak convergence of the whole sequence.
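As a point of reference for the first method, the standard (error-free, finite-dimensional) FISTA iteration for F = f + g can be sketched as follows; the lasso-type instance below (f(x) = 0.5·||Ax − b||², g(x) = λ·||x||₁) and all problem data are illustrative assumptions, not taken from the talk, which treats a more general inexact Hilbert-space setting.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, iters=200):
    # Classical FISTA (Beck-Teboulle) for 0.5*||Ax - b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)             # gradient of the smooth part f
        x_new = soft_threshold(y - grad / L, lam / L)  # forward-backward step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

# Illustrative random instance (assumed data, for demonstration only)
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.5
x = fista(A, b, lam)
F = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
```

The inexact variant discussed in the talk would allow the gradient and/or the proximal step above to be computed with errors whose magnitudes decay at a prescribed rate.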
This is joint work with Alvaro De Pierro.
Last updated: 09/03/2015