
An Investigation on Semismooth Newton based Augmented Lagrangian Method for Image Restoration

Journal of Scientific Computing

Abstract

The augmented Lagrangian method (also known as the method of multipliers) is an important and powerful optimization method for many smooth and nonsmooth variational problems in modern signal processing, imaging, and optimal control. However, each iteration usually requires solving a coupled nonlinear system of equations, which is very challenging. In this paper, we propose several semismooth Newton methods to solve the nonlinear subproblems arising in image restoration in finite-dimensional spaces, which leads to several highly efficient and competitive algorithms for image processing. By analyzing the metric subregularity of the corresponding functions, we establish both global convergence and a local linear convergence rate for the proposed augmented Lagrangian methods with semismooth Newton solvers.
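
For orientation, the following is a generic sketch of this class of schemes for a composite problem with regularizer $g$ and linear operator $A$; the concrete splitting, operators, and function $F$ used in the paper may differ. For
\[
\min_{x}\; f(x) + g(Ax),
\]
the augmented Lagrangian with multiplier $\lambda$ and penalty parameter $\sigma>0$ reads
\[
\mathcal{L}_\sigma(x, y; \lambda) \;=\; f(x) + g(y) + \langle \lambda,\, Ax - y\rangle + \tfrac{\sigma}{2}\,\|Ax - y\|^2,
\]
and the method of multipliers iterates
\[
(x^{k+1}, y^{k+1}) \;\approx\; \operatorname*{arg\,min}_{x,\,y}\; \mathcal{L}_\sigma(x, y; \lambda^k), \qquad
\lambda^{k+1} \;=\; \lambda^k + \sigma\,\bigl(Ax^{k+1} - y^{k+1}\bigr).
\]
Eliminating $y$ via the proximal mapping of $g$ reduces the inner minimization to a nonsmooth equation $F(x)=0$, which a semismooth Newton method solves by selecting $V_k$ from the Clarke generalized Jacobian $\partial F(x^k)$ and updating
\[
V_k\, d^k = -F(x^k), \qquad x^{k+1} = x^k + d^k.
\]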


Data Availability

Enquiries about data availability should be directed to the authors.


Acknowledgements

The author acknowledges the constructive comments from the anonymous referee, which greatly improved the paper. The author also acknowledges the support of Beijing Natural Science Foundation No. Z210001 and NSF of China under Grant No. 11701563. The work originated during the author's visit to Prof. Defeng Sun of the Hong Kong Polytechnic University in October 2018. The author is very grateful to Prof. Defeng Sun for introducing the framework on semismooth Newton based ALM developed by him and his collaborators and for his suggestions on the self-adjointness of the corresponding operators for the Newton updates, where CG can be employed. The author is also very grateful to Prof. Kim-Chuan Toh, Dr. Chao Ding, Dr. Xudong Li and Dr. Xinyuan Zhao for the discussions on the semismooth Newton based ALM. The author is also very grateful to Prof. Michael Hintermüller for the discussions on the primal-dual semismooth Newton method during the author's visit to the Weierstrass Institute for Applied Analysis and Stochastics (WIAS), supported by the Alexander von Humboldt Foundation, in 2017.

Funding

The authors have not disclosed any funding.

Author information

Corresponding author

Correspondence to Hongpeng Sun.

Ethics declarations

Competing interests

The authors have not disclosed any competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Sun, H. An Investigation on Semismooth Newton based Augmented Lagrangian Method for Image Restoration. J Sci Comput 92, 82 (2022). https://doi.org/10.1007/s10915-022-01907-7

