Topics in Convex Optimisation (Lent 2022)
Lecturer: Hamza Fawzi
Mon-Wed 10AM - MR14
The following lecture notes are rough drafts and have not been proofread, so they may contain mistakes and typos.
- Lecture 1: Introduction
- Lecture 2: Review of convexity
- Lecture 3: Gradient method
- Lecture 4: Fast gradient method
- Lecture 5: Subgradients
- Lecture 6: Subgradient method
- Lecture 7: Constrained optimization and duality
- Lecture 8: Duality (continued) and KKT conditions
- Lecture 9: Projections and projected (sub)gradient methods
- Lecture 10: Proximal methods
- Lecture 11: Bregman proximal methods
- Lecture 12: Dual methods
- Lecture 13: Augmented Lagrangian, ADMM
- Lecture 14: Douglas-Rachford
- Lecture 15: Newton's method
- Lecture 16: Newton's method (continued)
Exercise sheets
References
- S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press.
- Yurii Nesterov, Introductory Lectures on Convex Optimization, Springer.
- Lieven Vandenberghe, Lecture slides for EE236C at UCLA.
- S. Bubeck, Convex Optimization: Algorithms and Complexity.
- G. Gordon and R. Tibshirani, Lecture notes for an optimization course at CMU.
- N. Parikh and S. Boyd, Proximal algorithms. [A monograph about proximal operators and algorithms]
- S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein, Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers. [A monograph about ADMM and its applications in distributed optimization and statistical learning]
- J. Renegar, A Mathematical View of Interior Point Methods for Convex Optimization. [A monograph about path-following interior-point methods using self-concordant functions]
- R.T. Rockafellar, Convex Analysis, Princeton Mathematical Series.