Tue-Thu 9AM - MR9

The following lecture notes are rough and have not been proofread; they may contain mistakes, typos, etc.

- Lecture 1: Introduction
- Lecture 2: Review of convexity
- Lecture 3: Smoothness and strong convexity
- Lecture 4: Gradient method
- Lecture 5: Fast gradient method
- Lecture 6: Lower complexity bounds
- Lecture 7: Subgradients
- Lecture 8: Subgradient method
- Lecture 9: Proximal mapping
- Lecture 10: Proximal gradient methods
- Lecture 11: Bregman gradient methods / mirror descent
- Lecture 12: Conjugate functions / Lagrange duality
- Lecture 13: Dual methods
- Lecture 14: ADMM / Douglas-Rachford
- Lecture 15: Newton's method
- Lecture 16: Newton's method (continued)

- Exercise sheet 1 - Example class Thu Feb 9th, 3.30-5pm, MR11
- Exercise sheet 2 - Example class Thu Feb 23rd, 3.30-5pm, MR11
- Exercise sheet 3 - Example class Thu Mar 9th, 3.30-5pm, MR11

- S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press.
- Y. Nesterov, Introductory Lectures on Convex Optimization, Springer.
- L. Vandenberghe, Lecture slides for EE236C at UCLA.
- S. Bubeck, Convex Optimization: Algorithms and Complexity.
- G. Gordon and R. Tibshirani, Lecture notes for an optimization course at CMU.
- N. Parikh and S. Boyd, Proximal Algorithms. [A monograph on proximal operators and algorithms]
- S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein, Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers. [A monograph on ADMM and its applications in distributed optimization and statistical learning]
- J. Renegar, A Mathematical View of Interior-Point Methods in Convex Optimization. [A monograph on path-following interior-point methods using self-concordant functions]
- R.T. Rockafellar, Convex Analysis, Princeton University Press.