Tue-Thu 10AM - MR13

Revision session will be on Tuesday May 12th, 3:30-5:30pm in MR9.

The following lecture notes are rough and have not been proofread, so they may contain mistakes and typos.

- Lecture 1: Introduction
- Lecture 2: Review of convex functions
- Lecture 3: Gradient method
- Lecture 4: Lower complexity bounds
- Lecture 5: Fast gradient method
- Lecture 6: Proximal gradient method
- Lecture 7: Subgradient method
- Lecture 8: Conjugate functions
- Lecture 9: Smoothing
- Lecture 10: Lagrangian duality
- Lecture 11: Dual methods
- Lecture 12: Mirror descent
- Lecture 13: Newton's method
- Lecture 14: Self-concordant functions
- Lecture 15: Path-following methods
- Lecture 16: Linear/second-order cone/semidefinite programming

Note on the subdifferential of the sum of two convex functions.
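For reference, the result the note concerns is presumably the Moreau–Rockafellar sum rule: one inclusion always holds, while equality requires a constraint qualification. A standard statement is:

```latex
% Subdifferential sum rule (Moreau--Rockafellar).
% For convex f, g : R^n -> R \cup {+infinity}, the inclusion
\[
\partial f(x) + \partial g(x) \subseteq \partial (f+g)(x)
\]
% always holds, and equality
\[
\partial (f+g)(x) = \partial f(x) + \partial g(x)
\]
% holds under a constraint qualification, for example
\[
\operatorname{ri}(\operatorname{dom} f) \cap \operatorname{ri}(\operatorname{dom} g) \neq \emptyset .
\]
```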

- Exercise sheet 1. To be discussed at the first example class, Tuesday 5th November 3:30pm, MR9. Solutions
- Exercise sheet 2. To be discussed at the second example class, Tuesday 19th November 3:30pm, MR9. Solutions
- Exercise sheet 3. To be discussed at the third example class, Tuesday 3rd December 3:30pm, MR9. Solutions

- S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press.
- Y. Nesterov, Introductory Lectures on Convex Optimization, Springer.
- L. Vandenberghe, lecture slides for EE236C at UCLA.
- S. Bubeck, Convex Optimization: Algorithms and Complexity.
- G. Gordon and R. Tibshirani, lecture notes for an optimization course at CMU.
- N. Parikh and S. Boyd, Proximal Algorithms. [A monograph on proximal operators and algorithms]
- J. Renegar, A Mathematical View of Interior-Point Methods in Convex Optimization. [A monograph on path-following interior-point methods using self-concordant functions]