Instructor: Akshay Ramachandran
Term: Winter II, 2025
We will cover the fundamental algorithms used for efficient convex optimization, with a focus on rigorous convergence analysis. By the end, you should understand why convex optimization algorithms work and be able to apply these tools effectively in your own research. This course is distinct from, and complementary to, CPSC 536M taught by Michael Friedlander in Term 1.
Lectures: MW 12:30–2:00pm in DMP 201
| Date | Topic | Notes |
|---|---|---|
| Week 1 | Introduction | — |
| Week 2-3 | Convex sets and functions | Notes (prelim) |
| Week 3 | Cutting Plane Methods | Notes (prelim) |
| Week 4 | Convex Programming Duality and John's Ellipsoid | Notes (prelim) |
| Week 5-6 | Gradient Descent | Notes (prelim) |
| February 9 | Student Lecture: Kevin K Thomas - Stochastic Gradient Descent | Notes (Scribe: Arqam Patel) |
| Week 7 | Reading Week | — |
| February 23 | Guest Lecture: Chen Greif - Conjugate Gradient | Notes (Scribe: Kevin K Thomas) |
| Week 8-9 | Mirror Descent | Notes (prelim) |
| March 4 | Student Lecture: Inzaghi Moniaga, Yin Huang - Multiplicative Weights Method | Notes (Scribe: Tong Ling) |
| March 11 | Student Lecture: Trevor Tidy - Accelerated Gradient Descent | Notes (Scribe: Hasti Karimi) |
| March 18 | Student Lecture: Ying Qi Wen - Spectral Descent | Notes (Scribe: Yin Huang, Inzaghi Moniaga) |
| Week 10-12 | Interior Point Methods | Notes (very rough) |
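To give a flavour of the algorithms analyzed in the course, here is a minimal sketch of fixed-step gradient descent on a one-dimensional convex quadratic. The objective, step size, and iteration count are illustrative choices, not material from the lecture notes.

```python
def gradient_descent(grad, x0, step, iters):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Illustrative objective: f(x) = (x - 3)^2, with gradient f'(x) = 2(x - 3).
# f is L-smooth with L = 2, so the classical step size 1/L = 0.5 applies;
# for this quadratic that step reaches the minimizer x* = 3 exactly.
x_star = gradient_descent(lambda x: 2 * (x - 3), x0=0.0, step=0.5, iters=100)
print(x_star)  # → 3.0
```

The convergence analysis covered in the course makes the step-size choice precise: for an L-smooth convex function, step size 1/L guarantees the function values decrease monotonically.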