In this course, we cover several frameworks for convex optimization, including first-order methods, cutting plane methods, interior point methods, and sampling methods. Besides covering basic algorithms within these frameworks, we explain the geometric picture behind many of them.

### Administrative Information:

- Instructor: Yin Tat Lee
- Office Hours: By appointment, email me at yintat at uw dot edu.
- Lectures: Wednesday and Friday, 3:00 – 4:20, at MGH 228
- Course evaluation: 2–3 homework assignments (50%), final project (50%)

### Assignments:

- Assignments will be submitted via Canvas.

### Related Courses:

- Lap Chi Lau’s Course
- Aleksander Mądry’s Course
- Santosh Vempala’s Course
- (Please tell me if you know other related courses.)

### Tentative Schedule:

(The current schedule is likely too aggressive.)

- Jan 03: Introduction

#### First Order Methods

- Jan 05: Gradient Descent & Mirror Descent
- Jan 10: Accelerated Gradient Descent
- Jan 12: Ellipsoid Method & Geometric Descent
- Jan 17: Preconditioning & Frank–Wolfe Algorithm
- Jan 19: Maximum Flow Problem

#### Algorithms for Linear Systems

- Jan 24: Leverage Score & Uniform Sampling Method
- Jan 26: Sparse Cholesky Decomposition

#### Cutting Plane Methods

- Jan 31: Reductions Between Convex Problems
- Feb 02: Polytope Intersection
- Feb 07: Grünbaum's Theorem & Center of Gravity Method
- Feb 09: John Ellipsoid & John Ellipsoid Method

#### Interior Point Methods

- Feb 14: Newton Method & Self-Concordant Barrier
- Feb 16: Log Barrier & Log Det Barrier
- Feb 21: Lee–Sidford Barrier

#### Convex Geometry

- Feb 23: Random Sampling in Convex Sets
- Feb 28: Approximating the Volume of a Convex Set
- Mar 02: Localization Lemma
- Mar 07: Stochastic Localization
- Mar 09: KLS Conjecture