CSE 535: Theory of Optimization and Continuous Algorithms
The design of algorithms is traditionally a discrete endeavor. However, many advances have come from a continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is derived from it by appropriate discretization. In many interesting and general settings, the current fastest methods are a consequence of this perspective. We will discuss several examples of algorithms for high-dimensional optimization and use them to understand the following concepts in detail (a small sketch of the discretization idea follows the list).
- Elimination
- Reduction
- Geometrization
- Sparsification
- Acceleration
- Decomposition
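To make the continuous-to-discrete idea concrete, here is a minimal sketch (an illustration, not part of the course materials): the gradient flow x'(t) = -∇f(x(t)) decreases f along its trajectory, and discretizing it with the forward Euler method recovers gradient descent, the first method on the schedule below. The objective f(x) = ||x||^2/2 and the step size are illustrative assumptions.

```python
import numpy as np

def grad_f(x):
    # Illustrative objective f(x) = ||x||^2 / 2, so grad f(x) = x.
    return x

def gradient_descent(x0, h=0.1, steps=100):
    # Forward Euler discretization of the gradient flow x'(t) = -grad f(x(t)):
    # each step computes x_{k+1} = x_k - h * grad f(x_k).
    x = x0
    for _ in range(steps):
        x = x - h * grad_f(x)
    return x

x_star = gradient_descent(np.ones(5))
print(x_star)  # approaches the minimizer 0 as the number of steps grows
```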
This course is offered concurrently at Georgia Tech by Santosh Vempala.
Administrative Information:
- Lectures: Tue, Thu 11:30-12:50 at CSE2 G01
- Recitation: Mon 3:30-4:30 at CSE2 271
- Instructor: Yin Tat Lee (yintat@uw.edu)
- Instructor Office Hours: Tue 3:00-4:00 at CSE 562.
- I will also answer questions on Piazza
- TA: Swati Padmanabhan (pswati@cs.washington.edu)
- TA Office hours: Thu 1:30-2:30 at CSE 5th floor breakout
- Text Book (in progress): https://github.com/YinTat/optimizationbook/blob/main/main.pdf
- Course evaluation: 4 assignments (100%)
- Prerequisites: basic knowledge of algorithms, probability, and linear algebra.
Assignments
The assignments are posted on Piazza and submitted via Gradescope.
Class Policy
- You may discuss assignments with others, but you must write down the solutions by yourself.
- Assignments as well as regrade requests are to be submitted only via Gradescope.
- In general, late submissions are not accepted (the submission server closes at the deadline). Gradescope lets you overwrite previous submissions, so we recommend submitting early and updating as needed to avoid missing the deadline altogether.
- We follow the standard UW CSE policy for academic integrity.
- The recitation session is optional. In these sessions, Swati will review concepts from lecture and go over details (for example, complicated calculations) that may have been unclear or covered too quickly in lecture.
- The office hours are for general questions about material from the lecture and homework problems.
- Please refer to university policies regarding disability accommodations or religious accommodations.
Tentative Schedule:
Introduction
- Jan 07: Introduction (1.1, 1.2, 1.3, 1.4)
- Jan 09: Introduction (1.5, Appendix)
Basic Methods
- Jan 14: Gradient Descent (2.1)
- Jan 16: Cutting Plane Methods (2.3, 3.1, 3.2)
- Jan 21: Cutting Plane Methods (3.3, 3.4, 3.7)
Reduction
- Jan 23: Equivalence (4.1)
- Jan 28: Duality (4.2)
- Jan 30: Duality (4.4)
Geometrization
- Feb 04: Mirror Descent
- Feb 06: Frank-Wolfe
- Feb 11: Newton's Method & L-BFGS
- Feb 13: Interior Point Method
- Feb 18: Interior Point Method
Sparsification
- Feb 20: Subspace Embedding
- Feb 25: Leverage Score Sampling
- Feb 27: Coordinate Descent
- Mar 03: Stochastic Gradient Descent & Variance Reduction
Acceleration
- Mar 05: Conjugate Gradient & Chebyshev Expansion
- Mar 10: Accelerated Gradient Descent
Related Courses:
- Aaron Sidford, Introduction to Optimization Theory
- Lap Chi Lau, Convexity and Optimization
- Nisheeth Vishnoi, Algorithms for Convex Optimization
- Jonathan Kelner, Topics in Theoretical Computer Science: An Algorithmist’s Toolkit
- Santosh Vempala, Simple Algorithms
- Sheehan Olver, Numerical Complex Analysis