CSE 535: Theory of Optimization and Continuous Algorithms
The design of algorithms is traditionally a discrete endeavor. However, many advances have come from a continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is derived from it by appropriate discretization. In interesting and general settings, the current fastest methods are a consequence of this perspective. We will discuss several examples of algorithms for high-dimensional optimization and use them to understand the following concepts in detail.
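As a concrete illustration of this perspective (a sketch for intuition, not part of the course materials): gradient descent can be viewed as the forward-Euler discretization of the gradient flow dx/dt = -∇f(x). The objective, step size, and iteration count below are hypothetical choices for demonstration only.

```python
# Illustrative sketch: gradient descent as the forward-Euler
# discretization of the continuous gradient flow dx/dt = -grad f(x).
# The objective f(x) = (x - 3)^2 and step size are made-up examples.

def grad_f(x):
    # Gradient of f(x) = (x - 3)^2, which is minimized at x = 3.
    return 2.0 * (x - 3.0)

def gradient_descent(x0, step=0.1, iters=100):
    # Each update x_{k+1} = x_k - step * grad f(x_k) is one Euler
    # step of the continuous flow with time step `step`.
    x = x0
    for _ in range(iters):
        x = x - step * grad_f(x)
    return x

print(gradient_descent(0.0))  # converges toward the minimizer x = 3
```

Here the discretization is the simplest possible; much of the course concerns more careful discretizations (acceleration, mirror descent, interior point methods) that yield faster algorithms.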
- Lectures: Tue, Thu 10:00-11:20 on Zoom
- Textbook (in progress): https://www.dropbox.com/s/9ck2p07mibvwrmg/main.pdf.
- Please do not comment directly on the merged version; instead, comment on the split versions below, since Dropbox cannot handle large documents efficiently.
- Prerequisites: basic knowledge of algorithms, probability, and linear algebra.
- Instructor: Yin Tat Lee
- Office Hours: By appointment, email me at (email@example.com)
- TA: Swati Padmanabhan (firstname.lastname@example.org)
- TA Office Hours: 11:30 am - 12:20 pm Thu and 9:30 pm - 10:20 pm Thu, Zoom link on Edstem.
- Mailing List: Here
- Edstem: Here
- Course evaluation: 5 assignments (100%) or 1 project (100%)
- Assignments are posted and submitted via Gradescope.
- You may discuss assignments with others, but you must write down the solutions by yourself.
- Regrade requests are also to be submitted only via Gradescope.
- In general, late submissions are not accepted (the submission server closes at the deadline). Gradescope lets you overwrite previous submissions; use this feature to avoid missing the deadline altogether.
- We follow the standard UW CSE policy for academic integrity.
- The office hours are for general questions about material from the lecture and homework problems.
- Please refer to university policies regarding disability accommodations or religious accommodations.
- Mar 30: Convexity Sec 1.1-1.5 (Updated on 31 Mar)
- Apr 01: Convexity and Derivative Sec 4.1-4.4 Reading
- Apr 06: Gradient Descent Sec 2 (Added Sec 2.6 on 7 Apr)
- Apr 06: (HW 1 due)
- Apr 08: Gradient Descent (cont) and Proximal Methods (See Sec 2 above)
- Apr 13: Cutting Plane Methods Sec 3
- Apr 15: Cutting Plane Methods (cont)
- Apr 20: Mirror Descent (HW 2 due)
- Apr 22: Newton's Method & L-BFGS
- Apr 27: Interior Point Method
- Apr 29: Robust Interior Point Method
- May 04: Leverage Score Sampling (HW 3 due or project proposal due)
- May 06: Stochastic Gradient Descent & Variance Reduction
- May 11: Cholesky Decomposition
- May 13: Laplacian Solver
- May 18: Conjugate Gradient & Chebyshev Expansion (HW 4 due)
- May 20: Accelerated Gradient Descent
- May 25: Adagrad
- May 27: Bayesian methods
Project Presentations (Depending on Number of Projects)
- Jun 01: Project (HW 5 due)
- Jun 03: Project
- Project report due on Jun 08
- Aaron Sidford, Introduction to Optimization Theory
- Lap Chi Lau, Convexity and Optimization
- Nisheeth Vishnoi, Algorithms for Convex Optimization
- Jonathan Kelner, Topics in Theoretical Computer Science: An Algorithmist’s Toolkit
- Santosh Vempala, Simple Algorithms
- Sheehan Olver, Numerical Complex Analysis