CSE 535: Theory of Optimization and Continuous Algorithms
The design of algorithms is traditionally a discrete endeavor, but many advances have come from a continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is derived from it by an appropriate discretization. In interesting and general settings, the current fastest methods are a consequence of this perspective. We will discuss several examples of algorithms for high-dimensional optimization and use them to understand the following concepts in detail.
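As a toy illustration of this continuous-to-discrete pipeline (a sketch for intuition, not taken from the course text): gradient descent on a convex quadratic can be viewed as a forward-Euler discretization of the gradient flow dx/dt = -∇f(x). The matrix A, vector b, and step size below are arbitrary choices for the example.

```python
import numpy as np

# Continuous process: gradient flow dx/dt = -grad f(x)
# for f(x) = 0.5 * x^T A x - b^T x with A positive definite (so f is convex).
# Discretization: forward Euler with step size h gives gradient descent,
#   x_{k+1} = x_k - h * grad f(x_k).

A = np.array([[2.0, 0.0], [0.0, 4.0]])  # positive definite; eigenvalues 2 and 4
b = np.array([2.0, 4.0])                # minimizer is x* = A^{-1} b = (1, 1)

def grad_f(x):
    return A @ x - b

x = np.zeros(2)
h = 0.2  # below 2 / lambda_max(A) = 0.5, so the discrete iteration converges
for _ in range(200):
    x = x - h * grad_f(x)

print(x)  # close to the minimizer (1, 1)
```

The step-size restriction is exactly where the discretization matters: the continuous flow converges for any positive-definite A, while the discrete iteration requires h < 2/λ_max(A).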
- Lectures: Tue, Thu 10:00-11:20 on Zoom
- Textbook (in progress): https://github.com/YinTat/optimizationbook/blob/main/main.pdf.
- Please do not comment directly on the merged version; instead, comment on the split versions below. Dropbox cannot handle large documents efficiently.
- Prerequisites: basic knowledge of algorithms, probability, and linear algebra.
- Instructor: Yin Tat Lee
- Office Hours: By appointment, email me at (firstname.lastname@example.org)
- TA: Swati Padmanabhan (email@example.com)
- TA Office Hours: Thu 11:30 am-12:20 pm and Thu 9:30 pm-10:20 pm; Zoom link on Edstem.
- Mailing List: Here
- Edstem: Here
- Course evaluation: 5 assignments (100%) or 1 project (100%)
- Assignments are posted on Gradescope; assignments and regrade requests must be submitted only via Gradescope.
- You may discuss assignments with others, but you must write up the solutions yourself.
- In general, late submissions are not accepted (the submission server closes at the deadline). Gradescope lets you overwrite previous submissions, so it is recommended to use this feature to avoid missing the deadline altogether.
- We follow the standard UW CSE policy for academic integrity.
- The office hours are for general questions about material from the lecture and homework problems.
- Please refer to university policies regarding disability accommodations or religious accommodations.
- Mar 30: Convexity Sec 1.1-1.5 (Updated on 31 Mar)
- Apr 01: Convexity and Derivative Sec 4.1-4.4 Reading
- Apr 06: Gradient Descent Sec 2 (Added Sec 2.6 on 7 Apr)
- Apr 06: (HW 1 due)
- Apr 08: Gradient Descent (cont) and Proximal Methods (See Sec 2 above)
- Apr 13: Cutting Plane Methods Sec 3
- Apr 15: Cutting Plane Methods (cont)
- Apr 20: Mirror Descent (HW 2 due) Sec 5
- Apr 22: Mirror Descent (cont)
- Apr 27: Newton's Method Sec 5.3
- Apr 29: Interior Point Method Sec 5.4
- May 04: Interior Point Method + Cholesky Decomposition Sec 5.4 rewrite (HW 3 due or project proposal due)
Linear System Solving
- May 06: Leverage Score Sec 6.2
- May 11: Laplacian Solver Sec 8 in Tropp, Matrix Concentration & Computational Linear Algebra
- May 13: Stochastic Gradient Descent & Variance Reduction Sec 6.3, 6.4
- May 18: Accelerated Gradient Descent (HW 4 due) Sec 7, Note
Project Presentations (Depending on Number of Projects)
- Jun 01: Project
- Jun 03: Project (HW 5 due)
- Project report due on Jun 08
- Aaron Sidford, Introduction to Optimization Theory
- Lap Chi Lau, Convexity and Optimization
- Nisheeth Vishnoi, Algorithms for Convex Optimization
- Jonathan Kelner, Topics in Theoretical Computer Science: An Algorithmist’s Toolkit
- Santosh Vempala, Simple Algorithms
- Sheehan Olver, Numerical Complex Analysis