CSE 535: Theory of Optimization and Continuous Algorithms

The design of algorithms is traditionally a discrete endeavor. However, many advances have come from a continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is derived from it by appropriate discretization (a sketch after the list below illustrates this with gradient descent). In many interesting and general settings, the fastest known methods are a consequence of this perspective. We will discuss several examples of algorithms for high-dimensional optimization, and use them to understand the following concepts in detail:

  • Elimination
  • Reduction
  • Geometrization
  • Sparsification
  • Acceleration
  • Decomposition
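
A minimal sketch of this continuous-to-discrete viewpoint (not taken from the course text; the quadratic objective, step size, and function names below are assumptions made purely for illustration): the gradient flow dx/dt = -∇f(x) decreases f along its trajectory, and a forward-Euler step of size η turns it into gradient descent, x_{k+1} = x_k - η∇f(x_k).

    import numpy as np

    # Sketch: discretizing the gradient flow dx/dt = -grad f(x) with a
    # forward-Euler step of size eta gives gradient descent:
    #     x_{k+1} = x_k - eta * grad f(x_k).
    def gradient_descent(grad, x0, eta=0.1, steps=200):
        x = np.array(x0, dtype=float)
        for _ in range(steps):
            x = x - eta * grad(x)   # one Euler step of the gradient flow
        return x

    # Illustrative (assumed) objective: f(x) = 0.5 x^T A x - b^T x, minimized at A^{-1} b.
    A = np.array([[2.0, 0.0], [0.0, 4.0]])
    b = np.array([1.0, 1.0])
    grad_f = lambda x: A @ x - b

    x_hat = gradient_descent(grad_f, x0=[0.0, 0.0], eta=0.1, steps=200)
    print(x_hat)  # close to the true minimizer [0.5, 0.25]

The step size must respect the smoothness of f (here η < 2/L with L = 4, the largest eigenvalue of A); choosing the discretization carefully is exactly where much of the algorithmic work enters.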

Administrative Information:

  • Lectures: Tue, Thu 10:00-11:20 on Zoom
  • Textbook (in progress): https://github.com/YinTat/optimizationbook/blob/main/main.pdf
  • Please do not comment directly on the merged version; instead, comment on the split versions below, since Dropbox cannot handle large documents efficiently.
  • Prerequisites: basic knowledge of algorithms, probability, and linear algebra.

Contacts:

  • Instructor: Yin Tat Lee
  • Office Hours: By appointment; email yintat@ignoreme-uw.edu
  • TA: Swati Padmanabhan (pswati@ignoreme-cs.washington.edu)
  • TA Office Hours: Thu 11:30 am-12:20 pm and Thu 9:30-10:20 pm; Zoom link on Edstem.
  • Mailing List: Here
  • Edstem: Here

Assignments

  • Grading: 5 assignments (100%) or 1 project (100%)
  • Assignments are posted and submitted via Gradescope.

Class Policy

  • You may discuss assignments with others, but you must write down the solutions by yourself.
  • Assignments as well as regrade requests are to be submitted only via Gradescope.
  • In general, late submissions are not accepted (the submission server closes at the deadline). Gradescope lets you overwrite previous submissions, so submit early and update as needed to avoid missing the deadline altogether.
  • We follow the standard UW CSE policy for academic integrity.
  • The office hours are for general questions about material from the lecture and homework problems.
  • Please refer to university policies regarding disability accommodations or religious accommodations.

Tentative Schedule:

Introduction

Basic Methods

  • Apr 06: Gradient Descent (HW 1 due) Sec 2 (Sec 2.6 added on Apr 7)
  • Apr 08: Gradient Descent (cont) and Proximal Methods (See Sec 2 above)
  • Apr 13: Cutting Plane Methods Sec 3
  • Apr 15: Cutting Plane Methods (cont)

Geometrization

  • Apr 20: Mirror Descent (HW 2 due) Sec 5
  • Apr 22: Mirror Descent (cont)

Homotopy Method

  • Apr 27: Newton's Method Sec 5.3
  • Apr 29: Interior Point Method Sec 5.4
  • May 04: Interior Point Method + Cholesky Decomposition Sec 5.4 (rewritten) (HW 3 due or project proposal due)

Linear System Solving

Acceleration

  • May 13: Stochastic Gradient Descent & Variance Reduction Sec 6.3, 6.4
  • May 18: Accelerated Gradient Descent (HW 4 due) Sec 7, Note

Others

  • May 20: AdaGrad Note
  • May 25: Bayesian methods Note
  • May 27: Robust IPM Note

Project Presentations (Depending on Number of Projects)

  • Jun 01: Project
  • Jun 03: Project (HW 5 due)
  • Project report due on Jun 08

Other Lecture Notes