Schedule - Spring 2018

Week 1

Mon 01/22 Introduction: Basic terminology, examples: linear regression, LASSO, sparse regression [pdf] [HTML] [IJulia] [data]
Wed 01/24 Elements of convex geometry: Convex sets, convex functions, separating hyperplanes, linear classification, perceptron, neural networks [pdf (S16)] (perceptron sketch below)
Fri 01/26 Section 1 [pdf (S16)]
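
A minimal Julia sketch of the perceptron from the Wed 01/24 lecture. The synthetic data, seed, and epoch cap are invented here, and the function name is illustrative; this is not the course's notebook.

    using Random, LinearAlgebra

    function perceptron(X, y; max_epochs=1000)
        n, d = size(X)
        w = zeros(d)
        for _ in 1:max_epochs
            mistakes = 0
            for i in 1:n
                if y[i] * dot(w, X[i, :]) <= 0   # misclassified or on the boundary
                    w .+= y[i] .* X[i, :]        # perceptron update
                    mistakes += 1
                end
            end
            mistakes == 0 && return w            # converged: all points correct
        end
        return w
    end

    Random.seed!(1)
    X = randn(200, 2)
    y = sign.(X * [1.0, -2.0])                   # labels from a true separator
    w = perceptron(X, y)
    println("remaining mistakes: ", sum(sign.(X * w) .!= y))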

Week 2

Mon 01/29 Linear programming: Definition, canonical and standard form, modeling examples [pdf (S16)] [HTML] [IJulia] [data]
Wed 01/31 Duality in linear programming: Dual programs, weak duality, strong duality, complementary slackness, Farkas' lemma. We largely followed lecture notes by S. Sachdeva. (weak-duality sketch below)
Fri 02/02 Section 2 [pdf (S16)]
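
A toy numeric check of weak duality for the LP max c'x s.t. Ax <= b, x >= 0, in the spirit of the Wed 01/31 lecture. The instance (A, b, c) and the two feasible points are made up for illustration.

    using LinearAlgebra

    A = [2.0 1.0; 1.0 3.0]
    b = [4.0, 6.0]
    c = [3.0, 2.0]

    x = [1.0, 1.0]    # primal feasible: A*x = [3, 4] <= b and x >= 0
    y = [1.0, 1.0]    # dual feasible:  A'*y = [3, 4] >= c and y >= 0

    @assert all(A * x .<= b) && all(x .>= 0)
    @assert all(A' * y .>= c) && all(y .>= 0)
    # weak duality: c'x <= y'Ax <= b'y for any such pair
    println("primal c'x = ", dot(c, x), " <= dual b'y = ", dot(b, y))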

Week 3

Mon 02/05 Applications of duality: Game theory, Nash equilibrium, minimax theorem [pdf (S16)] (minimax sketch below)
Wed 02/07 Simplex algorithm: The simplex algorithm, extreme points. We mostly followed lecture notes by J. Nelson.
Fri 02/09 Section 3 [pdf]
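
A brute-force illustration of the minimax theorem for a 2x2 zero-sum game, in the spirit of the Mon 02/05 lecture. The payoff matrix and grid resolution are invented; a closed-form computation would of course be exact.

    A = [3.0 0.0; 1.0 2.0]    # row player maximizes, column player minimizes
    grid = 0:0.001:1          # mixed strategies: probability on row 1 / column 1

    # best guaranteed payoff for the row player over its mixed strategies
    maximin = maximum(p -> minimum(p .* A[1, :] .+ (1 - p) .* A[2, :]), grid)
    # best guaranteed payoff for the column player over its mixed strategies
    minimax = minimum(q -> maximum(q .* A[:, 1] .+ (1 - q) .* A[:, 2]), grid)
    println("maximin ≈ ", maximin, ", minimax ≈ ", minimax)   # both 1.5 here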

Week 4

Mon 02/12 Simplex continued: Simplex algorithm review, why the point maintained by the simplex algorithm is a vertex (see, e.g., Theorem 3.2 in Bertsimas and Tsitsiklis).
Characterizations of convex functions: first order Taylor expansions, linear lower bounds to convex functions. [pdf (S16)]
Wed 02/14 Characterizations of convex functions (continued): global and local minima, second order Taylor expansions, Hessian characterization of convex functions. [pdf (S16)]

Week 5

Mon 02/19 Presidents' Day, no class.
Tue 02/20 Section 4 [pdf]
Wed 02/21 Gradient descent: algorithm and convergence analysis for strongly convex functions. [pdf (S16)] (gradient descent sketch below)
Fri 02/23 Section 5 [pdf (S16)]
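
A minimal sketch of gradient descent with step size 1/L on a strongly convex quadratic, matching the Wed 02/21 analysis in spirit. The instance (Q, b), starting point, and iteration count are invented.

    using LinearAlgebra

    Q = [4.0 1.0; 1.0 2.0]      # positive definite, so f is strongly convex
    b = [1.0, 1.0]
    ∇f(x) = Q * x - b           # gradient of f(x) = ½ x'Qx - b'x

    function gradient_descent(x, η, T)
        for _ in 1:T
            x = x - η * ∇f(x)   # fixed step size
        end
        return x
    end

    L = maximum(eigvals(Q))     # smoothness constant = largest eigenvalue
    x = gradient_descent([10.0, -10.0], 1 / L, 100)
    println("distance to minimizer: ", norm(x - Q \ b))   # geometric decay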

Week 6

Mon 02/26 Steepest descent and Newton's method: gradient descent with respect to other norms [pdf (S16)]
Wed 02/28 Learning from experts: Weighted-majority / multiplicative weights algorithm. Lower bounds for deterministic expert learning algorithms. Randomized weighted majority algorithm. [pdf (S16)] (code sketch below)
Fri 03/02 Section 6 [pdf (S16)]
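
A sketch of the randomized weighted majority / multiplicative weights algorithm from the Wed 02/28 lecture. The loss matrix here is generated at random, and the function name, ε, and sizes are illustrative choices, not the lecture's.

    using Random

    function rwm(losses; ε=0.1)                # losses[t, i] ∈ {0, 1}
        T, n = size(losses)
        w = ones(n)
        expected_loss = 0.0
        for t in 1:T
            p = w ./ sum(w)                    # play expert i with probability p[i]
            expected_loss += sum(p .* losses[t, :])
            w .*= 1 .- ε .* losses[t, :]       # penalize experts that erred
        end
        return expected_loss
    end

    Random.seed!(1)
    losses = rand(1000, 10) .< 0.4             # random mistake pattern, 10 experts
    println("algorithm: ", round(rwm(losses), digits=1),
            ", best expert: ", minimum(sum(losses, dims=1)))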

Week 7

Mon 03/05 Duality: Lagrangian duality, Slater's condition [pdf (S16)]
Wed 03/07 Application of Lagrangian duality: Support Vector Machines [pdf (S16)] (SVM sketch below)
Fri 03/09 Section 7 [pdf (S16)]
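
To make the SVM concrete: a soft-margin SVM trained by stochastic subgradient descent on the primal hinge loss. Note that this is a different route than the lecture's Lagrangian-dual derivation, and the data, λ, and step-size schedule are invented here.

    using Random, LinearAlgebra

    function svm_subgradient(X, y; λ=0.01, T=5000)
        n, d = size(X)
        w = zeros(d)
        for t in 1:T
            η = 1 / (λ * t)                  # decaying step size
            i = rand(1:n)                    # stochastic: one example per step
            g = λ .* w                       # gradient of the ℓ2 regularizer
            if y[i] * dot(w, X[i, :]) < 1    # example violates the margin
                g .-= y[i] .* X[i, :]        # hinge-loss subgradient
            end
            w .-= η .* g
        end
        return w
    end

    Random.seed!(2)
    X = randn(300, 2)
    y = sign.(X * [2.0, -1.0])
    w = svm_subgradient(X, y)
    println("training accuracy: ", sum(sign.(X * w) .== y) / length(y))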

Week 8

Spring break.

Week 9

Mon 03/19 Duality: Lagrangian duality review, duality gap, KKT conditions. [pdf (S16)] (KKT sketch below)
Wed 03/21 Lower bounds for convex optimization: Impossibility result for zero-order convex optimization with small errors. [pdf (S16)]
Fri 03/23 Section 8 [pdf (S16)]
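
A tiny worked KKT example (invented here, not from the notes): minimize ‖x − x0‖² subject to 1'x = 1. The multiplier has a closed form, so stationarity and primal feasibility, the KKT conditions for this equality-constrained problem, can be checked numerically.

    using LinearAlgebra

    x0 = [3.0, -1.0, 2.0]
    n = length(x0)
    ν = 2 * (sum(x0) - 1) / n          # multiplier from the stationarity equation
    xstar = x0 .- ν / 2                # x* = x0 - (ν/2)·1

    ∇f = 2 .* (xstar .- x0)            # gradient of ‖x - x0‖² at x*
    ∇h = ones(n)                       # gradient of the constraint 1'x - 1
    println("stationarity ‖∇f + ν∇h‖ = ", norm(∇f .+ ν .* ∇h))   # 0
    println("feasibility  1'x*       = ", sum(xstar))             # 1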

Week 10

Mon 03/26 Knapsack: Approximation algorithms, the greedy algorithm, LP relaxation, dynamic programming, FPTAS [pdf (S16)] (greedy sketch below)
Wed 03/28 (Guest lecture by Eric Balkanski) Discrete optimization: Greedy algorithms for minimum spanning tree, maximum weight matching. Introduction to submodular functions. [no notes]
Fri 03/30 Section 9 [pdf (S16)]
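
A sketch of the classical greedy 1/2-approximation for knapsack from Monday's lecture, on a made-up instance. Returning the better of the ratio-greedy solution and the best single fitting item is what gives the factor 1/2.

    function greedy_knapsack(vals, wts, cap)
        order = sortperm(vals ./ wts, rev=true)      # best value/weight ratio first
        total, used = 0.0, 0.0
        for i in order
            if used + wts[i] <= cap
                used += wts[i]
                total += vals[i]
            end
        end
        best_single = maximum(vals[i] for i in eachindex(vals) if wts[i] <= cap)
        return max(total, best_single)
    end

    vals = [60.0, 100.0, 120.0]
    wts  = [10.0, 20.0, 30.0]
    println("greedy value: ", greedy_knapsack(vals, wts, 50.0))   # 160 (OPT is 220)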

Week 11

Mon 04/02 Submodular maximization: Maximum coverage problem, submodular functions, greedy algorithm for submodular maximization, set cover problem. [pdf (S16)]
Wed 04/04 Approximation Algorithms: Set Cover problem, submodular threshold problem, greedy algorithm for Set Cover and submodular threshold problems, randomized rounding for Set Cover, deterministic rounding for sparse Set Cover. [pdf (S16)] (greedy Set Cover sketch below)
Fri 04/06 Section 10 [pdf (S16)]
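
A sketch of greedy Set Cover from this week's lectures, on an invented instance (the argmax-over-a-collection form requires Julia ≥ 1.7). Greedy repeatedly picks the set covering the most still-uncovered elements, which yields the ln(n) approximation.

    function greedy_set_cover(universe, sets)
        uncovered = copy(universe)
        chosen = Int[]
        while !isempty(uncovered)
            # pick the set covering the most still-uncovered elements
            best = argmax(j -> length(intersect(sets[j], uncovered)), eachindex(sets))
            isempty(intersect(sets[best], uncovered)) && break   # nothing helps
            push!(chosen, best)
            setdiff!(uncovered, sets[best])
        end
        return chosen
    end

    universe = Set(1:6)
    sets = [Set([1, 2, 3]), Set([3, 4]), Set([4, 5, 6]), Set([1, 5])]
    println("chosen sets: ", greedy_set_cover(universe, sets))   # [1, 3]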

Week 12

Mon 04/09 Max Cut and Introduction to Semi-Definite Programming: Maximum Cut problem, local search approximation algorithm, random cut algorithm, introduction to semi-definite programming and linear algebra review. [pdf] (random cut sketch below)
Wed 04/11 Max Cut through Semi-Definite Programming: Maximum Cut problem, SDP relaxation for Max Cut, hyperplane rounding algorithm. Balanced cut, relaxation strengthening. We followed lecture notes by S. Sachdeva (parts 1 and 2).
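
A sketch of Monday's random cut 1/2-approximation for Max Cut on a made-up graph: each vertex is assigned a uniformly random side, so each edge is cut with probability 1/2. Wednesday's SDP relaxation and hyperplane rounding need an SDP solver and are not sketched here.

    using Random

    function random_cut(n, edges)
        side = rand(Bool, n)        # independent fair coin per vertex
        return count(((u, v),) -> side[u] != side[v], edges)
    end

    Random.seed!(3)
    edges = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]
    cuts = [random_cut(4, edges) for _ in 1:10_000]
    println("average cut ≈ ", sum(cuts) / length(cuts),
            " of ", length(edges), " edges")    # ≈ 2.5 = half the edges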

Week 13

Mon 04/16 Convex Relaxations for l0 regression: NP-hardness for l0 regression, l1 relaxation, exact recovery under restricted nullspace property, restricted isometry property implies restricted nullspace property. We followed lecture notes by L. Schmidt and M. Hardt.
Wed 04/18 Non-convex methods for l0 regression: Iterative refinement / projected gradient descent / iterative hard thresholding for l0 regression. We followed lecture notes by L. Schmidt and M. Hardt. (IHT sketch below)
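
A minimal sketch of iterative hard thresholding as projected gradient descent, in the spirit of Wednesday's lecture: a gradient step on ‖Ax − b‖², then keep only the s largest-magnitude coordinates. The problem sizes, step size, and random design are invented here.

    using Random, LinearAlgebra

    function iht(A, b, s; η=0.5, T=300)
        x = zeros(size(A, 2))
        for _ in 1:T
            x += η .* (A' * (b - A * x))                       # gradient step
            keep = partialsortperm(abs.(x), 1:s, rev=true)     # s largest entries
            x[setdiff(1:length(x), keep)] .= 0.0               # hard threshold
        end
        return x
    end

    Random.seed!(4)
    n, d, s = 100, 300, 5
    A = randn(n, d) ./ sqrt(n)                  # random design, ~RIP w.h.p.
    xtrue = zeros(d)
    xtrue[randperm(d)[1:s]] = randn(s)          # s-sparse ground truth
    b = A * xtrue
    println("recovery error: ", norm(iht(A, b, s) - xtrue))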

Week 14

Mon 04/23 Projected Gradient Descent: Review of Gradient Descent convergence using two different potential function arguments. Projected Gradient Descent analyzed with potential functions. We used notes by N. Bansal and A. Gupta. (code sketch below)
Wed 04/25 Accelerated Gradient Descent and Mirror Descent: Accelerated Gradient Descent analyzed using a potential function argument; Mirror Descent, Fenchel Conjugates, Bregman Divergences, Online Convex Optimization and Learning from Experts. We used notes by N. Bansal and A. Gupta.
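
A minimal sketch of projected gradient descent in the style of Monday's lecture: a gradient step followed by Euclidean projection back onto the feasible set. The objective, the constraint set (unit ball), and the step size are chosen here purely for illustration.

    using LinearAlgebra

    proj_ball(x, r) = norm(x) <= r ? x : r .* x ./ norm(x)   # Euclidean projection

    function pgd(grad, x, r; η=0.1, T=500)
        for _ in 1:T
            x = proj_ball(x - η .* grad(x), r)   # gradient step, then project
        end
        return x
    end

    c = [3.0, 4.0]
    grad(x) = 2 .* (x .- c)              # gradient of ‖x - c‖²
    x = pgd(grad, zeros(2), 1.0)
    println("x ≈ ", round.(x, digits=3), " (exact: ", c ./ norm(c), ")")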