The final homework of the semester may be downloaded here. It will be due on May 5th, 2015.
Homework 3
The third homework may be downloaded here. It will be due March 16th, 2015.
Update: The homework has been corrected. The new due date is Tuesday, April 21st.
Final Project
Your final project will involve writing a report on a paper of your choice from the list below. If you wish, you may partner up with one other person and write your report jointly. Your report will be due on the last day of class.
Your report should be about 8 pages long in a one-column, 11 point font, with reasonable margins. There should be nothing in your report that is a direct reproduction from the paper you choose. Rather, your report should represent your own independent understanding of the subject. You will, of course, reproduce the general arguments made in the paper, but these should not be line-by-line reproductions.
It is expected that you may have to do considerable background reading to understand the results in the paper you choose. All of the papers below have been published in the past couple of years, and it is likely that you’ll have to read a lot of previous work to write your report.
Your report should:
1) Give a crisp, concise statement of the problem.
2) Discuss possible ways to solve this problem that are not the subject of your paper. Please take a look at the background literature.
3) Summarize your understanding of the key conceptual ideas in the paper without going into technical details.
4) Work out the main result of the paper on some simplified cases.
5) Give a high-level overview of the technical proof in the paper, focusing on the key points and omitting calculations.
Your report should contain more material than is simply in the paper, since it should represent a couple of months of thinking about the subject.
As soon as you have determined the paper you will choose, please send me an email; papers will be assigned to groups on a first-come basis.
- A Differential Equation Modeling Nesterov’s Accelerated Gradient Method: Theory and Insights, W. Su, S. Boyd, E. Candes.
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints, L. Lessard, B. Recht, A. Packard – TAKEN.
- Linear Coupling: an Ultimate Unification of Gradient and Mirror Descent, Z. Allen-Zhu, L. Orecchia – TAKEN
- EXTRA: An Exact, First-Order Algorithm for Decentralized Consensus Optimization, W. Shi, Q. Ling, G. Wu, W. Yin
- Convergent Subgradient Methods for Nonsmooth Convex Optimization, Y. Nesterov, V. Shikhman – TAKEN
- Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization, A. Rakhlin, O. Shamir, K. Sridharan
- Primal Dual Subgradient Methods For Convex Problems, Y. Nesterov
- Dual Averaging for Distributed Optimization: Convergence Analysis and Network Scaling, J. Duchi, A. Agarwal, M. Wainwright – TAKEN
- A Linearly Convergent Conditional Gradient Algorithm With Applications to Online and Stochastic Optimization, D. Garber, E. Hazan.
- Stochastic Gradient Descent, Weighted Sampling, and the Randomized Kaczmarz Algorithm, D. Needell, N. Srebro, R. Ward – TAKEN
- Acceleration Methods for Convex Optimization Over the Fixed Point Set of a Nonexpansive Mapping, H. Iiduka
- Worst Case Complexity of Direct Search Under Convexity, M. Dodangeh, L. N. Vicente
- A conditional gradient method with linear rate of convergence for solving convex linear systems, A. Beck, M. Teboulle – TAKEN
- Smoothing and first-order methods: a unified framework, A. Beck, M. Teboulle
- On the Convergence of Alternating Minimization…, A. Beck – TAKEN
- Random Gradient Free Minimization of Convex Functions, Y. Nesterov
- Smooth Strongly Convex Interpolation and Exact Worst-case Performance of First-Order Methods, A. B. Taylor, J. M. Hendrickx, F. Glineur
Homework 3
For Homework 3, please do exercises 9.1, 9.6, 9.10, 9.11, 11.2, 11.4 in your textbook. This will be due Thursday, April 2nd.
Homework 2 solutions
Solutions to Homework 2 may be downloaded here.
Homework 2
The second homework may be downloaded here. It will be due on March 3rd.
Homework 1 solutions
Solutions to Homework 1 may be downloaded here.
Hints on Homework 1
1. Suppose the pair (x, r) that achieves the minimum has the property that x is not an eigenvector. Can you derive a contradiction from this? In particular, can you perturb x to obtain a smaller r?
Now, the hardest part of the problem is the last one, namely to argue that the r achieving the minimum equals the magnitude of the eigenvalue of A that is largest in absolute value. Here is a trick that I think is useful. First, show that the largest eigenvalue in magnitude of a positive matrix A with identical row sums s is actually s. Now, given an (r, x) that achieves the optimum, can you use it to construct a matrix D such that D^{-1} A D is positive and has identical row sums?
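As a quick numerical sanity check of the row-sum fact (not a substitute for the proof; the matrix below is a made-up example, and power iteration is just one convenient way to estimate the dominant eigenvalue):

```python
# Sanity check (not a proof): a positive matrix whose rows all sum to s
# has the all-ones vector as an eigenvector with eigenvalue s, and s is
# its largest eigenvalue in magnitude.  The example matrix has s = 5.

def power_iteration(A, iters=200):
    """Estimate the dominant eigenvalue of a square matrix A."""
    n = len(A)
    x = [1.0] + [0.0] * (n - 1)   # deliberately not the eigenvector
    lam = 0.0
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        lam = max(abs(v) for v in y)          # normalize by largest entry
        x = [v / lam for v in y]
    return lam

A = [[2.0, 3.0],
     [4.0, 1.0]]                  # positive; each row sums to s = 5
print(power_iteration(A))         # ≈ 5.0, matching the common row sum
```

The eigenvalues of this particular matrix are 5 and -2, so the estimate agrees with the common row sum, as the hint predicts.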
4. You would like to show that two convex sets are identical. A typical strategy is to assume they are not, apply the separating hyperplane theorem, and obtain a contradiction.
5. The key here is just to play around with the definition of a linear variety. You know that quantities like 2x - y or 3x - 2y are contained in it, provided that x and y are contained in it. You can keep taking linear combinations of such things to obtain still different combinations that are contained in it.
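To see the hint in action numerically (the particular variety below, the line y = 2t + 1 in the plane, and the helper on_variety are made-up illustrations, not part of the homework): combinations whose coefficients sum to 1, such as 2x - y and 3x - 2y, land back in the variety.

```python
# Illustrative sketch: affine combinations of points in a linear variety
# stay in the variety.  Example variety: the line {(t, 2t + 1) : t real}.

def on_variety(pt, tol=1e-9):
    """Membership test for the example variety {(t, 2t + 1)}."""
    return abs(pt[1] - (2 * pt[0] + 1)) < tol

x = (0.0, 1.0)   # both points lie on the variety
y = (1.0, 3.0)

combo1 = tuple(2 * xi - yi for xi, yi in zip(x, y))      # 2x - y
combo2 = tuple(3 * xi - 2 * yi for xi, yi in zip(x, y))  # 3x - 2y
print(on_variety(combo1), on_variety(combo2))            # True True
```

Note that the coefficients in both combinations sum to 1 (2 - 1 and 3 - 2), which is exactly why they stay in the variety even though it does not pass through the origin.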
Welcome to the website for IE 510! A catalog description of the course may be found here. A tentative syllabus for the course, subject to revision, may be found here. The first homework can be downloaded here; it was originally due two weeks after the first lecture, on Tuesday, February 3rd, 2015, then postponed to Thursday, February 5th, and is now due Tuesday, February 10th.
Update 1/19/2015: I have added a fifth problem to the homework.
Update 1/23/2015: Fixed two issues on the homework. In problem 1, we need to add the condition that x is a nonnegative, nonzero vector. In problem 3.b., I should have asked you to show the function is concave, not convex. Due to these issues, I have postponed the homework due date by a couple of days.
Update 1/26/2015: I made some minor changes. First, problem 4 needed the condition that the set C is closed. Secondly, I tinkered a little with problem 1 – r should have been e^(r’).
By the way, there are several variations of the problem – you can maximize r subject to the condition Ax >= rx, or minimize r subject to the condition Ax <= rx. Feel free to solve either one for full credit (but if in doubt, just go with the one in the pdf of the homework).
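For intuition on the first variation, here is a rough numerical sketch (the positive matrix below is a made-up example, and the grid search over the two-dimensional simplex is purely illustrative, not an efficient method): the best r achievable over nonnegative, nonzero x comes out as the largest eigenvalue of the matrix.

```python
# Sketch of "maximize r subject to Ax >= r x" over nonnegative, nonzero x.
# For each candidate x, the largest feasible r is min_i (Ax)_i / x_i;
# we grid-search x over the simplex.  The example matrix has largest
# eigenvalue 5, which the search recovers.
A = [[2.0, 3.0],
     [4.0, 1.0]]

best = 0.0
N = 10000
for k in range(1, N):
    x = (k / N, 1.0 - k / N)               # nonnegative, nonzero vector
    Ax = (A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1])
    r = min(Ax[0] / x[0], Ax[1] / x[1])    # largest r with Ax >= r x here
    best = max(best, r)
print(best)   # ≈ 5.0
```

The maximum is attained at x = (0.5, 0.5), which is (up to scaling) the eigenvector for the eigenvalue 5 of this example matrix.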
Update 1/29/2015: I updated the homework – Problem 4 needs the additional assumption that C contains the origin in its interior.