Your final project will involve writing a report on a paper of your choice from the list below. If you wish, you may partner with one other person and write your report jointly. Your report will be due on the last day of class.
Your report should be about 8 pages long in a one-column, 11 point font, with reasonable margins. There should be nothing in your report that is a direct reproduction from the paper you choose. Rather, your report should represent your own independent understanding of the subject. You will, of course, reproduce the general arguments made in the paper, but these should not be line-by-line reproductions.
It is expected that you may have to do considerable background reading to understand the results in the paper you choose. All of the papers below have been published in the past couple of years, and it is likely that you’ll have to read a lot of previous work to write your report.
Your report should:
1) Give a crisp, concise statement of the problem.
2) Discuss possible ways to solve this problem other than the approach taken in your paper; consult the background literature.
3) Summarize your understanding of the key conceptual ideas in the paper without going into technical details.
4) Work out the main result of the paper on some simplified cases.
5) Give a high-level overview of the technical proof in the paper, focusing on the key points and omitting calculations.
Your report should contain more than just the material in the paper, since it should represent a couple of months of thinking about the subject.
As soon as you have chosen your paper, please send me an email; papers will be assigned to groups on a first-come, first-served basis.
- A Differential Equation Modeling Nesterov’s Accelerated Gradient Method: Theory and Insights, W. Su, S. Boyd, E. Candes.
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints, L. Lessard, B. Recht, A. Packard – TAKEN.
- Linear Coupling: an Ultimate Unification of Gradient and Mirror Descent, Z. Allen-Zhu, L. Orecchia – TAKEN
- EXTRA: An Exact, First-Order Algorithm for Decentralized Consensus Optimization, W. Shi, Q. Ling, G. Wu, W. Yin
- Convergent Subgradient Methods for Nonsmooth Convex Optimization, Y. Nesterov, V. Shikhman – TAKEN
- Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization, A. Rakhlin, O. Shamir, K. Sridharan
- Primal Dual Subgradient Methods For Convex Problems, Y. Nesterov
- Dual Averaging for Distributed Optimization: Convergence Analysis and Network Scaling, J. Duchi, A. Agarwal, M. Wainwright – TAKEN
- A Linearly Convergent Conditional Gradient Algorithm With Applications to Online and Stochastic Optimization, D. Garber, E. Hazan.
- Stochastic Gradient Descent, Weighted Sampling, and the Randomized Kaczmarz Algorithm, D. Needell, N. Srebro, R. Ward – TAKEN
- Acceleration Methods for Convex Optimization Over the Fixed Point Set of a Nonexpansive Mapping, H. Iiduka
- Worst Case Complexity of Direct Search Under Convexity, M. Dodangeh, L. N. Vicente
- A conditional gradient method with linear rate of convergence for solving convex linear systems, A. Beck, M. Teboulle – TAKEN
- Smoothing and first-order methods: a unified framework, A. Beck, M. Teboulle
- On the Convergence of Alternating Minimization…, A. Beck – TAKEN
- Random Gradient Free Minimization of Convex Functions, Y. Nesterov
- Smooth Strongly Convex Interpolation and Exact Worst-case Performance of First-Order Methods, A. B. Taylor, J. M. Hendrickx, F. Glineur