Brandon Amos

Differentiable Optimization-Based Modeling for Machine Learning
Degree Type: Ph.D. in Computer Science
Advisor(s): Zico Kolter
Graduated: May 2019

Abstract:

Domain-specific modeling priors and specialized components are becoming increasingly important to the machine learning field. These components integrate the specialized knowledge that we have as humans into models. We argue in this thesis that optimization methods provide an expressive set of operations that should be part of the machine learning practitioner's modeling toolbox.

We present two foundational approaches for optimization-based modeling: 1) the OptNet architecture that integrates optimization problems as individual layers in larger end-to-end trainable deep networks, and 2) the input-convex neural network (ICNN) architecture that helps make inference and learning in deep energy-based models and structured prediction more tractable.
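As a rough illustration of the ICNN idea, the sketch below builds a small scalar-valued network that is convex in its input: the hidden state is a convex, nondecreasing function of an affine map of the input, and the weights applied to that hidden state are constrained to be nonnegative. The layer sizes and weight names here are hypothetical, not taken from the thesis, and this checks the convexity property numerically rather than implementing the full ICNN training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration: 3-d input, 8 hidden units.
d, h = 3, 8
Wx0 = rng.normal(size=(h, d))          # first layer on x: any sign allowed
Wz1 = np.abs(rng.normal(size=(1, h)))  # weights on hidden state: nonnegative
Wx1 = rng.normal(size=(1, d))          # direct skip connection from x: any sign

def relu(v):
    return np.maximum(v, 0.0)

def icnn(x):
    """Scalar energy f(x), convex in x.

    relu(Wx0 @ x) is convex per component; a nonnegative combination
    of convex functions plus a linear term in x is still convex.
    """
    z1 = relu(Wx0 @ x)
    return float(Wz1 @ z1 + Wx1 @ x)

# Sanity check: the midpoint convexity inequality holds.
x, y = rng.normal(size=d), rng.normal(size=d)
assert icnn(0.5 * (x + y)) <= 0.5 * (icnn(x) + icnn(y)) + 1e-9
```

The nonnegativity constraint on the hidden-state weights is the key structural restriction; it is what makes inference over the input a convex problem.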

We then show how to use the OptNet approach 1) as a way of combining model-free and model-based reinforcement learning and 2) for top-k learning problems. We conclude by showing how to differentiate cone programs and turn the cvxpy domain-specific language into a differentiable optimization layer that enables rapid prototyping of the approaches in this thesis.
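The common mechanism behind these differentiable optimization layers is implicit differentiation of the solver's optimality conditions. A minimal sketch of that idea, assuming the simplest possible case (an unconstrained quadratic objective rather than the general QPs and cone programs treated in the thesis): the solution satisfies a linear optimality condition, so its Jacobian with respect to the problem data follows from the implicit function theorem, and we can verify it against finite differences.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n))
Q = A @ A.T + n * np.eye(n)  # positive definite, so the argmin is unique
p = rng.normal(size=n)

def solve(p):
    # argmin_z 0.5 z'Qz + p'z  has optimality condition  Q z* + p = 0.
    return np.linalg.solve(Q, -p)

# Implicit function theorem on  Q z*(p) + p = 0:
#   Q (dz*/dp) + I = 0   =>   dz*/dp = -Q^{-1}
dz_dp = -np.linalg.inv(Q)

# Check the analytic Jacobian against central finite differences.
eps = 1e-6
fd = np.empty((n, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = eps
    fd[:, j] = (solve(p + e) - solve(p - e)) / (2 * eps)

assert np.allclose(dz_dp, fd, atol=1e-4)
```

In the constrained settings the thesis actually covers, the same step is applied to the KKT conditions of a QP or the residual map of a cone program rather than to a single linear equation, but the gradient is obtained the same way: by solving a linear system at the optimum instead of unrolling the solver.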

The source code for this thesis document is available in open-source form.

Thesis Committee:
J. Zico Kolter (Chair)
Barnabás Póczos
Jeff Schneider
Vladlen Koltun (Intel Labs)

Srinivasan Seshan, Head, Computer Science Department
Tom M. Mitchell, Interim Dean, School of Computer Science

Keywords:
Machine learning, statistical modeling, convex optimization, deep learning, control, reinforcement learning

CMU-CS-19-109.pdf (3.29 MB, 147 pages)