Optimization Online


Strong mixed-integer programming formulations for trained neural networks

Ross Anderson (rander***at***google.com)
Joey Huchette (jhuchette***at***google.com)
Will Ma (willma***at***google.com)
Christian Tjandraatmadja (ctjandra***at***google.com)
Juan Pablo Vielma (jvielma***at***mit.edu)

Abstract: We present strong mixed-integer programming (MIP) formulations for high-dimensional piecewise linear functions that correspond to trained neural networks. These formulations can be used for a number of important tasks, such as verifying that an image classification network is robust to adversarial inputs, or solving decision problems where the objective function is a machine learning model. We present a generic framework, which may be of independent interest, that provides a way to construct sharp or ideal formulations for the maximum of d affine functions over arbitrary polyhedral input domains. We apply this result to derive MIP formulations for a number of the most popular nonlinear operations (e.g., ReLU and max pooling) that are strictly stronger than other approaches from the literature. We corroborate this computationally, showing that our formulations are able to offer substantial improvements in solve time on verification tasks for image classification networks.
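To make the setting concrete, the sketch below shows the standard big-M MIP encoding of a single ReLU neuron y = max(0, w·x + b), the baseline that the paper's sharp/ideal formulations strengthen. This is an illustrative sketch, not the paper's formulation; the variable names (wx_b, z, L, U) and the feasibility-check helper are assumptions introduced here, and a real verification pipeline would hand these constraints to a MIP solver rather than check them by hand.

```python
# Baseline big-M MIP encoding of one ReLU neuron, y = max(0, w.x + b),
# given finite bounds L <= w.x + b <= U with L < 0 < U. The binary
# variable z selects the active (z = 1) or inactive (z = 0) phase.
# Illustrative sketch only; names are not taken from the paper.

def bigm_constraints_hold(wx_b, y, z, L, U, tol=1e-9):
    """Check the four big-M constraints at a candidate point (wx_b, y, z)."""
    return (
        y >= wx_b - tol                    # y >= w.x + b
        and y <= wx_b - L * (1 - z) + tol  # ties y to w.x + b when z = 1
        and y <= U * z + tol               # forces y = 0 when z = 0
        and y >= -tol                      # y >= 0
        and z in (0, 1)                    # z is binary
    )

def relu_assignment(wx_b):
    """The (y, z) pair a correct MIP solve would recover for this neuron."""
    return (max(0.0, wx_b), 1 if wx_b > 0 else 0)

# The true ReLU graph is feasible for any pre-activation value in [L, U].
L_, U_ = -4.0, 5.0
for pre in (-3.0, 0.0, 2.5):
    y, z = relu_assignment(pre)
    assert bigm_constraints_hold(pre, y, z, L_, U_)
```

The weakness of this baseline, which motivates the paper, is that its linear-programming relaxation (allowing 0 <= z <= 1) admits points well off the ReLU graph when the bounds L and U are loose; the paper's formulations tighten exactly this relaxation.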

Keywords: Mixed-integer programming, Formulations, Deep Learning

Category 1: Integer Programming ((Mixed) Integer Linear Programming )

Category 2: Applications -- Science and Engineering (Statistics )


Entry Submitted: 11/05/2018
Entry Accepted: 11/05/2018
Entry Last Modified: 06/03/2019
