Partition-based formulations for mixed-integer optimization of trained ReLU neural networks

  • Authors:
  • Calvin Tsay, Department of Computing, Imperial College London
  • Jan Kronqvist, Department of Mathematics, KTH Royal Institute of Technology
  • Alexander Thebelt, Department of Computing, Imperial College London
  • Ruth Misener, Department of Computing, Imperial College London

NIPS '21: Proceedings of the 35th International Conference on Neural Information Processing Systems, December 2021, Article No. 235, Pages 3068–3080

Published: 10 June 2024

ABSTRACT

This paper introduces a class of mixed-integer formulations for trained ReLU neural networks. The approach balances model size and tightness by partitioning node inputs into a number of groups and forming the convex hull over the partitions via disjunctive programming. At one extreme, one partition per input recovers the convex hull of a node, i.e., the tightest possible formulation for each node. For fewer partitions, we develop smaller relaxations that approximate the convex hull, and show that they outperform existing formulations. Specifically, we propose strategies for partitioning variables based on theoretical motivations and validate these strategies using extensive computational experiments. Furthermore, the proposed scheme complements known algorithmic approaches, e.g., optimization-based bound tightening captures dependencies within a partition.
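
To make the idea concrete, consider a single node computing y = max(0, w'x + b) over bounded inputs. The scheme described in the abstract splits the input indices into partitions S_1, ..., S_N and introduces one auxiliary variable z_k per partition, carrying the "active" share of the partition sum of w_i x_i over i in S_k; one partition per input yields the node's convex hull, while fewer partitions give smaller intermediate relaxations. The gurobipy sketch below encodes one node in this spirit. It is an illustrative reading of the abstract, not the paper's exact formulation: the weights, input bounds, partition choice, and names (parts, z, sigma, LB, UB) are all assumptions made here.

    # Illustrative sketch (not the paper's exact code): a partition-based
    # MIP encoding of one ReLU node y = max(0, w'x + b), with N = 2
    # partitions of the input indices. Weights, input bounds, and the
    # partition choice are made-up example data.
    import gurobipy as gp
    from gurobipy import GRB

    w = [1.0, -2.0, 0.5, 1.5]      # example trained weights
    b = -0.5                       # example trained bias
    parts = [[0, 1], [2, 3]]       # partitions S_1, S_2 of the input indices

    m = gp.Model("partitioned_relu_node")
    x = m.addVars(len(w), lb=-1.0, ub=1.0, name="x")   # bounded inputs
    y = m.addVar(lb=0.0, name="y")                     # ReLU output, y >= 0
    sigma = m.addVar(vtype=GRB.BINARY, name="sigma")   # 1 if node is active
    z = m.addVars(len(parts), lb=-GRB.INFINITY, name="z")

    for k, S in enumerate(parts):
        # Interval bounds on the partition sum of w_i * x_i over i in S,
        # using the box bounds x_i in [-1, 1].
        LB = sum(-abs(w[i]) for i in S)
        UB = sum(abs(w[i]) for i in S)
        expr = gp.quicksum(w[i] * x[i] for i in S)
        # z_k carries the partition sum on the active branch (sigma = 1) ...
        m.addConstr(z[k] >= LB * sigma)
        m.addConstr(z[k] <= UB * sigma)
        # ... and the remainder expr - z_k belongs to the inactive branch.
        m.addConstr(expr - z[k] >= LB * (1 - sigma))
        m.addConstr(expr - z[k] <= UB * (1 - sigma))

    # Output equals the active copy of the pre-activation; together with
    # y >= 0 this enforces w'x + b >= 0 whenever sigma = 1.
    m.addConstr(y == z.sum() + b * sigma)
    # The inactive copy of the pre-activation must be nonpositive.
    m.addConstr(gp.quicksum(w[i] * x[i] for i in range(len(w)))
                - z.sum() + b * (1 - sigma) <= 0)

    m.setObjective(y, GRB.MAXIMIZE)    # e.g., bound the node's maximum output
    m.optimize()
    print(f"max y = {y.X:.3f}")

With a single partition this collapses to a standard big-M-style encoding, and with one index per partition it matches the per-node convex hull; intermediate partition counts trade formulation size against tightness, which is the balance the abstract describes.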


Supplemental Material

3540261.3540496_supp.pdf (833.5 KB), available for download.




Published in

NIPS '21: Proceedings of the 35th International Conference on Neural Information Processing Systems

December 2021, 30517 pages

ISBN: 9781713845393

Editors: M. Ranzato, A. Beygelzimer, Y. Dauphin, P.S. Liang, J. Wortman Vaughan

Copyright © 2021 Neural Information Processing Systems Foundation, Inc.

Publisher: Curran Associates Inc., Red Hook, NY, United States

Qualifiers: research-article, refereed limited
