DeepSplit: Scalable Verification of Deep Neural Networks via Operator Splitting

Analyzing the worst-case performance of deep neural networks against input perturbations amounts to solving a large-scale non-convex optimization problem, for which several past works have proposed convex relaxations as a promising alternative. However, even for reasonably-sized neural networks, these relaxations are not tractable, and so must be replaced by even weaker relaxations in practice. In this work, we propose a novel operator splitting method that can directly solve a convex relaxation of the problem to high accuracy, by splitting it into smaller sub-problems that often have analytical solutions. The method is modular, scales to very large problem instances, and comprises operations that are amenable to fast parallelization with GPU acceleration. We demonstrate our method in bounding the worst-case performance of large convolutional networks in image classification and reinforcement learning settings, and in reachability analysis of neural network dynamical systems.
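
The abstract describes an operator splitting (ADMM) approach in which a convex relaxation is decomposed into sub-problems with closed-form solutions. Purely to illustrate that flavor of computation, the Python sketch below runs a two-block scaled ADMM on a toy problem: lower-bounding a linear margin c^T x over an l-infinity ball around a nominal input, where both updates reduce to a shift and a clip. The toy problem, the function name admm_box_lp, and all parameter choices are illustrative assumptions; this is not the DeepSplit formulation, which splits the relaxation of a full ReLU network layer by layer.

# Minimal two-block ADMM sketch (scaled form) on a toy verification-style problem:
# lower-bound the margin c^T x over an l_inf ball around a nominal input x0.
# Illustrative only; NOT the DeepSplit algorithm from the paper.
import numpy as np

def admm_box_lp(c, x0, eps, rho=1.0, iters=200):
    """Minimize c^T x subject to |x - x0|_inf <= eps via ADMM.

    Splitting: f(x) = c^T x, g(z) = indicator of the box, constraint x = z.
    Both updates are analytical (a shift and a clip), mirroring the kind of
    cheap, parallelizable sub-problems operator splitting relies on.
    """
    lo, hi = x0 - eps, x0 + eps
    z = x0.copy()                   # start from the nominal input
    u = np.zeros_like(x0)           # scaled dual variable
    for _ in range(iters):
        x = z - u - c / rho         # argmin_x c @ x + (rho/2) * ||x - z + u||^2
        z = np.clip(x + u, lo, hi)  # projection onto the input box (closed form)
        u = u + x - z               # dual / running-residual update
    return z, float(c @ z)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c = rng.normal(size=10)         # hypothetical margin direction
    x0 = rng.normal(size=10)        # nominal input
    x_star, lower_bound = admm_box_lp(c, x0, eps=0.1)
    # Closed-form optimum of the box LP, for comparison.
    exact = float(c @ np.where(c > 0, x0 - 0.1, x0 + 0.1))
    print(lower_bound, exact)       # the two values should agree closely

The appeal reported in the abstract comes from the fact that such closed-form updates stay cheap and parallelizable even when the number of variables is very large.
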

Bibliographic Details
Main Authors: Shaoru Chen, Eric Wong, J. Zico Kolter, Mahyar Fazlyab
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Open Journal of Control Systems
Subjects: ADMM, neural network verification, operator splitting
Online Access: https://ieeexplore.ieee.org/document/9811356/
author Shaoru Chen
Eric Wong
J. Zico Kolter
Mahyar Fazlyab
collection DOAJ
description Analyzing the worst-case performance of deep neural networks against input perturbations amounts to solving a large-scale non-convex optimization problem, for which several past works have proposed convex relaxations as a promising alternative. However, even for reasonably-sized neural networks, these relaxations are not tractable, and so must be replaced by even weaker relaxations in practice. In this work, we propose a novel operator splitting method that can directly solve a convex relaxation of the problem to high accuracy, by splitting it into smaller sub-problems that often have analytical solutions. The method is modular, scales to very large problem instances, and comprises operations that are amenable to fast parallelization with GPU acceleration. We demonstrate our method in bounding the worst-case performance of large convolutional networks in image classification and reinforcement learning settings, and in reachability analysis of neural network dynamical systems.
format Article
id doaj.art-b3ba3c31b7f74537b58c652cd13bc45c
institution Directory Open Access Journal
issn 2694-085X
language English
publishDate 2022-01-01
publisher IEEE
record_format Article
series IEEE Open Journal of Control Systems
doi 10.1109/OJCSYS.2022.3187429
citation IEEE Open Journal of Control Systems, vol. 1, pp. 126-140, 2022 (IEEE Xplore document 9811356)
author_affiliation Shaoru Chen, University of Pennsylvania, Philadelphia, PA, USA (https://orcid.org/0000-0001-9416-0627)
author_affiliation Eric Wong, Massachusetts Institute of Technology, Cambridge, MA, USA (https://orcid.org/0000-0002-8568-6659)
author_affiliation J. Zico Kolter, Carnegie Mellon University, Pittsburgh, PA, USA (https://orcid.org/0000-0002-8106-5759)
author_affiliation Mahyar Fazlyab, Johns Hopkins University, Baltimore, MD, USA (https://orcid.org/0000-0001-9695-6178)
title DeepSplit: Scalable Verification of Deep Neural Networks via Operator Splitting
topic ADMM
neural network verification
operator splitting
url https://ieeexplore.ieee.org/document/9811356/