PV4 Reluplex produces a satisfiability result for a formula

An SMT solver for the theory of linear real arithmetic extended with ReLU constraints. ReLU (Rectified Linear Unit) is a specific kind of activation function used in deep neural networks (DNNs).

Application domain/field

Type of tool

SMT solver

Expected output

SAT, UNSAT, or TIMEOUT. UNSAT indicates that a property holds; SAT indicates that the property does not hold.
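The verdicts above follow the usual verification-as-satisfiability convention: the solver is given the negation of the property, so UNSAT of the negation means the property holds. A minimal sketch of this idea (not Reluplex itself; the toy single-neuron network and the function name are illustrative assumptions):

```python
# Hedged sketch: verification as satisfiability of the negated property,
# for a toy one-neuron network relu(w*x + b) over an input interval.
# UNSAT of the negation => the property "output <= bound" holds.

def negation_satisfiable(w, b, lo, hi, bound):
    """Is there an x in [lo, hi] with relu(w*x + b) > bound?
    Each ReLU phase reduces the question to 1-D linear reasoning."""
    # Inactive phase: output is 0, so the negation needs 0 > bound.
    if 0.0 > bound:
        return True
    # Active phase: output is w*x + b (>= 0); being linear, its
    # maximum over [lo, hi] is attained at an interval endpoint.
    pre_max = max(w * lo + b, w * hi + b)
    return pre_max >= 0.0 and pre_max > bound

# Property: relu(2x - 1) <= 1 for all x in [0, 1].
verdict = "UNSAT" if not negation_satisfiable(2.0, -1.0, 0.0, 1.0, 1.0) else "SAT"
print(verdict)  # UNSAT: the property holds
```

Reluplex answers the same kind of query, but for full networks and via a lazy treatment of ReLU constraints rather than this brute-force phase check.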

Internals

Verifying DNNs is difficult because it is beyond the reach of general-purpose tools such as linear programming and SMT solvers. The authors therefore build an SMT solver tailored to DNNs, focusing on networks that use a specific kind of activation function, the Rectified Linear Unit (ReLU).
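What makes ReLU tractable is that it is piecewise linear: in each of its two phases (inactive, output 0; active, output equal to the input) the constraint is purely linear. A minimal sketch of this case-splitting idea for a single neuron (illustrative only; Reluplex itself splits lazily inside a modified simplex procedure rather than enumerating phases up front):

```python
# Hedged sketch: bounding relu(w*x + b) over x in [lo, hi] by splitting
# the ReLU into its two linear phases and keeping only feasible ones.

def output_bounds_via_split(w, b, lo, hi):
    """Return (min, max) of relu(w*x + b) for x in [lo, hi]."""
    # Pre-activation range; linear, so extremes are at the endpoints.
    pre_lo = min(w * lo + b, w * hi + b)
    pre_hi = max(w * lo + b, w * hi + b)
    phases = []
    if pre_lo < 0:                      # inactive phase is feasible
        phases.append((0.0, 0.0))       # output pinned to 0
    if pre_hi > 0:                      # active phase is feasible
        phases.append((max(pre_lo, 0.0), pre_hi))
    return (min(p[0] for p in phases), max(p[1] for p in phases))

print(output_bounds_via_split(2.0, -1.0, 0.0, 1.0))  # (0.0, 1.0)
```

For a network with n ReLUs an eager split yields up to 2^n linear problems; avoiding that blow-up by splitting only when necessary is the core contribution of the Reluplex algorithm.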

Comments

Note: the Reluplex repository is no longer maintained; the algorithm is now implemented in Marabou.
Keywords: DNN, neural network, SMT

Links

Related papers

Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks (CAV '17)

Last publication date

13 July 2017

Related tools

ProVerB specific




ProVerB is a part of SLEBoK. Last updated: February 2023.