PV4 ⊧ Reluplex
Reluplex produces a satisfiability result for a formula. It is an SMT solver for the theory of linear real arithmetic with ReLU constraints. ReLU (Rectified Linear Unit) is a specific kind of activation function used in deep neural networks (DNNs).
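To illustrate the kind of constraint involved, the sketch below encodes a single ReLU node y = max(0, x) over linear real arithmetic. It uses the Z3 Python API purely for illustration (Reluplex has its own solver core and does not expose this API); the variable names, input bounds, and query are assumptions made up for the example.

```python
# Illustrative only: a single ReLU constraint y = max(0, x) encoded over
# linear real arithmetic with Z3's Python API (not Reluplex's own interface).
from z3 import Real, Solver, If

x, y = Real("x"), Real("y")

s = Solver()
s.add(y == If(x >= 0, x, 0))   # ReLU: y = x when x >= 0, otherwise y = 0
s.add(x >= -1, x <= 1)         # example input bounds (made up)
s.add(y > 2)                   # example query: can the output exceed 2?

print(s.check())               # "unsat": with |x| <= 1 the ReLU output stays in [0, 1]
```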
Application domain/field
- Neural networks
- Deep neural networks (DNNs)
- Activation functions
- SMT solving
Type of tool
SMT solver
Expected output
SAT, UNSAT, or TIMEOUT. The query encodes the negation of the property to be checked, so UNSAT indicates that the property holds, while SAT indicates that the property does not hold (the satisfying assignment is a counterexample).
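A hedged sketch of this output convention follows, again using Z3's Python API rather than Reluplex itself: the negation of the desired property is asserted, so an unsat answer means the property holds, and a sat answer yields a counterexample. The one-ReLU "network", input region, and property are assumptions made up for the example.

```python
# Sketch of the SAT/UNSAT convention (Z3 used for illustration, not Reluplex).
from z3 import Real, Solver, If, unsat

x, y = Real("x"), Real("y")

s = Solver()
s.add(y == If(x >= 0, x, 0))   # toy "network": one ReLU node
s.add(x >= 0, x <= 1)          # input region of interest (made up)
s.add(y > 1)                   # negation of the property "y <= 1 on this region"

if s.check() == unsat:
    print("UNSAT: the property holds on the input region")
else:
    print("SAT: property violated, counterexample:", s.model())
```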