PrIC3 checks if a given Markov decision process is safe

Application domain/field

Symbolic model checking of Markov decision processes (MDPs).

Type of tool

Probabilistic model checker

Expected input

Format:

Expected output

true or false. If the tool returns false, it also provides a possible counterexample. A false result can have two causes: either the MDP is indeed unsafe, or the chosen heuristic was inappropriate. The counterexample consists of a subsystem of states of the MDP witnessing Pr^max(s_I ⊨ ◇B) > λ.
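
Purely as an illustration of the property being decided (not of PrIC3's algorithm), the following Python sketch computes Pr^max(s_I ⊨ ◇B) on a small hand-made MDP by value iteration and compares it against a threshold λ; the MDP, its state names and the value of λ are all hypothetical.

# Illustrative sketch only (not PrIC3's algorithm): compute Pr^max of reaching
# the bad states B on a hypothetical 4-state MDP via value iteration and compare
# the result with a threshold lambda, i.e. the question PrIC3 decides.

# Hypothetical MDP: each state maps to a list of actions; each action is a list
# of (successor, probability) pairs.
MDP = {
    0: [[(1, 0.5), (3, 0.5)], [(2, 0.4), (3, 0.6)]],  # initial state, two actions
    1: [[(2, 0.9), (3, 0.1)]],
    2: [[(2, 1.0)]],                                   # bad state, absorbing
    3: [[(3, 1.0)]],                                   # safe sink, absorbing
}
BAD, INITIAL, LAMBDA = {2}, 0, 0.3

def pr_max_reach(mdp, bad, eps=1e-9):
    """Maximal reachability probabilities via value iteration."""
    v = {s: (1.0 if s in bad else 0.0) for s in mdp}
    while True:
        new = {
            s: 1.0 if s in bad else max(
                sum(p * v[t] for t, p in action) for action in actions
            )
            for s, actions in mdp.items()
        }
        if max(abs(new[s] - v[s]) for s in mdp) < eps:
            return new
        v = new

prob = pr_max_reach(MDP, BAD)[INITIAL]
print("Pr^max(s_I |= <>B) =", round(prob, 6))          # 0.45 for this toy MDP
print("safe" if prob <= LAMBDA else "unsafe: a counterexample subsystem exists")

Here the maximal reachability probability 0.45 exceeds λ = 0.3, so this toy MDP is unsafe; on such an input the tool would answer false together with a witnessing subsystem.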

Internals

An extension of IC3 (property directed reachability) to quantitative reachability in MDPs. It builds on efficient data structures from the probabilistic model checker Storm and uses the SMT solver Z3.
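
PrIC3 itself builds a sequence of frames incrementally in the IC3 style described in the paper. Purely to illustrate the kind of constraint an SMT solver such as Z3 handles in this setting, the hypothetical sketch below uses Z3's Python bindings to search for a single inductive frame F: probability bounds per state that dominate the Bellman backup of every action, are 1 on bad states, and are at most λ at the initial state. Such a frame certifies Pr^max(s_I ⊨ ◇B) ≤ λ. The toy MDP is the same made-up one as above; none of this is taken from the PrIC3 code base.

# Hypothetical sketch (not PrIC3's implementation): use Z3 to search for an
# inductive frame F that proves safety of the toy MDP for a threshold lambda.
from z3 import Real, RealVal, Solver, Sum, sat

MDP = {
    0: [[(1, 0.5), (3, 0.5)], [(2, 0.4), (3, 0.6)]],
    1: [[(2, 0.9), (3, 0.1)]],
    2: [[(2, 1.0)]],            # bad state, absorbing
    3: [[(3, 1.0)]],            # safe sink, absorbing
}
BAD, INITIAL, LAMBDA = {2}, 0, 0.5

F = {s: Real(f"F_{s}") for s in MDP}   # one probability bound per state
solver = Solver()
for s, actions in MDP.items():
    solver.add(F[s] >= 0, F[s] <= 1)
    if s in BAD:
        solver.add(F[s] == 1)
    for action in actions:
        # F must dominate the Bellman backup of every action (F >= Phi(F)).
        solver.add(F[s] >= Sum([RealVal(p) * F[t] for t, p in action]))
solver.add(F[INITIAL] <= LAMBDA)

if solver.check() == sat:
    model = solver.model()
    print("inductive frame found, MDP is safe for lambda =", LAMBDA)
    print({s: model[F[s]] for s in MDP})
else:
    print("no inductive frame: Pr^max at the initial state exceeds", LAMBDA)

Because a frame with F ≥ Φ(F) over-approximates the least fixed point of the Bellman operator, a satisfying assignment is an inductive certificate of safety; here λ = 0.5 lies above the true maximal probability 0.45, so Z3 finds such a frame.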

Comments

It is called a 'prototypical implementation' in the CAV '20 paper.

Links

Repository: https://github.com/moves-rwth/PrIC3

Related papers

PrIC3: Property Directed Reachability for MDPs (CAV 2020)

Last publication date

14 July 2020

ProVerB specific



ProVerB is a part of SLEBoK. Last updated: February 2023.