The counterexamples produced by model checkers are often lengthy and difficult to understand. In practical verification, showing the existence of a (potential) bug is not enough: the error must be understood, determined not to be the result of a faulty specification or assumptions, and, finally, located and corrected. The explain tool uses distance metrics on program executions to provide automated assistance in understanding and localizing errors in ANSI-C programs. explain is integrated with CBMC, a bounded model checker for the C language, and features a GUI front-end that presents error explanations to the user. © Springer-Verlag Berlin Heidelberg 2004.
Groce, A., Kroening, D., & Lerda, F. (2004). Understanding counterexamples with explain. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3114, 453–456. https://doi.org/10.1007/978-3-540-27813-9_35