
Constraint Caching Revisited

2020 
Satisfiability Modulo Theories (SMT) solvers play a major role in the success of symbolic execution as a program analysis technique. However, they are often still the main performance bottleneck. One approach to improving SMT performance is caching. The key question we consider here is whether caching strategies are still worthwhile given the performance improvements in SMT solvers. Two main caching strategies exist: either simple sat/unsat results are stored, or entire solutions (i.e., models) are stored for later reuse. We implement both approaches in the Green framework and compare them with the popular Z3 constraint solver. We focus on linear integer arithmetic constraints; this is typical for symbolic execution, and both caching strategies and constraint solvers work well in this domain. We use both classic symbolic execution and concolic execution to see whether caching behaves differently in these settings. We consider only time consumption; memory use is typically negligible. Our results suggest that (1) the key to caching performance is factoring constraints into independent parts, and this by itself is often sufficient; (2) Z3's incremental mode often outperforms caching; and (3) reusing models fares better for concolic than for classic symbolic execution.
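The sketch below illustrates the two ideas the abstract refers to: factoring a conjunction of constraints into variable-independent slices, and caching sat/unsat results per slice so later queries can reuse them. It is a minimal, illustrative sketch only, not the Green implementation; all names (Constraint, factorize, CachingSolver) and the string-based canonical key are hypothetical simplifications.

```python
class Constraint:
    """A single atomic constraint over a set of integer variables."""
    def __init__(self, expr, variables):
        self.expr = expr                      # e.g. "x + y <= 3"
        self.variables = frozenset(variables)


def factorize(constraints):
    """Group constraints into independent slices: two constraints land in
    the same slice iff they are connected through shared variables."""
    parent = {}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]     # path halving
            v = parent[v]
        return v

    def union(a, b):
        parent.setdefault(a, a)
        parent.setdefault(b, b)
        parent[find(a)] = find(b)

    for c in constraints:
        vs = list(c.variables)
        parent.setdefault(vs[0], vs[0])
        for v in vs[1:]:
            union(vs[0], v)

    slices = {}
    for c in constraints:
        root = find(next(iter(c.variables)))
        slices.setdefault(root, []).append(c)
    return list(slices.values())


class CachingSolver:
    """Caches sat/unsat per slice; calls the underlying solver
    (e.g. Z3) only on cache misses."""
    def __init__(self, solve):
        self.solve = solve                    # list[Constraint] -> bool
        self.cache = {}

    def is_sat(self, constraints):
        for slice_ in factorize(constraints):
            # Naive key: the set of constraint strings in the slice.
            key = frozenset(c.expr for c in slice_)
            if key not in self.cache:
                self.cache[key] = self.solve(slice_)
            if not self.cache[key]:           # one unsat slice => whole query unsat
                return False
        return True
```

In practice a cache like Green's also canonicalizes the slices (for example by renaming variables to a standard form) before hashing, so that syntactically different but structurally identical slices hit the same cache entry; the model-reuse strategy additionally stores the satisfying assignment and first checks whether a cached model already satisfies the new slice before invoking the solver.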