Too much information: CDCL solvers need to forget and perform restarts
Conflict-driven clause learning (CDCL) is a remarkably successful paradigm for solving the satisfiability problem of propositional logic. Instead of a simple depth-first backtracking approach, this kind of solver learns the reason behind each conflict it encounters in the form of an additional clause. However, despite the enormous success of CDCL solvers, there is still only a shallow understanding of which factors influence the performance of these solvers and in what way. This paper demonstrates, quite surprisingly, that clause learning (without the ability to get rid of some clauses) can not only improve the runtime but can also often degrade it dramatically. By conducting an extensive empirical analysis, we find that the runtime distributions of CDCL solvers are multimodal. This multimodality can be seen as a reason for the deterioration phenomenon described above. At the same time, it also indicates why clause learning in combination with clause deletion and restarts has become the de facto standard of SAT solving despite this phenomenon. As a final contribution, we show that Weibull mixture distributions can accurately describe these multimodal distributions. Thus, adding new clauses to a base instance has the inherent effect of making the runtime distribution long-tailed. This insight provides a theoretical explanation for why the techniques of restarts and clause deletion are useful in CDCL solvers.
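To illustrate the kind of statistical model the abstract refers to, the following is a minimal sketch of fitting a two-component Weibull mixture to solver runtimes by maximum likelihood in Python. The synthetic data, the two-component choice, and all parameter values are illustrative assumptions and do not reproduce the paper's benchmarks or its actual fitting procedure.

```python
# Minimal sketch: fitting a two-component Weibull mixture to solver runtimes.
# The data below is synthetic and purely illustrative; the paper's benchmark
# instances and fitting methodology are not reproduced here.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
# Synthetic bimodal "runtimes": a fast mode and a heavier, slow mode.
runtimes = np.concatenate([
    stats.weibull_min(c=1.5, scale=10.0).rvs(300, random_state=rng),
    stats.weibull_min(c=0.8, scale=200.0).rvs(100, random_state=rng),
])

def neg_log_likelihood(params, x):
    """Negative log-likelihood of a two-component Weibull mixture."""
    w, c1, s1, c2, s2 = params
    pdf = (w * stats.weibull_min.pdf(x, c=c1, scale=s1)
           + (1.0 - w) * stats.weibull_min.pdf(x, c=c2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))

# Rough starting values and box constraints keep the optimizer in a sane region.
x0 = [0.5, 1.0, np.median(runtimes), 1.0, 10 * np.median(runtimes)]
bounds = [(0.01, 0.99), (0.05, 10), (1e-3, None), (0.05, 10), (1e-3, None)]
result = optimize.minimize(neg_log_likelihood, x0, args=(runtimes,),
                           bounds=bounds, method="L-BFGS-B")

w, c1, s1, c2, s2 = result.x
print(f"mixture weight: {w:.2f}")
print(f"component 1: shape={c1:.2f}, scale={s1:.1f}")
print(f"component 2: shape={c2:.2f}, scale={s2:.1f}")
# A fitted shape parameter well below 1 in one component indicates a
# heavy-tailed mode, consistent with the long-tailed runtimes the abstract
# attributes to added clauses.
```

Direct likelihood maximization is used here only for brevity; an EM-style procedure would be an equally reasonable way to fit such a mixture.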