GLAD: Neural Predicate Synthesis to Repair Omission Faults

04/14/2022
by Sungmin Kang, et al.

Existing template- and learning-based APR tools have successfully found patches for many benchmark faults. However, our analysis of existing results shows that omission faults pose a significant challenge to these techniques. For template-based approaches, omission faults provide no location to apply templates to; for learning-based approaches that formulate repair as Neural Machine Translation (NMT), omission faults similarly provide no faulty code to translate. To address these issues, we propose GLAD, a novel learning-based repair technique that specifically targets if-clause synthesis. GLAD does not require a faulty line, as it is based on generative Language Models (LMs) instead of machine translation; consequently, it can repair omission faults. GLAD intelligently constrains the language model using a type-based grammar. Further, it efficiently reduces the validation cost by performing dynamic ranking of candidate patches using a debugger. Thanks to the shift from translation to synthesis, GLAD is highly orthogonal to existing techniques: it can correctly fix 16 Defects4J v1.2 faults that previous NMT-based techniques could not, while maintaining a reasonable runtime cost, underscoring its utility as an APR tool and its potential to complement existing tools in practice. An inspection of the bugs that GLAD fixes reveals that it can quickly generate expressions that would be challenging for other techniques.
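As a concrete, hedged illustration of the fault class GLAD targets (this example is not taken from the paper), consider a hypothetical missing null-guard in Java. Because the guard was omitted entirely, there is no faulty line for a template- or NMT-based tool to rewrite; the fix must be a newly synthesized if-clause. The class, method names, and predicate below are assumptions chosen for illustration.

```java
// Hypothetical omission fault and its repair; names and the
// synthesized predicate are illustrative assumptions, not GLAD output.
public class OmissionFaultExample {
    // Buggy version throws a NullPointerException when input is null,
    // because the guard below was omitted entirely. A translation-based
    // repair tool has no faulty line to translate here.
    public static int countTokens(String input) {
        // Repair: an if-clause inserted as a brand-new line, the kind
        // of predicate an if-clause synthesizer would need to generate.
        if (input == null || input.isEmpty()) {
            return 0;
        }
        return input.split("\\s+").length;
    }

    public static void main(String[] args) {
        System.out.println(countTokens("repair omission faults")); // 3
        System.out.println(countTokens(null));                     // 0 after repair
    }
}
```

Synthesizing the inserted predicate from scratch, rather than translating an existing buggy line, is what distinguishes this setting from the NMT formulation of repair.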
