
NARRepair: Non-Autoregressive Code Generation Model for Automatic Program Repair

Zhenyu Yang, Zhen Yang, Zhongxing Yu

TL;DR

NARRepair significantly improves inference speed while maintaining high repair accuracy.

Abstract

With the advancement of deep learning techniques, the performance of Automatic Program Repair (APR) techniques has reached a new level. Previous deep learning-based APR techniques essentially modify program statements in an Autoregressive (AR) manner, predicting future tokens based on past ones. Because of this word-by-word generation, AR-based APR techniques incur substantial inference latency, which hinders the widespread adoption of APR in real-life software development. To address this issue, we apply the Non-Autoregressive (NAR) method to the APR task, which outputs target code in parallel and thus avoids large inference delays. To effectively adapt the NAR manner to the APR task, in this paper we propose NARRepair, the first NAR code generation model customized for APR. NARRepair features three major novelties: 1) using repair actions to alleviate the over-correction issue, 2) extracting dependency information from the AST to alleviate the lack of inter-word dependency information, and 3) employing two-stage decoding to alleviate the lack of contextual information. We evaluated NARRepair on three widely used datasets in the APR community, and the results show that our technique significantly improves inference speed while maintaining high repair accuracy.
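The core speed argument of the abstract can be made concrete with a toy sketch (this is illustrative only, not the paper's model): an AR decoder needs one sequential model call per output token, while a NAR decoder emits all positions in a single parallel pass. The functions `predict_next` and `predict_all` below are hypothetical stand-ins for the respective models.

```python
def predict_next(prefix):
    # Toy stand-in for an AR model: "predicts" the next token
    # from the tokens generated so far.
    return len(prefix)

def predict_all(length):
    # Toy stand-in for a NAR model: predicts every position
    # independently, in one forward pass.
    return list(range(length))

def ar_decode(length):
    """AR decoding: O(n) sequential model calls, one per token."""
    tokens = []
    for _ in range(length):
        tokens.append(predict_next(tokens))
    return tokens

def nar_decode(length):
    """NAR decoding: a single parallel pass over all positions."""
    return predict_all(length)

print(ar_decode(5))   # → [0, 1, 2, 3, 4]
print(nar_decode(5))  # → [0, 1, 2, 3, 4]
```

The trade-off the paper targets follows directly: the NAR pass is fast but each position is predicted without seeing its neighbors, which is why NARRepair adds repair actions, AST dependency information, and two-stage decoding to recover inter-token dependencies.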

Paper Structure

This paper contains 26 sections, 11 equations, 9 figures, 6 tables.

Figures (9)

  • Figure 1: The inference process of autoregressive (AR) model and non-autoregressive (NAR) model.
  • Figure 2: An overview of the NARRepair architecture.
  • Figure 3: A bug (Lang 61 from Defects4J) fixed by NARRepair with repair action predictor.
  • Figure 4: The example of generating AST and inter-word dependency matrix for the text "int add(int a, int b) {return a+b;}".
  • Figure 5: Structure of the Inter-word Dependency Extractor.
  • ...and 4 more figures