Multilingual Adapter-based Knowledge Aggregation on Code Summarization for Low-Resource Languages

07/15/2023
by Iman Saberi, et al.

Multilingual fine-tuning (fine-tuning a multilingual Pre-trained Language Model on several languages at once) has been shown to improve performance on downstream tasks. However, different programming languages have different structural properties, so fine-tuning a model on a multilingual dataset may be sub-optimal or may even degrade the intended performance. In this study, we propose a new modular component architecture, AdvFusion, that leverages the different aspects of programming languages for a popular, low-resource target programming language, Ruby. Our results show that AdvFusion efficiently extracts useful features from different programming languages and outperforms the existing state-of-the-art multilingual fine-tuning by 12% on the Code Summarization task.
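
The abstract does not spell out AdvFusion's internals, but "adapter-based knowledge aggregation" suggests an AdapterFusion-style design: a small bottleneck adapter is pre-trained per source language, and a learned attention layer decides, per token, how much to borrow from each language when processing the target language (Ruby). Below is a minimal PyTorch sketch of that general pattern, assuming this structure; all class names, dimensions, and the language list are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(self.act(self.down(h)))

class AdapterFusion(nn.Module):
    """Attention over the outputs of per-language adapters.

    Each source language contributes a pre-trained adapter; the fusion
    layer learns, per token, how much to attend to each language's
    representation when summarizing the target language.
    """
    def __init__(self, hidden_dim: int, languages: list[str]):
        super().__init__()
        self.adapters = nn.ModuleDict({lang: Adapter(hidden_dim) for lang in languages})
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) from a transformer layer
        outputs = torch.stack([a(h) for a in self.adapters.values()], dim=2)
        # outputs: (batch, seq_len, n_langs, hidden_dim)
        q = self.query(h).unsqueeze(2)                   # (B, S, 1, H)
        k = self.key(outputs)                            # (B, S, L, H)
        v = self.value(outputs)                          # (B, S, L, H)
        scores = (q * k).sum(-1) / (h.size(-1) ** 0.5)   # (B, S, L)
        weights = scores.softmax(dim=-1).unsqueeze(-1)   # (B, S, L, 1)
        return (weights * v).sum(dim=2)                  # (B, S, H)

# Hypothetical usage: fuse adapters for high-resource source languages
# while processing the low-resource target, Ruby.
fusion = AdapterFusion(hidden_dim=768, languages=["python", "java", "go"])
hidden = torch.randn(2, 16, 768)  # dummy transformer hidden states
fused = fusion(hidden)            # (2, 16, 768)
```

In this kind of setup the base transformer and the per-language adapters are typically kept frozen, and only the fusion (query/key/value) parameters are trained on the low-resource target data, which is what makes adapter-based aggregation parameter-efficient.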
