FDAPT: Federated Domain-adaptive Pre-training for Language Models

07/12/2023
by   Lekang Jiang, et al.

Combining Domain-adaptive Pre-training (DAPT) with Federated Learning (FL) can enhance model adaptation by leveraging more sensitive and distributed data while preserving data privacy. However, few studies have focused on this method. Therefore, we conduct the first comprehensive empirical study to evaluate the performance of Federated Domain-adaptive Pre-training (FDAPT). We demonstrate that FDAPT can maintain competitive downstream task performance to the centralized baseline in both IID and non-IID situations. Furthermore, we propose a novel algorithm, Frozen Federated Domain-adaptive Pre-training (FFDAPT). FFDAPT improves the computational efficiency by 12.1% on average and exhibits similar downstream task performance to standard FDAPT, with general performance fluctuations remaining less than 1%. Finally, through a critical evaluation of our work, we identify promising future research directions for this new research area.
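
To make the setup concrete, here is a minimal sketch of one FDAPT-style round, with the FFDAPT-style twist of freezing a block of the model (here, the embedding layer) on every client before local training. This is an illustrative assumption, not the authors' implementation: the toy model `TinyLM`, the helper names, and the choice of which layer to freeze are all hypothetical, and the local objective is a stand-in for real masked-language-model pre-training.

```python
import copy
import torch
import torch.nn as nn

# Hypothetical toy model standing in for a pre-trained LM backbone.
class TinyLM(nn.Module):
    def __init__(self, vocab=100, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)  # frozen in the FFDAPT-style variant
        self.body = nn.Linear(dim, dim)
        self.head = nn.Linear(dim, vocab)

    def forward(self, x):
        return self.head(torch.relu(self.body(self.embed(x))))

def local_dapt_step(model, batch, lr=1e-3):
    """One local domain-adaptive pre-training step (toy LM-style loss)."""
    opt = torch.optim.SGD(
        (p for p in model.parameters() if p.requires_grad), lr=lr
    )
    logits = model(batch)
    loss = nn.functional.cross_entropy(
        logits.view(-1, logits.size(-1)), batch.view(-1)
    )
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

def fedavg(global_model, client_models):
    """Average parameters across clients (plain FedAvg aggregation)."""
    state = global_model.state_dict()
    for name in state:
        state[name] = torch.stack(
            [cm.state_dict()[name] for cm in client_models]
        ).mean(0)
    global_model.load_state_dict(state)

# One round: each client gets a copy of the global model, freezes the
# embedding block (saving local compute), trains on its own domain
# corpus, and the server averages the results.
global_model = TinyLM()
clients_data = [torch.randint(0, 100, (4, 8)) for _ in range(3)]  # stand-in corpora
client_models = []
for data in clients_data:
    cm = copy.deepcopy(global_model)
    cm.embed.weight.requires_grad_(False)  # frozen block never receives gradients
    local_dapt_step(cm, data)
    client_models.append(cm)
fedavg(global_model, client_models)  # frozen weights are identical, so averaging is a no-op for them
```

Because the frozen block starts identical on every client and never receives gradients, averaging it is a no-op, which is one plausible reading of how freezing reduces per-round computation (and potentially communication) while leaving downstream behavior close to standard FDAPT.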
