Secure Shapley Value for Cross-Silo Federated Learning

09/11/2022
by   Shuyuan Zheng, et al.

The Shapley value (SV) is a fair and principled metric for contribution evaluation in cross-silo federated learning (cross-silo FL), in which organizations, i.e., clients, collaboratively train prediction models under the coordination of a parameter server. However, existing SV calculation methods for FL assume that the server can access the raw FL models and public test data, an assumption that may not hold in practice given emerging privacy attacks on FL models and the fact that test data may be clients' private assets. Hence, in this paper, we investigate the problem of secure SV calculation for cross-silo FL. We first propose a one-server solution, HESV, which relies solely on homomorphic encryption (HE) for privacy protection and suffers from considerable efficiency limitations. To overcome these limitations, we further propose an efficient two-server protocol, SecSV, with the following novel features. First, SecSV utilizes a hybrid privacy protection scheme to avoid ciphertext-ciphertext multiplications between test data and models, which are extremely expensive under HE. Second, we propose a more efficient secure matrix multiplication method for SecSV. Third, SecSV strategically identifies and skips some test samples without significantly degrading evaluation accuracy. Our experiments demonstrate that SecSV is 5.9-18.0 times as fast as HESV, with only a limited loss in the accuracy of the calculated SVs.
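For context, the Shapley value in this setting follows the standard game-theoretic definition. With client set $N$ and a utility function $v$ (for example, the test performance of the model obtained from a coalition $S \subseteq N$), the SV of client $i$ is

$$\phi_i = \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,(|N|-|S|-1)!}{|N|!} \bigl( v(S \cup \{i\}) - v(S) \bigr).$$

As a plain, non-secure illustration of the quantity any SV protocol must ultimately compute, the sketch below enumerates coalitions to evaluate exact Shapley values. The utility function and toy client names are assumptions for illustration only; this is not the HESV or SecSV protocol, which additionally protect models and test data.

```python
# Illustrative sketch (not the paper's protocol): exact Shapley values over a
# small set of FL clients, given a utility function v(S) that returns, e.g.,
# the test accuracy of the model aggregated from coalition S.
from itertools import combinations
from math import factorial

def shapley_values(clients, v):
    """Exact SVs; exponential in len(clients), so only practical for the
    small numbers of organizations typical of cross-silo FL."""
    n = len(clients)
    sv = {c: 0.0 for c in clients}
    for i in clients:
        others = [c for c in clients if c != i]
        for k in range(len(others) + 1):
            for S in combinations(others, k):
                # Standard Shapley weight |S|!(n-|S|-1)!/n! times the
                # marginal contribution of client i to coalition S.
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                sv[i] += weight * (v(set(S) | {i}) - v(set(S)))
    return sv

if __name__ == "__main__":
    # Hypothetical two-client example with a precomputed utility table.
    utilities = {frozenset(): 0.0, frozenset({"A"}): 0.6,
                 frozenset({"B"}): 0.5, frozenset({"A", "B"}): 0.8}
    v = lambda S: utilities[frozenset(S)]
    print(shapley_values(["A", "B"], v))  # {'A': 0.45, 'B': 0.35}
```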
