Towards Secure and Practical Machine Learning via Secret Sharing and Random Permutation

08/17/2021
by   Fei Zheng, et al.

With the increasing demand for privacy protection, privacy-preserving machine learning has been drawing much attention in both academia and industry. However, most existing methods have limitations in practical applications. On the one hand, although most cryptographic methods are provably secure, they incur heavy computation and communication costs. On the other hand, the security of many relatively efficient private methods (e.g., federated learning and split learning) is being questioned, since they are not provably secure. Inspired by previous work on privacy-preserving machine learning, we build a privacy-preserving machine learning framework by combining random permutation and arithmetic secret sharing via our compute-after-permutation technique. Since our method reduces the cost of element-wise function computation, it is more efficient than existing cryptographic methods. Moreover, by adopting distance correlation as a metric for privacy leakage, we demonstrate that our method is more secure than previous non-provably-secure methods. Overall, our proposal achieves a good balance between security and efficiency. Experimental results show that our method is not only up to 6x faster and reduces network traffic by up to 85% compared with state-of-the-art cryptographic methods, but also leaks less privacy during the training process compared with non-provably-secure methods.
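To make the two building blocks concrete, here is a minimal Python sketch of additive arithmetic secret sharing combined with a shared random permutation. This is an illustrative toy, not the paper's actual compute-after-permutation protocol; the modulus, the two-party setting, and all function names are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
MOD = np.uint64(2**32)  # ring Z_{2^32}, a common choice for arithmetic sharing

def share(x):
    """Split x into two additive shares: x = (s0 + s1) mod 2^32."""
    s0 = rng.integers(0, MOD, size=x.shape, dtype=np.uint64)
    s1 = (x - s0) % MOD  # uint64 wraparound is absorbed by the final mod
    return s0, s1

def reconstruct(s0, s1):
    """Recover the secret by adding the shares modulo 2^32."""
    return (s0 + s1) % MOD

x = np.array([3, 14, 159], dtype=np.uint64)
s0, s1 = share(x)
assert np.array_equal(reconstruct(s0, s1), x)

# If both parties apply the SAME secret permutation to their shares,
# reconstruction yields the permuted vector: values survive, but the
# link between a value and its original position is hidden.
perm = rng.permutation(len(x))
assert np.array_equal(reconstruct(s0[perm], s1[perm]), x[perm])
```

Each share alone is uniformly random over the ring and reveals nothing about `x`; intuitively, permuting before an element-wise computation lets a party evaluate a nonlinear function on values whose positions it cannot trace, which is why the abstract's compute-after-permutation technique can cut the cost of element-wise operations.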

