A Practical Scheme for Two-Party Private Linear Least Squares

01/26/2019
by Mohamed Nassar, et al.

Privacy-preserving machine learning addresses learning from sensitive datasets that are typically distributed across multiple data owners. It is a significant challenge in the many realistic scenarios where no trusted third party can act as a mediator. The strongly decentralized nature of these scenarios calls for tools from both the cryptography and distributed systems communities. In this paper, we present a practical scheme suitable for a subclass of machine learning algorithms and discuss directions for future research. Specifically, we show how to learn a linear least squares model across two parties using gradient descent and additively homomorphic encryption. The protocol requires two rounds of communication per gradient descent step. We detail our approach, including a fixed-point encoding scheme and one-time random pads for hiding intermediate results.
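
Below is a minimal sketch of the building blocks the abstract names: fixed-point encoding, additively homomorphic encryption, and one-time random pads to mask intermediate results during a gradient descent step. It assumes Paillier encryption via the python-paillier (`phe`) package, a hypothetical split where Party A holds the features and Party B holds the labels, and an illustrative scaling factor; it is not the paper's exact protocol.

```python
# A sketch of one masked gradient step for two-party linear least squares.
# Assumptions (not from the paper): Paillier via python-paillier, Party A
# holds features X and the model w, Party B holds labels y, SCALE = 2**16.
import secrets
import numpy as np
from phe import paillier

SCALE = 1 << 16  # illustrative fixed-point scaling factor

def encode(x: float) -> int:
    """Map a real value to a fixed-point integer."""
    return int(round(x * SCALE))

def decode(v: int) -> float:
    """Map a fixed-point integer back to a real value."""
    return v / SCALE

# Party B (label holder) generates the key pair and encrypts its labels.
pub, priv = paillier.generate_paillier_keypair(n_length=2048)
y = np.array([1.2, 0.7, 3.4])
enc_y = [pub.encrypt(encode(v)) for v in y]

# Party A (feature holder) computes predictions in the clear and uses the
# additive homomorphism to form encrypted residuals Enc(pred - y).
X = np.array([[1.0, 0.5], [0.3, 2.0], [1.5, 1.0]])
w = np.zeros(2)
pred = X @ w
enc_residual = [pub.encrypt(encode(p)) - ey for p, ey in zip(pred, enc_y)]

# Party A adds a one-time random pad to each encrypted residual before
# sending it back, so Party B learns nothing when it decrypts.
pads = [secrets.randbelow(1 << 64) for _ in enc_residual]
masked = [er + pad for er, pad in zip(enc_residual, pads)]

# Party B decrypts the masked residuals and returns them (two messages in
# total for this step: A -> B masked ciphertexts, B -> A masked plaintexts).
masked_plain = [priv.decrypt(m) for m in masked]

# Party A removes the pads, decodes, and takes an ordinary gradient step.
residual = np.array([decode(mp - pad) for mp, pad in zip(masked_plain, pads)])
grad = X.T @ residual / len(y)
w = w - 0.1 * grad
```

In this sketch the pads are drawn from a range much larger than the encoded residuals, so the decrypted values reveal nothing useful to Party B; the paper's own encoding and masking choices may differ.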
