Secure Data Sharing With Flow Model

09/24/2020
by Chenwei Wu, et al.

In the classical multi-party computation setting, multiple parties jointly compute a function without revealing their own input data. We consider a variant of this problem in which the input data can be shared for machine learning training purposes, but the data are encrypted so that they cannot be recovered by other parties. We present a rotation-based method using a flow model and theoretically justify its security. We demonstrate the effectiveness of our method in different scenarios, including supervised secure model training and unsupervised generative model training. Our code is available at https://github.com/duchenzhuang/flowencrypt.
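The full method pairs a flow model with rotations in latent space; as a minimal sketch of the rotation component alone (the flow model, key management, and the names `random_rotation` and `encrypt` below are illustrative assumptions, not the paper's API), a random orthogonal matrix can serve as a shared secret that hides raw feature values while preserving the geometry that training relies on:

```python
import numpy as np

def random_rotation(dim, seed=0):
    # Sample a random orthogonal matrix (the secret "key") via QR decomposition.
    rng = np.random.default_rng(seed)
    q, r = np.linalg.qr(rng.standard_normal((dim, dim)))
    # Sign correction so the matrix is uniformly (Haar) distributed.
    return q * np.sign(np.diag(r))

def encrypt(x, rotation):
    # Rotate each sample. Because the rotation is orthogonal, norms and
    # pairwise distances are preserved, so geometry-based learning still
    # works on the encrypted data, while raw coordinates are hidden from
    # anyone without the key.
    return x @ rotation.T

dim = 8
key = random_rotation(dim)
x = np.random.default_rng(1).standard_normal((5, dim))
enc = encrypt(x, key)

# Norms survive encryption even though individual values are scrambled.
assert np.allclose(np.linalg.norm(x, axis=1), np.linalg.norm(enc, axis=1))
```

In the paper's setting this rotation is applied in the latent space of a flow model rather than to raw inputs, which is what makes the scheme usable for both supervised and generative training.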


