Not Just Cloud Privacy: Protecting Client Privacy in Teacher-Student Learning
Ensuring the privacy of sensitive data used to train modern machine learning models is of paramount importance in many areas of practice. One popular recent approach studies these concerns through differential privacy in a "teacher-student" model, wherein the teacher provides the student with useful but noisy information, ideally allowing the student model to perform well on a given task. However, these studies address only the teacher's privacy concerns, assuming the student owns a public but unlabelled dataset. In practice, the student also has privacy concerns over its unlabelled data and therefore requires privacy protection for any data sent to the teacher. In this work, we redesign the privacy-preserving "teacher-student" model to adopt both private arbitrary masking and local differential privacy, which together protect the sensitive information of each student sample. However, a conventionally trained teacher model is not robust to such perturbed data. We therefore use adversarial learning techniques to improve the teacher's robustness to perturbed samples, enabling it to return good feedback without access to the full private information of each student sample. Experimental results demonstrate the effectiveness of our new privacy-preserving "teacher-student" model.
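To make the two client-side ideas concrete, the following is a minimal Python sketch, not the paper's actual implementation: a student masks random features and adds Laplace noise (one standard way to obtain local differential privacy) before a sample leaves its device, and the teacher is trained on perturbed copies of its own data as a stand-in for the adversarial-learning step the abstract mentions. All function names, architectures, and parameters here are illustrative assumptions.

```python
import torch
import torch.nn as nn

def perturb(x, mask_prob=0.3, epsilon=1.0, sensitivity=1.0):
    """Hypothetical client-side perturbation: randomly mask features,
    then add Laplace noise calibrated to sensitivity/epsilon, a common
    mechanism for epsilon-local differential privacy."""
    mask = torch.rand_like(x) < mask_prob  # features the teacher never sees
    noise = torch.distributions.Laplace(0.0, sensitivity / epsilon).sample(x.shape)
    return torch.where(mask, torch.zeros_like(x), x) + noise

# Toy teacher, trained on both clean and perturbed views of its data so it
# stays useful on the noisy queries students will send.
teacher = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.Adam(teacher.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(256, 4)            # toy teacher training data
y = torch.randint(0, 2, (256,))
for _ in range(100):
    opt.zero_grad()
    # Joint loss over the clean batch and a perturbed copy of it.
    loss = loss_fn(teacher(x), y) + loss_fn(teacher(perturb(x)), y)
    loss.backward()
    opt.step()

# A student perturbs its unlabelled sample locally, then queries the teacher.
student_sample = torch.randn(1, 4)
with torch.no_grad():
    pseudo_label = teacher(perturb(student_sample, epsilon=0.5)).argmax(dim=1)
print(pseudo_label)
```

In this sketch the teacher only ever receives the masked, noised sample, so each student's raw features never leave the client; the robust-training loss is one simple choice among the adversarial-learning techniques the abstract refers to.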