DUET: A Tuning-Free Device-Cloud Collaborative Parameters Generation Framework for Efficient Device Model Generalization

09/12/2022
by   Zheqi Lv, et al.

Device Model Generalization (DMG) is a practical yet under-investigated research topic for on-device machine learning applications. It aims to improve the generalization ability of pre-trained models when they are deployed on resource-constrained devices, for example improving the performance of pre-trained cloud models on smartphones. While many works have investigated the data distribution shift between the cloud and devices, most of them focus on fine-tuning the model on personalized data for each individual device to facilitate DMG. Despite their promise, these approaches require on-device re-training, which is practically infeasible due to the overfitting problem and the high time delay of performing gradient computation on real-time data. In this paper, we argue that the computational cost brought by fine-tuning can be rather unnecessary. We consequently present a novel perspective on improving DMG without increasing computational cost, i.e., device-specific parameter generation, which directly maps a data distribution to parameters. Specifically, we propose an efficient Device-cloUd collaborative parametErs generaTion framework, DUET. DUET is deployed on a powerful cloud server and requires only the low cost of forward propagation and the low time delay of data transmission between the device and the cloud. By doing so, DUET can rehearse device-specific model weight realizations conditioned on the personalized real-time data of an individual device. Importantly, DUET elegantly connects the cloud and the device as a 'duet' collaboration, frees DMG from fine-tuning, and enables a faster and more accurate DMG paradigm. We conduct an extensive experimental study of DUET on three public datasets, and the results confirm our framework's effectiveness and generalizability across different DMG tasks.
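To make the idea of mapping a data distribution directly to parameters more concrete, the sketch below shows one possible hypernetwork-style realization in PyTorch. It is only an illustrative assumption, not the actual DUET architecture: the class name ParameterGenerator, the use of a mean feature embedding as the distribution summary, and the single generated linear layer are all hypothetical choices made for this example. The key property it demonstrates is the one the abstract describes, namely that device-specific weights are produced with forward propagation only, without any on-device gradient steps.

```python
# Hedged sketch: a cloud-side generator maps a summary of a device's data
# distribution to the weights of a small device-side layer. This is NOT the
# authors' DUET model, only a minimal illustration of tuning-free,
# forward-only parameter generation.
import torch
import torch.nn as nn


class ParameterGenerator(nn.Module):
    """Cloud-side module: maps a device data summary to layer weights."""

    def __init__(self, summary_dim: int, in_dim: int, out_dim: int):
        super().__init__()
        self.in_dim, self.out_dim = in_dim, out_dim
        # Emits a flattened (out_dim x in_dim) weight matrix plus a bias vector.
        self.net = nn.Sequential(
            nn.Linear(summary_dim, 256),
            nn.ReLU(),
            nn.Linear(256, out_dim * in_dim + out_dim),
        )

    def forward(self, summary: torch.Tensor):
        flat = self.net(summary)
        weight = flat[: self.out_dim * self.in_dim].view(self.out_dim, self.in_dim)
        bias = flat[self.out_dim * self.in_dim:]
        return weight, bias


# Cloud side: only forward propagation, no gradient steps on device data.
in_dim, out_dim = 64, 8
generator = ParameterGenerator(summary_dim=in_dim, in_dim=in_dim, out_dim=out_dim)

device_data = torch.randn(100, in_dim)      # recent real-time samples from one device
data_summary = device_data.mean(dim=0)      # crude stand-in for a distribution summary
with torch.no_grad():                       # inference only; no fine-tuning anywhere
    weight, bias = generator(data_summary)

# Device side: run a plain forward pass with the freshly generated weights.
logits = device_data @ weight.t() + bias
print(logits.shape)  # torch.Size([100, 8])
```

In a deployed system, the generator itself would be trained offline on the cloud; at serving time each device only uploads its data summary and receives freshly generated weights, which is why the approach avoids the overfitting and latency problems of on-device fine-tuning.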

02/14/2023

IDEAL: Toward High-efficiency Device-Cloud Collaborative and Dynamic Recommendation System

Recommendation systems have shown great potential to solve the informati...
11/30/2022

An Efficient Split Fine-tuning Framework for Edge and Cloud Collaborative Learning

To enable the pre-trained models to be fine-tuned with local data on edg...
04/14/2021

Device-Cloud Collaborative Learning for Recommendation

With the rapid development of storage and computing power on mobile devi...
10/21/2022

On-Device Model Fine-Tuning with Label Correction in Recommender Systems

To meet the practical requirements of low latency, low cost, and good pr...
03/18/2023

DC-CCL: Device-Cloud Collaborative Controlled Learning for Large Vision Models

Many large vision models have been deployed on the cloud for real-time s...
04/14/2021

Towards Unsupervised Fine-Tuning for Edge Video Analytics

Judging by popular and generic computer vision challenges, such as the I...
01/24/2022

On-Device Learning with Cloud-Coordinated Data Augmentation for Extreme Model Personalization in Recommender Systems

Data heterogeneity is an intrinsic property of recommender systems, maki...
