Adaptive DNN Surgery for Selfish Inference Acceleration with On-demand Edge Resource

by Xiang Yang et al.

Deep Neural Networks (DNNs) have significantly improved the accuracy of intelligent applications on mobile devices. DNN surgery, which partitions DNN processing between mobile devices and multi-access edge computing (MEC) servers, can enable real-time inference despite the computational limitations of mobile devices. However, DNN surgery faces a critical challenge: determining the optimal computing resource demand from the server and the corresponding partition strategy, while accounting for both inference latency and MEC server usage costs. This problem is compounded by two factors: (1) the finite computing capacity of the MEC server, which is shared among multiple devices and thus makes their demands inter-dependent, and (2) the shift in modern DNN architectures from chains to directed acyclic graphs (DAGs), which complicates potential solutions. In this paper, we introduce a novel Decentralized DNN Surgery (DDS) framework. We formulate the partition strategy as a min-cut problem and propose a resource allocation game to adaptively schedule the demands of mobile devices in an MEC environment. We prove the existence of a Nash Equilibrium (NE), and develop an iterative algorithm that efficiently reaches the NE for each device. Our extensive experiments demonstrate that DDS can effectively handle varying MEC scenarios, achieving up to 1.25× acceleration compared to the state-of-the-art algorithm.
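The partition-as-min-cut idea mentioned in the abstract can be illustrated with a small sketch. The usual construction (a simplification; the paper's exact graph transformation for DAGs is not reproduced here) adds a source `s` (device side) and sink `t` (server side): cutting `s→v` pays the server execution cost of layer `v`, cutting `v→t` pays its device execution cost, and cutting a DAG edge `u→v` pays the cost of transmitting `u`'s output tensor. A minimum s-t cut then yields the latency-minimizing assignment. All layer names and cost numbers below are illustrative assumptions, and the max-flow routine is a plain Edmonds-Karp implementation, not the paper's algorithm.

```python
from collections import deque, defaultdict

def min_cut_partition(layers, device_cost, server_cost, edges, trans_cost):
    """Assign each DNN layer to 'device' or 'server' via an s-t min-cut."""
    cap = defaultdict(lambda: defaultdict(float))
    for v in layers:
        cap['s'][v] += server_cost[v]    # cut cost if v runs on the server
        cap[v]['t'] += device_cost[v]    # cut cost if v runs on the device
    for u, v in edges:
        cap[u][v] += trans_cost[(u, v)]  # cut cost if u->v tensor crosses the link

    def augmenting_path():
        # BFS for a shortest residual s->t path (Edmonds-Karp).
        parent = {'s': None}
        q = deque(['s'])
        while q:
            u = q.popleft()
            for v, c in list(cap[u].items()):
                if c > 1e-9 and v not in parent:
                    parent[v] = u
                    if v == 't':
                        return parent
                    q.append(v)
        return None

    while (parent := augmenting_path()) is not None:
        path, v = [], 't'
        while v != 's':
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[u][w] for u, w in path)
        for u, w in path:
            cap[u][w] -= bottleneck
            cap[w][u] += bottleneck      # residual (reverse) capacity

    # Nodes still reachable from 's' in the residual graph stay on the device.
    device_side, q = {'s'}, deque(['s'])
    while q:
        u = q.popleft()
        for v, c in cap[u].items():
            if c > 1e-9 and v not in device_side:
                device_side.add(v)
                q.append(v)
    return {v: 'device' if v in device_side else 'server' for v in layers}

# Toy 3-layer chain (costs in ms, all hypothetical): the cheap first conv stays
# on the phone, the heavy later layers offload once the small tensor is sent.
assign = min_cut_partition(
    ['conv1', 'conv2', 'fc'],
    device_cost={'conv1': 1.0, 'conv2': 10.0, 'fc': 10.0},
    server_cost={'conv1': 5.0, 'conv2': 1.0, 'fc': 1.0},
    edges=[('conv1', 'conv2'), ('conv2', 'fc')],
    trans_cost={('conv1', 'conv2'): 2.0, ('conv2', 'fc'): 100.0},
)
print(assign)  # {'conv1': 'device', 'conv2': 'server', 'fc': 'server'}
```

Because the construction handles arbitrary edge sets, the same routine works unchanged on DAG-shaped networks with branches; what DDS adds on top of this per-device cut is the game-theoretic scheduling of how much server capacity each device should demand.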


