How to 0wn NAS in Your Spare Time

by Sanghyun Hong, et al.

New data processing pipelines and novel network architectures increasingly drive the success of deep learning. In consequence, the industry considers top-performing architectures as intellectual property and devotes considerable computational resources to discovering such architectures through neural architecture search (NAS). This provides an incentive for adversaries to steal these novel architectures; when used in the cloud, to provide Machine Learning as a Service, the adversaries also have an opportunity to reconstruct the architectures by exploiting a range of hardware side channels. However, it is challenging to reconstruct novel architectures and pipelines without knowing the computational graph (e.g., the layers, branches or skip connections), the architectural parameters (e.g., the number of filters in a convolutional layer) or the specific pre-processing steps (e.g., embeddings). In this paper, we design an algorithm that reconstructs the key components of a novel deep learning system by exploiting a small amount of information leakage from a cache side-channel attack, Flush+Reload. We use Flush+Reload to infer the trace of computations and the timing for each computation. Our algorithm then generates candidate computational graphs from the trace and eliminates incompatible candidates through a parameter estimation process. We implement our algorithm in PyTorch and TensorFlow. We demonstrate experimentally that we can reconstruct MalConv, a novel data pre-processing pipeline for malware detection, and ProxylessNAS-CPU, a novel network architecture for ImageNet classification optimized to run on CPUs, without knowing the architecture family. In both cases, we achieve 0% error. These results suggest that hardware side channels are a practical attack vector against MLaaS, and that more effort should be devoted to understanding their impact on the security of deep learning systems.
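The parameter-estimation step described above (generating candidates and eliminating those whose timing is incompatible with the observed trace) can be sketched as follows. This is a minimal illustration, not the paper's actual estimator: the linear timing model, the filter-count search space, and every function name and constant here are assumptions made for the example.

```python
# Hypothetical sketch of candidate elimination via a timing model.
# Assumption: a convolution's runtime grows roughly linearly with its
# number of filters, so an observed runtime constrains the filter count.

def candidate_filter_counts():
    # Illustrative search space of plausible filter counts for one layer.
    return [16, 32, 64, 128]

def predicted_time(filters, cost_per_filter=0.5, overhead=2.0):
    # Toy linear timing model (made-up constants, arbitrary time units).
    return overhead + cost_per_filter * filters

def eliminate_incompatible(observed_time, tolerance=1.0):
    # Keep only candidates whose predicted runtime matches the
    # Flush+Reload-derived observation within a tolerance.
    return [f for f in candidate_filter_counts()
            if abs(predicted_time(f) - observed_time) <= tolerance]

# Example: a layer observed to take ~34 time units is only consistent
# with the 64-filter candidate under this model.
print(eliminate_incompatible(34.0))
```

In the paper's setting the "observed time" comes from cache side-channel measurements rather than direct instrumentation, and the real search space covers whole computational graphs, but the pruning logic follows this shape: predict, compare, discard.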




