GuardNN: Secure DNN Accelerator for Privacy-Preserving Deep Learning

08/26/2020
by Weizhe Hua, et al.

This paper proposes GuardNN, a secure deep neural network (DNN) accelerator that provides strong hardware-based protection for user data and model parameters even in an untrusted environment. GuardNN shows that the architecture and protection can be customized for a specific application to provide strong confidentiality and integrity protection with negligible overhead. The design of the GuardNN instruction set reduces the TCB to just the accelerator and enables confidentiality protection without the overhead of integrity protection. GuardNN also introduces a new application-specific memory protection scheme to minimize the overhead of memory encryption and integrity verification. The scheme shows that most of the off-chip metadata in today's state-of-the-art memory protection can be removed by exploiting the known memory access patterns of a DNN accelerator. GuardNN is implemented as an FPGA prototype, which demonstrates effective protection with less than 2% performance overhead for inference across a variety of modern DNN models.
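
The memory-protection idea lends itself to a small illustration: because a DNN accelerator's off-chip writes follow a schedule known in advance, the per-block version needed for counter-mode encryption can be recomputed on chip instead of being stored, and integrity-protected, in DRAM. The sketch below is not the paper's implementation; the CounterFreeMemory class, the version_of helper, and the use of HMAC-SHA256 as a stand-in for an AES-CTR keystream are illustrative assumptions.

import hmac
import hashlib

BLOCK_BYTES = 32  # toy block size; a real engine would encrypt 64 B lines with AES-CTR

def pad(key: bytes, addr: int, version: int) -> bytes:
    # One-time pad for (address, version); an (addr, version) pair must never repeat.
    msg = addr.to_bytes(8, "little") + version.to_bytes(8, "little")
    return hmac.new(key, msg, hashlib.sha256).digest()[:BLOCK_BYTES]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

class CounterFreeMemory:
    """Off-chip DRAM model that stores only ciphertext: no per-block
    counters, and hence no integrity tree over counters."""
    def __init__(self, key: bytes):
        self.key = key
        self.dram = {}

    def write(self, addr: int, plaintext: bytes, version: int) -> None:
        # 'version' is supplied by on-chip control logic that tracks the DNN
        # schedule, so it is never fetched from or written back to DRAM.
        self.dram[addr] = xor(plaintext, pad(self.key, addr, version))

    def read(self, addr: int, version: int) -> bytes:
        return xor(self.dram[addr], pad(self.key, addr, version))

# Hypothetical schedule model: each layer's output buffer is overwritten
# exactly once per inference, so its current version is the inference index.
def version_of(layer_idx: int, inference_idx: int) -> int:
    return inference_idx

if __name__ == "__main__":
    mem = CounterFreeMemory(key=b"\x00" * 32)
    addr, layer = 0x1000, 3
    for inference in range(2):
        v = version_of(layer, inference)
        data = f"layer {layer} activations, run {inference}".encode().ljust(BLOCK_BYTES)[:BLOCK_BYTES]
        mem.write(addr, data, v)
        assert mem.read(addr, v) == data

Under these assumptions, confidentiality protection adds no off-chip metadata traffic at all; the counters and counter integrity tree that dominate metadata overhead in general-purpose schemes are dropped because versions are derived from the schedule, which is the point the abstract makes about exploiting known access patterns.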
