DeepBase: Deep Inspection of Neural Networks

08/13/2018
by Thibault Sellam, et al.

Although deep learning models perform remarkably well across a range of tasks such as language translation, parsing, and object recognition, it remains unclear whether, and to what extent, these models follow human-understandable logic or procedures when making predictions. Understanding this can lead to more interpretable models, better model design, and faster experimentation. Recent machine learning research has leveraged statistical methods to identify hidden units that behave (e.g., activate) similarly to human-understandable logic, such as detecting language features; however, each such analysis requires considerable manual effort. Our insight is that, from a query processing perspective, this high-level logic is a query evaluated over a database of neural network hidden unit behaviors. This paper describes DeepBase, a system for inspecting neural network behaviors through a query-based interface. We model high-level logic as hypothesis functions that transform an input dataset into time series signals. DeepBase lets users quickly identify individual units, or groups of units, that have strong statistical dependencies with desired hypotheses. In fact, we show that many existing analyses are expressible as a single DeepBase query. We use DeepBase to analyze recurrent neural network models and propose a set of simple and effective optimizations that speed up existing analysis approaches by up to 413x. We also group and analyze different portions of a real-world neural translation model and show that it learns syntactic structure; this finding is consistent with prior NLP studies but requires only three DeepBase queries.
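To make the core idea concrete, the following is a minimal sketch (not DeepBase's actual API) of what "identifying units with strong statistical dependencies with a hypothesis" can look like: given per-step activations of a model's hidden units and a hypothesis signal computed over the same inputs (e.g., 1.0 wherever a particular language feature is present), rank units by a simple dependency measure such as absolute Pearson correlation. The function name and array shapes here are assumptions for illustration only.

```python
import numpy as np

def score_units(activations, hypothesis, top_k=3):
    """Rank hidden units by the strength of their statistical
    dependency (here, absolute Pearson correlation) with a
    hypothesis signal computed over the same inputs.

    activations: (T, num_units) array of per-step unit activations
    hypothesis:  (T,) array, e.g. 1.0 where a language feature
                 of interest is present at that step, else 0.0
    Returns the top_k (unit_index, score) pairs, best first.
    """
    scores = []
    for u in range(activations.shape[1]):
        unit = activations[:, u]
        # A constant signal has undefined correlation; score it 0.
        if unit.std() == 0 or hypothesis.std() == 0:
            scores.append(0.0)
            continue
        r = np.corrcoef(unit, hypothesis)[0, 1]
        scores.append(abs(r))
    order = np.argsort(scores)[::-1][:top_k]
    return [(int(u), float(scores[u])) for u in order]
```

A real system would swap in richer dependency measures (the paper mentions statistical dependencies generally, not correlation specifically) and, as the abstract's optimization results suggest, would need to avoid naively scanning every unit against every hypothesis at this per-unit granularity.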


