Representational dissimilarity metric spaces for stochastic neural networks

11/21/2022
by Lyndon R. Duong, et al.

Quantifying similarity between neural representations – e.g., hidden layer activation vectors – is a perennial problem in deep learning and neuroscience research. Existing methods compare deterministic responses (e.g., artificial networks that lack stochastic layers) or averaged responses (e.g., trial-averaged firing rates in biological data). However, these measures of deterministic representational similarity ignore the scale and geometric structure of noise, both of which play important roles in neural computation. To rectify this, we generalize previously proposed shape metrics (Williams et al., 2021) to quantify differences in stochastic representations. These new distances satisfy the triangle inequality, and thus can be used as a rigorous basis for many supervised and unsupervised analyses. Leveraging this novel framework, we find that the stochastic geometries of neurobiological representations of oriented visual gratings and naturalistic scenes respectively resemble untrained and trained deep network representations. Further, we can more accurately predict certain network attributes (e.g., training hyperparameters) from a network's position in stochastic (versus deterministic) shape space.
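
The abstract does not spell out the construction, but the property it emphasizes is that the proposed dissimilarities are genuine metrics over distributions of responses rather than over point estimates. As a purely illustrative sketch – not the authors' exact stochastic shape metric, which generalizes the alignment-based shape metrics of Williams et al. (2021) – the snippet below compares two stochastic responses to a single stimulus by fitting a Gaussian to each set of repeated trials and computing the closed-form 2-Wasserstein distance between the fitted Gaussians, a distance that likewise satisfies the triangle inequality. The function name gaussian_w2 and the toy data are assumptions made for the example.

```python
# Illustrative sketch only: a triangle-inequality-respecting distance between
# two stochastic responses, each approximated as a Gaussian over hidden units.
import numpy as np
from scipy.linalg import sqrtm


def gaussian_w2(mu1, cov1, mu2, cov2):
    """Closed-form 2-Wasserstein distance between N(mu1, cov1) and N(mu2, cov2)."""
    mean_term = np.sum((mu1 - mu2) ** 2)
    c2_half = sqrtm(cov2)
    # (cov2^{1/2} cov1 cov2^{1/2})^{1/2}; np.real drops tiny numerical imaginary parts.
    cross_term = np.real(sqrtm(c2_half @ cov1 @ c2_half))
    cov_term = np.trace(cov1) + np.trace(cov2) - 2.0 * np.trace(cross_term)
    return float(np.sqrt(max(mean_term + cov_term, 0.0)))


# Toy usage: estimate (mean, covariance) of each network's response to one
# stimulus from repeated stochastic forward passes (e.g., dropout samples).
rng = np.random.default_rng(0)
samples_a = rng.normal(size=(500, 12))                         # 500 trials x 12 units
samples_b = (samples_a @ rng.normal(size=(12, 12))) * 0.5 + 1.0  # differently shaped noise
mu_a, cov_a = samples_a.mean(axis=0), np.cov(samples_a, rowvar=False)
mu_b, cov_b = samples_b.mean(axis=0), np.cov(samples_b, rowvar=False)
print(f"2-Wasserstein distance between response distributions: {gaussian_w2(mu_a, cov_a, mu_b, cov_b):.3f}")
```

Because this distance depends on covariances as well as means, it is sensitive to the scale and geometric structure of the response noise – precisely the information that, as the abstract notes, deterministic similarity measures discard.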

research · 10/27/2021
Generalized Shape Metrics on Neural Representations
Understanding the operation of biological and artificial networks remain...

research · 05/28/2018
A neural network trained to predict future video frames mimics critical properties of biological neuronal responses and perception
While deep neural networks take loose inspiration from neuroscience, it ...

research · 02/14/2020
An Inductive Bias for Distances: Neural Nets that Respect the Triangle Inequality
Distances are pervasive in machine learning. They serve as similarity me...

research · 06/22/2022
Neural Networks as Paths through the Space of Representations
Deep neural networks implement a sequence of layer-by-layer operations t...

research · 10/06/2020
Usable Information and Evolution of Optimal Representations During Training
We introduce a notion of usable information contained in the representat...

research · 03/08/2021
Statistical Neuroscience in the Single Trial Limit
Individual neurons often produce highly variable responses over nominall...
