Neural Persistence: A Complexity Measure for Deep Neural Networks Using Algebraic Topology

12/23/2018
by Bastian Rieck, et al.

While many approaches to making neural networks more fathomable have been proposed, they are restricted to interrogating the network with input data. Measures for characterizing and monitoring structural properties, by contrast, have not been developed. In this work, we propose neural persistence, a complexity measure for neural network architectures based on topological data analysis of weighted stratified graphs. To demonstrate the usefulness of our approach, we show that neural persistence reflects best practices developed in the deep learning community, such as dropout and batch normalization. Moreover, we derive a neural-persistence-based stopping criterion that shortens the training process while achieving accuracies comparable to early stopping based on validation loss.
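The measure described above rests on zero-dimensional persistent homology of a per-layer bipartite weight graph. A minimal sketch of that idea, assuming a dense layer given as a weight matrix: weights are normalized by their maximum absolute value, vertices are considered born at filtration value 1, and edges are added in order of decreasing weight; every edge that merges two connected components yields a persistence pair with lifetime (1 - w), and the layer's score is the p-norm of these lifetimes. The function name and details are illustrative, not the authors' implementation:

```python
import numpy as np

def neural_persistence(W, p=2):
    """Sketch of a per-layer neural-persistence computation:
    0-dimensional persistent homology of the bipartite graph between
    two consecutive layers, filtered by descending normalized |weight|.
    Assumes W has at least one nonzero entry.
    """
    A = np.abs(np.asarray(W, dtype=float))
    A = A / A.max()  # normalize absolute weights into [0, 1]
    n_in, n_out = A.shape
    parent = list(range(n_in + n_out))  # union-find over all vertices

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    # Edges sorted by decreasing weight; an edge that merges two
    # components closes a 0-dim feature born at 1 and dying at w,
    # contributing persistence 1 - w.
    edges = sorted(
        ((A[i, j], i, n_in + j) for i in range(n_in) for j in range(n_out)),
        reverse=True,
    )
    lifetimes = []
    for w, u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            lifetimes.append(1.0 - w)
    return float(np.linalg.norm(lifetimes, ord=p))
```

For a layer whose normalized weights already form a maximum spanning tree of large weights, most lifetimes are near zero and the score is small; sparse or strongly pruned layers produce longer-lived components and a larger score, which matches the intuition of the measure as a structural complexity indicator.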


Related research

- Addressing caveats of neural persistence with deep graph persistence (07/20/2023): Neural Persistence is a prominent measure for quantifying neural network...
- Persistence-based operators in machine learning (12/28/2022): Artificial neural networks can learn complex, salient data features to a...
- On Characterizing the Capacity of Neural Networks using Algebraic Topology (02/13/2018): The learnability of different neural architectures can be characterized ...
- Towards explaining the generalization gap in neural networks using topological data analysis (03/23/2022): Understanding how neural networks generalize on unseen data is crucial f...
- Statistically Significant Stopping of Neural Network Training (03/01/2021): The general approach taken when training deep learning classifiers is to...
- Being Patient and Persistent: Optimizing An Early Stopping Strategy for Deep Learning in Profiled Attacks (11/29/2021): The absence of an algorithm that effectively monitors deep learning mode...
- Topological Uncertainty: Monitoring trained neural networks through persistence of activation graphs (05/07/2021): Although neural networks are capable of reaching astonishing performance...
