On the minmax regret for statistical manifolds: the role of curvature

07/06/2020
by Bruno Mera, et al.

Model complexity plays an essential role in model selection: one seeks a model that fits the data and is also succinct. Two-part codes and the minimum description length have been successful in delivering procedures that single out the best models while avoiding overfitting. In this work, we pursue this approach and complement it by making further assumptions on the parameter space. Concretely, we assume that the parameter space is a smooth manifold, and, using tools of Riemannian geometry, we derive a sharper expression than the standard one given by the stochastic complexity, in which the scalar curvature of the Fisher information metric plays a dominant role. Furthermore, we derive the minmax regret for general statistical manifolds and apply our results to derive the optimal dimensionality reduction in the context of principal component analysis.
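For context, the standard expression alluded to above is Rissanen's classical asymptotic expansion of the stochastic complexity (the normalized maximum likelihood code length) of a k-parameter model class; the formula itself is not quoted on this page:

$$
-\log p\bigl(x^n \mid \hat\theta(x^n)\bigr) \;+\; \frac{k}{2}\log\frac{n}{2\pi} \;+\; \log\int_\Theta \sqrt{\det I(\theta)}\,\mathrm{d}\theta \;+\; o(1),
$$

where $I(\theta)$ is the Fisher information matrix and $n$ the sample size. The paper's contribution is a sharper expansion of this quantity in which the scalar curvature of the Fisher metric enters the correction terms; the precise form is given in the full text.

As a rough illustration of MDL-driven model selection for PCA, the sketch below picks the number of components by a generic two-part-code (BIC-style) criterion built on the probabilistic-PCA likelihood. This is a stand-in under stated assumptions, not the curvature-corrected criterion derived in the paper, and mdl_pca_order is a hypothetical helper name.

```python
import numpy as np

def mdl_pca_order(X):
    """Choose the number of PCA components by a two-part-code (BIC-style)
    MDL criterion. Generic sketch; NOT the curvature-corrected criterion
    derived in the paper. X is an (n, p) data matrix."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    # Eigenvalues of the ML sample covariance, sorted in descending order.
    eigvals = np.linalg.eigvalsh(Xc.T @ Xc / n)[::-1]
    eigvals = np.maximum(eigvals, 1e-12)
    best_k, best_len = 1, np.inf
    for k in range(1, p):
        sigma2 = eigvals[k:].mean()  # ML estimate of residual noise variance
        # Maximized log-likelihood of probabilistic PCA (Tipping & Bishop).
        ll = -0.5 * n * (p * np.log(2 * np.pi)
                         + np.log(eigvals[:k]).sum()
                         + (p - k) * np.log(sigma2)
                         + p)
        dof = p * k - k * (k - 1) // 2 + 1  # free parameters of rank-k model
        code_len = -ll + 0.5 * dof * np.log(n)  # data cost + parameter cost
        if code_len < best_len:
            best_k, best_len = k, code_len
    return best_k

# Example: 10-dimensional data with a 3-dimensional latent signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 10)) \
    + 0.1 * rng.normal(size=(500, 10))
print(mdl_pca_order(X))  # typically prints 3
```

On the synthetic example above, the penalized code length is minimized at the true latent dimension, illustrating how description length trades goodness of fit against parameter cost.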


