Degrees of Freedom: Search Cost and Self-consistency

08/25/2023
by Lijun Wang, et al.

Model degrees of freedom (df) is a fundamental concept in statistics because it quantifies the flexibility of a fitting procedure and is indispensable in model selection. The df is often intuitively equated with the number of independent variables in the fitting procedure. But for adaptive regressions that perform variable selection (e.g., the best subset regressions), the model df is larger than the number of selected variables. The excess part has been defined as the search degrees of freedom (sdf) to account for model selection. However, this definition is limited since it does not consider fitting procedures in augmented space, such as splines and regression trees; and it does not use the same fitting procedure for sdf and df. For example, the lasso's sdf is defined through the relaxed lasso's df instead of the lasso's df. Here we propose a modified search degrees of freedom (msdf) to directly account for the cost of searching in the original or augmented space. Since many fitting procedures can be characterized by a linear operator, we define the search cost as the effort to determine such a linear operator. When we construct a linear operator for the lasso via the iterative ridge regression, msdf offers a new perspective on its search cost. For some complex procedures, such as the multivariate adaptive regression splines (MARS), the search cost needs to be pre-determined to serve as a tuning parameter for the procedure itself, but it might be inaccurate. To investigate the inaccurate pre-determined search cost, we develop two concepts, nominal msdf and actual msdf, and formulate a property named self-consistency, which holds when there is no gap between the nominal msdf and the actual msdf.
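
For a fitting procedure that can be written as a linear smoother, y_hat = S y, the classical model degrees of freedom is df = tr(S) (equivalently, Efron's covariance definition df = sum_i Cov(y_hat_i, y_i) / sigma^2). The sketch below is an illustration under our own assumptions rather than the authors' implementation: it approximates the lasso by iteratively reweighted ridge regressions so that the converged fit has an explicit linear operator S, whose trace then gives a df-style measure. The function name iterative_ridge_lasso and all parameter choices are hypothetical.

    import numpy as np

    def iterative_ridge_lasso(X, y, lam, n_iter=200, eps=1e-8):
        """Approximate the lasso fit (0.5*||y - X b||^2 + lam*||b||_1) by
        iteratively reweighted ridge regression. At a fixed point,
        (X'X + lam * diag(1/|b_j|)) b = X'y reproduces the lasso
        stationarity condition X'(y - X b) = lam * sign(b)."""
        beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting point
        for _ in range(n_iter):
            D = np.diag(1.0 / np.maximum(np.abs(beta), eps))
            beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
        # Linear operator of the converged procedure: S = X (X'X + lam D)^{-1} X'
        D = np.diag(1.0 / np.maximum(np.abs(beta), eps))
        S = X @ np.linalg.solve(X.T @ X + lam * D, X.T)
        return beta, S

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 10))
    y = X @ np.r_[np.ones(3), np.zeros(7)] + 0.5 * rng.standard_normal(50)
    beta, S = iterative_ridge_lasso(X, y, lam=5.0)
    print("trace(S):", np.trace(S))                        # df-style measure
    print("nonzero coefficients:", int(np.sum(np.abs(beta) > 1e-6)))

Comparing trace(S) with the raw count of nonzero coefficients illustrates the gap that a search-cost correction is meant to capture: the trace of the converged ridge operator reflects the shrinkage actually applied, while the naive count ignores how the active set was found.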

