A rigorous introduction for linear models

05/10/2021
by Jun Lu, et al.

This note provides an introduction to linear models and the theory behind them. Our goal is to give a rigorous treatment for readers with prior exposure to ordinary least squares. In machine learning, the output is usually a nonlinear function of the input; deep learning even aims to find a nonlinear dependence with many layers, which requires a large amount of computation. However, most of these algorithms build upon simple linear models. We therefore describe linear models from different views and examine the properties and theory behind them. The linear model is the main technique for regression problems, and its primary tool is the least squares approximation, which minimizes a sum of squared errors. This is a natural choice when we are interested in finding the regression function that minimizes the corresponding expected squared error. We first describe ordinary least squares from three different points of view, upon which we disturb the model with random noise and, in particular, Gaussian noise. With Gaussian noise the model gives rise to a likelihood, so we introduce the maximum likelihood estimator; the Gaussian disturbance also yields a distribution theory for least squares. This distribution theory helps us answer various questions and introduces related applications. We then prove that least squares is the best linear unbiased estimator in the sense of mean squared error and, most importantly, that it actually approaches the theoretical limit. We conclude with linear models under the Bayesian approach and beyond.
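As a concrete illustration of the least squares approximation the abstract describes, the sketch below (not taken from the paper; the data and noise level are invented for demonstration) fits an ordinary least squares estimate via the normal equations, under a Gaussian noise disturbance:

```python
import numpy as np

# Illustrative sketch: ordinary least squares.
# The OLS estimate beta_hat = (X^T X)^{-1} X^T y minimizes the
# sum of squared errors ||y - X beta||^2.
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.5])  # hypothetical true coefficients
y = X @ beta_true + 0.1 * rng.normal(size=n)  # Gaussian noise disturbance

# Closed-form estimate via the normal equations X^T X beta = X^T y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(np.round(beta_hat, 2))
```

With a small noise scale, `beta_hat` should land close to `beta_true`, consistent with least squares being the best linear unbiased estimator under these assumptions.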

