A theorem of Kalman and minimal state-space realization of Vector Autoregressive Models

10/06/2019
by   Du Nguyen, et al.

We introduce a concept of autoregressive (AR) state-space realization that can be applied to any transfer function T(L) with T(0) invertible. We show that a theorem of Kalman implies that each Vector Autoregressive model (with exogenous variables) has a minimal AR-state-space realization of the form y_t = ∑_{i=1}^{p} H F^{i-1} G x_{t-i} + ϵ_t, where F is a nilpotent Jordan matrix and H, G satisfy certain rank conditions. The case VARX(1) corresponds to reduced-rank regression. As in that case, for a fixed Jordan form F, H can be estimated by least squares as a function of G. The likelihood function is a determinant ratio generalizing the Rayleigh quotient. It is unchanged if G is replaced by SG for an invertible matrix S commuting with F. Using this invariance, the search space for the maximum likelihood estimate can be restricted to equivalence classes of matrices satisfying a number of orthogonality relations, extending the results of reduced-rank analysis. Our results can be considered a multi-lag canonical correlation analysis. The method considered here provides a solution in the general case to the polynomial product regression model of Velu et al. We provide estimation examples. We also explore how the estimates vary with different Jordan matrix configurations and discuss methods to select a configuration. Our approach could provide an important dimension-reduction technique with potential applications in time series analysis and linear system identification. In the appendix, we link the reduced configuration space of G to a geometric object called a vector bundle.
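The realization above can be illustrated with a small numerical sketch. The Python/NumPy snippet below builds a nilpotent Jordan matrix F from an assumed block configuration, forms the implied lag coefficients H F^{i-1} G, and simulates the resulting autoregression; the dimensions, block sizes, random H and G, and the helper nilpotent_jordan are hypothetical choices made only for illustration, not the paper's estimation procedure.

```python
import numpy as np

def nilpotent_jordan(block_sizes):
    """Nilpotent Jordan matrix: zero eigenvalues, ones on each block's superdiagonal."""
    m = sum(block_sizes)
    F = np.zeros((m, m))
    start = 0
    for s in block_sizes:
        for i in range(s - 1):
            F[start + i, start + i + 1] = 1.0
        start += s
    return F

rng = np.random.default_rng(0)
k, p = 3, 2                   # hypothetical: 3-dimensional series, 2 lags
block_sizes = [2, 1]          # one assumed Jordan configuration
m = sum(block_sizes)
F = nilpotent_jordan(block_sizes)
H = rng.standard_normal((k, m))   # observation map (assumed random for the sketch)
G = rng.standard_normal((m, k))   # loading map (assumed random for the sketch)

# Lag-i coefficient implied by the realization: Phi_i = H F^(i-1) G, i = 1..p
Phi = [H @ np.linalg.matrix_power(F, i - 1) @ G for i in range(1, p + 1)]

# Simulate y_t = sum_i Phi_i x_{t-i} + eps_t, taking x_{t-i} = y_{t-i} (the pure VAR case)
T = 200
y = np.zeros((T, k))
for t in range(p, T):
    y[t] = sum(Phi[i] @ y[t - 1 - i] for i in range(p)) + 0.1 * rng.standard_normal(k)
```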
