AIC: Towards More Robust Model Selection in Ridge Regression

Time

-

Location

Rettaliata Engineering Center, Room 032

Host

Department of Applied Mathematics

Speaker

Matthew Dixon
Stuart School of Business, Illinois Institute of Technology

Description

The problem of estimating the dimensionality of a model arises in various forms across applied statistics, machine learning, and econometrics. Popular approaches include Akaike's Information Criterion (AIC), cross-validation, and Minimum Description Length (MDL), among others. This talk begins by discussing Akaike's 1973 extension of the maximum likelihood principle for model selection. In particular, we revisit the rather restrictive conditions under which minimizing the AIC is equivalent to minimizing the expected Kullback-Leibler divergence. We then use a Wilks test to establish asymptotic convergence of the divergence to a chi-squared distribution and characterize the importance of the Fisher information matrix in AIC regularization. We conclude by proposing a new methodology for L2 regularization in ridge regression. This is joint work with Tyler Ward (Google).
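For reference, the standard definitions behind the quantities in the abstract (these are textbook formulas, not the speaker's specific formulation): for a model with k estimated parameters and maximized likelihood L(θ̂), the AIC is

\mathrm{AIC} = -2 \log L(\hat{\theta}) + 2k,

which, under regularity conditions, approximates (up to constants) the expected Kullback-Leibler divergence between the true density f and the fitted model g_\theta,

D_{\mathrm{KL}}(f \,\|\, g_\theta) = \int f(x) \log \frac{f(x)}{g_\theta(x)} \, dx.

The ridge regression estimator to which the proposed L2 regularization methodology applies is

\hat{\beta}_\lambda = \arg\min_{\beta} \|y - X\beta\|_2^2 + \lambda \|\beta\|_2^2.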

Event Topic

Mathematical Finance, Stochastic Analysis, and Machine Learning
