Preprint
English

A Prediction Divergence Criterion for Model Selection

Publication date: 2012
Abstract

In this paper, we propose a new criterion for selection between nested models. We suppose that the correct model is one (or near one) of the available models and construct a criterion based on the Bregman divergence between the out-of-sample predictions of the smaller model and the in-sample predictions of the larger model. This criterion, the prediction divergence criterion (PDC), differs from commonly used criteria such as the AIC, BIC, and Cp in that, in a sequential approach, it directly considers the prediction divergence between two models, rather than differences between the former criteria evaluated at two different models. We derive an estimator for the PDC (PDCE) using Efron's (2004) parametric covariance penalty approach, and for the linear model and smoothing splines, we show that the PDCE, applied to a suitable sequence of nested models that we formalize, selects the correct model with probability 1 as the sample size tends to infinity. In finite samples, we compare the performance of our criterion to the other criteria as well as to the lasso, and find that it outperforms them in terms of prediction error in sparse situations.
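The following is a minimal illustrative sketch, not the authors' estimator: for a toy nested pair of linear models, it computes the squared-error discrepancy (squared error being one member of the Bregman divergence family) between the smaller model's leave-one-out (out-of-sample) predictions and the larger model's in-sample fitted values, in the spirit of the abstract. The function names, the simulated data, and the use of the standard leave-one-out shortcut in place of the paper's covariance penalty estimator are assumptions made purely for illustration.

import numpy as np

def hat_matrix(X):
    # Projection ("hat") matrix of the design matrix X.
    return X @ np.linalg.pinv(X.T @ X) @ X.T

def loo_predictions(X, y):
    # Leave-one-out predictions of an OLS fit via the standard hat-matrix shortcut
    # (used here as a stand-in for the paper's covariance penalty estimator).
    H = hat_matrix(X)
    fitted = H @ y
    return y - (y - fitted) / (1.0 - np.diag(H))

def squared_error_divergence(pred_small_oos, pred_large_ins):
    # Average squared difference between the two prediction vectors;
    # squared error is the canonical Bregman divergence.
    return np.mean((pred_small_oos - pred_large_ins) ** 2)

# Toy usage: the smaller design is nested in the larger one (a subset of its columns).
rng = np.random.default_rng(0)
n = 100
X_large = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
X_small = X_large[:, :2]
y = X_large @ np.array([1.0, 2.0, 0.0, 0.0]) + rng.normal(size=n)

divergence = squared_error_divergence(loo_predictions(X_small, y),
                                      hat_matrix(X_large) @ y)
print(f"illustrative prediction divergence: {divergence:.4f}")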

Keywords
  • Goodness-of-fit
  • Linear predictors
  • Stepwise selection
  • Bregman divergence
  • Covariance penalty
  • AIC
  • BIC
  • Lasso
Citation (ISO format)
GUERRIER, Stéphane, VICTORIA-FESER, Maria-Pia. A Prediction Divergence Criterion for Model Selection. 2012.
Main files (1)
Preprint
Access level: Public
Identifiers
  • PID : unige:24187
877 views
302 downloads

Technical informations

Creation: 10/30/2012 4:58:00 PM
First validation: 10/30/2012 4:58:00 PM
Update time: 03/14/2023 5:45:42 PM
Status update: 03/14/2023 5:45:42 PM
Last indexation: 01/16/2024 12:33:23 AM
All rights reserved by Archive ouverte UNIGE and the University of Geneva