Bayes model averaging with selection of regressors. Journal of the Royal Statistical Society, Series B, 519-536.
When a number of distinct models contend for use in prediction, the choice of a single model can offer rather unstable predictions. In regression, stochastic search variable selection with Bayesian model averaging offers a cure for this robustness issue but at the expense of requiring very many predictors. Here we look at Bayes model averaging incorporating variable selection for prediction. This offers similar mean-square errors of prediction but with a vastly reduced predictor space. This can greatly aid the interpretation of the model. It also reduces the cost if measured variables have costs. The development here uses decision theory in the context of the multivariate general linear model. In passing, this reduced predictor space Bayes model averaging is contrasted with single-model approximations. A fast algorithm for updating regressions in the Markov chain Monte Carlo searches for posterior inference is developed, allowing many more variables than observations to be contemplated. We discuss the merits of absolute rather than proportionate shrinkage in regression, especially when there are more variables than observations. The methodology is illustrated on a set of spectroscopic data used for measuring the amounts of different sugars in an aqueous solution.
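The abstract describes averaging predictions over models visited by a Markov chain Monte Carlo search over subsets of regressors. As an illustrative sketch only (not the paper's algorithm: it uses a simple Metropolis flip move and a Zellner g-prior marginal likelihood rather than the multivariate general linear model and QR-updating developed in the paper), the idea can be coded as follows; `log_marginal`, `bma_predict`, and the choice `g=100` are all assumptions for the example.

```python
import numpy as np

def log_marginal(Xc, yc, gamma, g=100.0):
    # g-prior log marginal likelihood (up to a constant) for the submodel
    # picked out by the 0/1 inclusion vector gamma; Xc, yc are centred.
    # Illustrative prior choice, not the one used in the paper.
    n = len(yc)
    q = int(gamma.sum())
    sst = yc @ yc
    if q == 0:
        r2 = 0.0
    else:
        Xg = Xc[:, gamma.astype(bool)]
        beta, *_ = np.linalg.lstsq(Xg, yc, rcond=None)
        r = yc - Xg @ beta
        r2 = 1.0 - (r @ r) / sst
    return 0.5 * (n - 1 - q) * np.log1p(g) - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2))

def bma_predict(X, y, Xnew, n_iter=3000, g=100.0, seed=0):
    # Metropolis search over inclusion vectors: propose flipping one
    # variable in or out, accept by the marginal-likelihood ratio
    # (uniform prior over models), and average predictions over the
    # visited models -- Bayesian model averaging with selection.
    rng = np.random.default_rng(seed)
    mx = X.mean(axis=0)
    Xc, yc = X - mx, y - y.mean()
    Xnc = Xnew - mx
    p = X.shape[1]
    gamma = np.zeros(p)
    cur = log_marginal(Xc, yc, gamma, g)
    total = np.zeros(Xnew.shape[0])
    for _ in range(n_iter):
        j = rng.integers(p)
        prop = gamma.copy()
        prop[j] = 1.0 - prop[j]
        new = log_marginal(Xc, yc, prop, g)
        if np.log(rng.random()) < new - cur:
            gamma, cur = prop, new
        idx = gamma.astype(bool)
        if idx.any():
            beta, *_ = np.linalg.lstsq(Xc[:, idx], yc, rcond=None)
            # posterior-mean coefficients under the g-prior shrink
            # the least-squares fit proportionately by g/(1+g)
            total += y.mean() + (g / (1.0 + g)) * (Xnc[:, idx] @ beta)
        else:
            total += y.mean()
    return total / n_iter
```

Because only the visited (typically small) subsets enter each prediction, the effective predictor space stays far smaller than averaging over full-model fits, which is the point the abstract makes about interpretation and measurement cost.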
|Title:||Bayes model averaging with selection of regressors|
|Keywords:||Bayesian model averaging, decision theory, multivariate general linear model, QR-updating, ridge regression, variable selection, linear regression, nonorthogonal problems, wavelength selection, graphical models, calibration, prediction, spectroscopy, uncertainty|
|UCL classification:||UCL > School of BEAMS > Faculty of Maths and Physical Sciences|