Variable selection for linear models and generalized linear models with BIC-based posterior probability

Allison N. Tegge, Shuangshuang Xu, and Marco A. R. Ferreira

library(VariableSelection)

Introduction

Variable selection methods screen candidate predictors and identify those associated with the response. A simpler model with fewer variables is also easier to interpret in real-world applications.

The VariableSelection package uses Bayesian Information Criterion (BIC), Model Posterior Probability (MPP), and Posterior Inclusion Probability (PIP) to perform model selection for linear models (LM) and generalized linear models (GLM). The package provides the best model, BIC, and MPP for candidate models, and PIP for each predictor. This vignette contains an example to illustrate how to use VariableSelection.

Model

The linear models used in the VariableSelection package are of the form \[ \pmb{Y}=X\pmb{\beta}+ \pmb{\epsilon},\] where \(\pmb{Y}\) is the \(n\)-dimensional vector of responses, \(X\) is the \(n \times p\) design matrix of candidate predictors, \(\pmb{\beta}\) is the \(p\)-dimensional vector of regression coefficients, and \(\pmb{\epsilon} \sim N(\pmb{0}, \sigma^2 I)\) is the vector of independent Gaussian errors.

The generalized linear models used in the VariableSelection package assume that the observations come from a distribution that belongs to the exponential family of distributions. In addition, these GLMs assume that the mean of the observations is related to the covariates through a link function \(g\) such that \[E(\pmb{Y}|X) = g^{-1}(X\pmb{\beta}),\] where \(\pmb{Y}\) is the vector of observations, \(X\) is the design matrix of candidate predictors, \(\pmb{\beta}\) is the vector of regression coefficients, and \(g\) is a link function appropriate to the assumed distribution.
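For example, in logistic regression with binary observations the link is the logit function, so the mean relates to the covariates as

\[ E(\pmb{Y}|X) = g^{-1}(X\pmb{\beta}) = \frac{\exp(X\pmb{\beta})}{1+\exp(X\pmb{\beta})}. \]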

BIC

The Bayesian Information Criterion (BIC) is defined as:

\[ \text{BIC} = -2 \log(L) + k \log(n), \]

where \(L\) is the maximized value of the likelihood function of the model, \(k\) is the number of estimated parameters, and \(n\) is the number of observations.

A lower BIC value indicates a better model.
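The MPPs reported by the package are consistent with the standard BIC approximation to the posterior probability of a model: assuming equal prior probabilities over the \(m\) candidate models,

\[ \text{MPP}(M_i) \approx \frac{\exp(-\text{BIC}_i/2)}{\sum_{j=1}^{m} \exp(-\text{BIC}_j/2)}. \]

The PIP of a predictor is then the sum of the MPPs of all models that include that predictor.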

Example

Data

The modelselect.lm() function can take a data frame that contains both the response and the predictors. For example, here are the first 5 rows of a data frame in which X1 through X6 are six candidate predictors and Y is the response variable, which was simulated from a linear model that contains the predictors X1, X2, and X3.

data("dat")
head(dat,5)
#>           X1          X2          X3         X4         X5         X6
#> 1  0.9962431 -0.02560867 -0.61734975  1.4855638  0.6727587  0.1247850
#> 2  0.8569675 -2.08603964  1.71170398  1.3454152 -0.1355638 -1.8180708
#> 3 -0.4341982  0.77668088 -0.71365443  0.7359808 -1.4004599 -2.0857847
#> 4  1.5278373  0.96399955  0.07846497  1.1466296 -0.9196107  0.2417378
#> 5  0.5615185 -0.69806814 -0.12644657 -1.7336450  1.0308951 -0.7059752
#>            Y
#> 1 -0.4551099
#> 2  1.1381262
#> 3  0.9274077
#> 4  1.4541197
#> 5  1.1890484

The data frame dat above is included in the VariableSelection package.
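For readers who want to generate similar data, a data frame with the same structure can be simulated in base R. The sample size, coefficients, and error variance below are illustrative assumptions, not the values used to create dat:

```r
set.seed(1)
n <- 500
# Six standard-normal candidate predictors X1, ..., X6.
X <- matrix(rnorm(n * 6), nrow = n, dimnames = list(NULL, paste0("X", 1:6)))
# Only X1, X2, X3 enter the true model; X4-X6 are pure noise predictors.
Y <- 0.5 * X[, "X1"] + 0.5 * X[, "X2"] + 0.5 * X[, "X3"] + rnorm(n)
sim <- data.frame(X, Y)
```

A data frame such as sim can then be passed to modelselect.lm in the same way as dat.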

Formula

In this example, we use modelselect.lm to select the predictors in a linear model. The modelselect.lm function takes a formula and a data set as input. In the example below, we consider the model space that contains all possible linear models generated with the predictors X1, X2, X3, and X4.

example1 <- modelselect.lm(formula = Y~X1+X2+X3+X4, data = dat)
#> The Best Model:
#>  X1 X2 X3 
#> BIC:
#>  2262.126 
#> MPP:
#>  0.957

The output of modelselect.lm contains a table of BICs and MPPs for the competing models and a table of PIPs for the candidate predictors.

head(example1$models)
#>   X1 X2 X3 X4      BIC   MPP
#> 1  1  1  1  0 2262.126 0.957
#> 2  1  1  1  1 2268.340 0.043
#> 3  0  1  1  0 2335.625 0.000
#> 4  0  1  1  1 2341.214 0.000
#> 5  1  0  1  0 2525.717 0.000
#> 6  1  0  1  1 2531.910 0.000
example1$variables
#>      Var   PIP
#> Var1  X1 1.000
#> Var2  X2 1.000
#> Var3  X3 1.000
#> Var4  X4 0.043
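The PIP of a predictor is the sum of the MPPs of all models that include it. Using the six BIC values printed above (the ten remaining models in the 16-model space carry essentially zero weight), the MPP and PIP columns can be reproduced with base R:

```r
bic <- c(2262.126, 2268.340, 2335.625, 2341.214, 2525.717, 2531.910)
inc <- rbind(c(1, 1, 1, 0),   # inclusion indicators for X1..X4,
             c(1, 1, 1, 1),   # one row per model, in the order above
             c(0, 1, 1, 0),
             c(0, 1, 1, 1),
             c(1, 0, 1, 0),
             c(1, 0, 1, 1))
colnames(inc) <- paste0("X", 1:4)
w   <- exp(-(bic - min(bic)) / 2)   # subtract min(bic) to avoid underflow
mpp <- w / sum(w)                   # model posterior probabilities
round(mpp, 3)                       # 0.957 0.043 0.000 0.000 0.000 0.000
pip <- colSums(inc * mpp)           # weight each model's indicators by its MPP
round(pip, 3)                       # X1 = X2 = X3 = 1.000, X4 = 0.043
```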

Here are some additional examples of how to write a formula. In the next example, the formula Y~. includes all predictors in the data frame.

example2 <- modelselect.lm(formula = Y~., data = dat)
#> The Best Model:
#>  X1 X2 X3 
#> BIC:
#>  2262.126 
#> MPP:
#>  0.869
example2$variables
#>      Var   PIP
#> Var1  X1 1.000
#> Var2  X2 1.000
#> Var3  X3 1.000
#> Var4  X4 0.043
#> Var5  X5 0.046
#> Var6  X6 0.049

Interactions

The next example includes an interaction term between the predictors X1 and X2.

example3 <- modelselect.lm(formula = Y~X1*X2+X3+X4+X5+X6, data = dat)
#> The Best Model:
#>  X1 X2 X3 
#> BIC:
#>  2262.126 
#> MPP:
#>  0.832
example3$variables
#>        Var   PIP
#> Var1    X1 1.000
#> Var2    X2 1.000
#> Var3    X3 1.000
#> Var4    X4 0.043
#> Var5    X5 0.046
#> Var6    X6 0.049
#> Var7 X1:X2 0.043

Note that because modelselect.lm uses lm notation for formulas, we could also have used the notation Y~X1+X2+X3+X4+X5+X6+X1:X2.
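This equivalence can be checked with base R's model.matrix, which expands both formula styles to the same design columns (a small stand-in data frame is used here so the check is self-contained):

```r
d  <- data.frame(X1 = rnorm(10), X2 = rnorm(10), X3 = rnorm(10))
# ~ X1 * X2 expands to the main effects X1 and X2 plus the interaction X1:X2.
m1 <- model.matrix(~ X1 * X2 + X3, data = d)
m2 <- model.matrix(~ X1 + X2 + X3 + X1:X2, data = d)
identical(colnames(m1), colnames(m2))   # TRUE
```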

Functions

In the VariableSelection package, there are four functions: modelselect.lm() and lm.best() for LM, and modelselect.glm() and glm.best() for GLM.
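As a sketch of GLM usage, assuming modelselect.glm mirrors the modelselect.lm interface and accepts a family argument in the style of glm(); the argument and variable names here are assumptions, so consult ?modelselect.glm for the actual signature:

```r
# Hypothetical call: 'family' and 'Ybin' (a binary response) are assumptions,
# not part of the documented example above.
example4 <- modelselect.glm(formula = Ybin ~ X1 + X2 + X3 + X4,
                            family = binomial, data = dat)
```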

Reference

Xu, S., Tegge, A. N., and Ferreira, M. A. R. (202X). paper, Journal.