Title: | Degrees of Freedom of Elastic Net, Adaptive Lasso and Generalized Elastic Net |
---|---|
Description: | Computes the degrees of freedom of the lasso, elastic net, generalized elastic net and adaptive lasso based on the generalized path seeking algorithm. The optimal model can be selected by model selection criteria including Mallows' Cp, bias-corrected AIC (AICc), generalized cross validation (GCV) and BIC. |
Authors: | Kei Hirose |
Maintainer: | Kei Hirose <[email protected]> |
License: | GPL (>= 2) |
Version: | 1.3.5 |
Built: | 2024-11-04 19:52:44 UTC |
Source: | https://github.com/cran/msgps |
This package computes the degrees of freedom of the lasso, elastic net, generalized elastic net and adaptive lasso based on the generalized path seeking algorithm. The optimal model can be selected by model selection criteria including Mallows' Cp, bias-corrected AIC (AICc), generalized cross validation (GCV) and BIC.
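As a rough illustration of how these criteria balance fit against model complexity, their standard textbook forms can be sketched in a few lines of R (an illustrative sketch, not the package's internal code; `rss` denotes the residual sum of squares, `df` the degrees of freedom, `n` the sample size and `tau2` an estimate of the error variance):

```r
# Textbook forms of the model selection criteria (illustrative only).
cp   <- function(rss, df, n, tau2) rss / tau2 - n + 2 * df                 # Mallows' Cp
aicc <- function(rss, df, n) n * log(rss / n) + 2 * n * df / (n - df - 1)  # bias-corrected AIC
gcv  <- function(rss, df, n) rss / (n * (1 - df / n)^2)                    # generalized cross-validation
bic  <- function(rss, df, n) n * log(rss / n) + log(n) * df                # BIC
```

In each case a smaller value indicates a better trade-off, so the model minimizing the chosen criterion along the solution path is selected.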
msgps(X,y,penalty="enet", alpha=0, gamma=1, lambda=0.001, tau2, STEP=20000, STEP.max=200000, DFtype="MODIFIED", p.max=300, intercept=TRUE, stand.coef=FALSE)
X |
predictor matrix |
y |
response vector |
penalty |
The penalty term. The "enet" indicates the elastic net penalty, a mixture of the ridge and lasso penalties indexed by alpha; note that "enet" with alpha=0 is the lasso penalty. The "genet" indicates the generalized elastic net penalty, and "alasso" indicates the adaptive lasso penalty, a weighted lasso whose weights are 1/|beta_i|^gamma, where beta is the ridge estimate with regularization parameter lambda. |
alpha |
The value of alpha in the "enet" and "genet" penalties. |
gamma |
The value of gamma in the "alasso" penalty. |
lambda |
The value of the regularization parameter lambda of the ridge regression used to compute the weight vector of the "alasso" penalty. |
tau2 |
Estimator of the error variance for Mallows' Cp. The default is the unbiased estimator of the error variance of the most complex model. When that estimator is not available (e.g., the number of variables exceeds the number of samples), tau2 must be specified. |
STEP |
The approximate number of steps. |
STEP.max |
The number of steps in this algorithm can often exceed STEP; the algorithm stops when the number of steps reaches STEP.max. |
DFtype |
Type of degrees of freedom: "MODIFIED" or "NAIVE". The "NAIVE" estimator is the number of non-zero coefficients; "MODIFIED" is the corrected estimator, which usually performs better. |
p.max |
If the number of selected variables exceeds p.max, the algorithm stops. |
intercept |
When intercept is "TRUE", the intercept term is included in the model. |
stand.coef |
When stand.coef is "TRUE", the standardized coefficients are displayed. |
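To make the roles of gamma and lambda concrete, the adaptive lasso weights can be illustrated with a small sketch (an illustration under the assumption that the weights come from a ridge estimate, not the package's actual implementation):

```r
# Illustrative construction of adaptive lasso weights (not package code):
# ridge estimate with parameter lambda, then weights w_i = 1 / |beta_i|^gamma.
set.seed(1)
X <- matrix(rnorm(50 * 4), 50, 4)
y <- X %*% c(2, 0, 1, 0) + rnorm(50)
lambda <- 0.1
gamma  <- 1
beta_ridge <- solve(t(X) %*% X + lambda * diag(4), t(X) %*% y)
w <- 1 / abs(beta_ridge)^gamma  # small ridge coefficients get large weights,
                                # so the weighted L1 penalty shrinks them harder
```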
Kei Hirose
[email protected]
Friedman, J. (2008). Fast sparse regression and classification. Technical report, Stanford University.
Hirose, K., Tateishi, S. and Konishi, S. (2011). Efficient algorithm to select tuning parameters in sparse regression modeling with regularization. arXiv:1109.2411 (arXiv).
The coef.msgps, plot.msgps, predict.msgps and summary.msgps objects.
#data
X <- matrix(rnorm(100*8),100,8)
beta0 <- c(3,1.5,0,0,2,0,0,0)
epsilon <- rnorm(100,sd=3)
y <- X %*% beta0 + epsilon
y <- c(y)

#lasso
fit <- msgps(X,y)
summary(fit)
coef(fit) #extract coefficients at t selected by model selection criteria
coef(fit,c(0, 0.5, 2.5)) #extract coefficients at some values of t
predict(fit,X[1:10,]) #predict values at t selected by model selection criteria
predict(fit,X[1:10,],c(0, 0.5, 2.5)) #predict values at some values of t
plot(fit,criterion="cp") #plot the solution path with a model selected by Cp criterion

#elastic net
fit2 <- msgps(X,y,penalty="enet",alpha=0.5)
summary(fit2)

#generalized elastic net
fit3 <- msgps(X,y,penalty="genet",alpha=0.5)
summary(fit3)

#adaptive lasso
fit4 <- msgps(X,y,penalty="alasso",gamma=1,lambda=0)
summary(fit4)
This function plots the solution path from a "msgps" object.
## S3 method for class 'msgps'
plot(x, criterion="cp", xvar="norm", yvar="coef", yvar.dflasso=TRUE, stand.coef=TRUE, plot.step = 1000, col=TRUE,...)
x |
Fitted "msgps" model object. |
criterion |
The model selection criterion used to mark the selected model: "cp" (Mallows' Cp), "aicc" (bias-corrected AIC), "gcv" (generalized cross-validation) or "bic" (BIC). |
xvar |
The type of x variable. |
yvar |
The type of y variable. |
yvar.dflasso |
For the lasso penalty, the degrees of freedom of the lasso (the number of non-zero parameters) is used when yvar.dflasso is "TRUE". |
stand.coef |
The standardized coefficients and tuning parameters are depicted if "stand.coef=TRUE". |
plot.step |
The number of steps used to plot the solution of df. As plot.step increases, the df plot becomes more accurate but takes longer to draw. |
col |
The color option. |
... |
Other graphical parameters for the plot. |
The object returned depends on type.
Kei Hirose
[email protected]
The coef.msgps, predict.msgps and summary.msgps objects.
#data
X <- matrix(rnorm(100*8),100,8)
beta0 <- c(3,1.5,0,0,2,0,0,0)
epsilon <- rnorm(100,sd=3)
y <- X %*% beta0 + epsilon
y <- c(y)

#fit
fit <- msgps(X,y)
plot(fit,criterion="cp") #plot the solution path with a model selected by Cp criterion
This function predicts fitted values and extracts estimated coefficients from a "msgps" object.
## S3 method for class 'msgps'
predict(object, X, tuning,...)
## S3 method for class 'msgps'
coef(object, tuning,...)
object |
Fitted "msgps" model object. |
X |
Matrix or vector of new input values at which predictions are to be made. |
tuning |
Vector of tuning parameter values t at which predictions or coefficients are required. If tuning is missing, the values selected by the model selection criteria are used. |
... |
Other parameters |
The object returned depends on type.
Kei Hirose
[email protected]
#data
X <- matrix(rnorm(100*8),100,8)
beta0 <- c(3,1.5,0,0,2,0,0,0)
epsilon <- rnorm(100,sd=3)
y <- X %*% beta0 + epsilon
y <- c(y)

#fit
fit <- msgps(X,y)
coef(fit) #extract coefficients at t selected by model selection criteria
coef(fit,c(0, 0.5, 2.5)) #extract coefficients at some values of t
predict(fit,X[1:10,]) #predict values at t selected by model selection criteria
predict(fit,X[1:10,],c(0, 0.5, 2.5)) #predict values at some values of t
This function summarizes the "msgps" object.
## S3 method for class 'msgps'
summary(object, digits=max(3, getOption("digits") - 3), num.result = 20, coef.result=100,...)
object |
Fitted "msgps" model object. |
digits |
The number of significant digits in the output. |
num.result |
The maximum number of tuning parameter values, with their corresponding degrees of freedom, displayed in the summary. |
coef.result |
If the number of variables exceeds coef.result, the coefficients are not displayed in the summary. |
... |
Other parameters on summary |
df |
The degrees of freedom for each tuning parameter. |
tuning.max |
Maximum value of tuning parameter. |
ms.coef |
The coefficient selected by each model selection criterion. |
ms.tuning |
The values of tuning parameter of models selected by each model selection criterion. |
ms.df |
The degrees of freedom of the models selected by each model selection criterion. |
Kei Hirose
[email protected]
#data
X <- matrix(rnorm(100*8),100,8)
beta0 <- c(3,1.5,0,0,2,0,0,0)
epsilon <- rnorm(100,sd=3)
y <- X %*% beta0 + epsilon
y <- c(y)

#fit
fit <- msgps(X,y)
summary(fit)