Generalized linear models with elastic net regularization. Calls glmnet::glmnet() from package glmnet.

Details

This learner returns two prediction types:

  1. lp: a vector of linear predictors (relative risk scores), one per observation. Calculated using glmnet::predict.coxnet().

  2. distr: a two-dimensional survival matrix with observations in rows and time points in columns. Calculated using glmnet::survfit.coxnet(). The parameters stype and ctype determine how lp predictions are transformed into survival predictions and are described in survival::survfit.coxph(). By default the Breslow estimator is used.
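A minimal sketch of accessing both prediction types (the task choice and lambda value are illustrative assumptions; the rats task is restricted to its numeric features, since this learner does not support factor features):

```r
library(mlr3)
library(mlr3proba)
library(mlr3extralearners)

# keep only numeric/integer features; glmnet does not handle factors
task = tsk("rats")$select(c("litter", "rx"))

learner = lrn("surv.glmnet", lambda = 0.01)
learner$train(task)
pred = learner$predict(task)

head(pred$lp)  # linear predictors (relative risk scores)
pred$distr     # survival matrix: rows = observations, columns = time points
```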

Caution: This learner differs from learners calling glmnet::cv.glmnet() in that it does not use the internal optimization of the parameter lambda. Instead, lambda must be tuned by the user (e.g., via mlr3tuning). When lambda is tuned, a glmnet model is trained for each tuning iteration. While fitting the whole path of lambdas would be more efficient, as is done by default in glmnet::glmnet(), tuning/selecting the parameter at prediction time (using parameter s) is currently not supported in mlr3 (at least not in an efficient manner). Tuning the s parameter is therefore currently discouraged.

When the data are i.i.d. and efficiency is key, we recommend using the auto-tuning counterpart mlr_learners_surv.cv_glmnet(). However, this is not applicable in some situations, usually when the data are imbalanced or not i.i.d. (longitudinal, time series) and tuning requires custom resampling strategies (blocked design, stratification).
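Tuning lambda with mlr3tuning could look like the following sketch (the task, search range, and grid resolution are illustrative assumptions):

```r
library(mlr3)
library(mlr3proba)
library(mlr3extralearners)
library(mlr3tuning)

# rats task restricted to numeric features (glmnet does not handle factors)
task = tsk("rats")$select(c("litter", "rx"))

# tune lambda on a log scale; range is an illustrative choice
learner = lrn("surv.glmnet", lambda = to_tune(1e-4, 1, logscale = TRUE))

instance = tune(
  tuner = tnr("grid_search", resolution = 20),
  task = task,
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("surv.cindex")
)

# lambda value (on the original scale) of the best configuration
instance$result_learner_param_vals
```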

Custom mlr3 parameters

  • family is set to "cox" and cannot be changed.

Dictionary

This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("surv.glmnet")
lrn("surv.glmnet")

Meta Information

  • Task type: “surv”

  • Predict Types: “crank”, “distr”, “lp”

  • Feature Types: “logical”, “integer”, “numeric”

  • Required Packages: mlr3, mlr3proba, mlr3extralearners, glmnet

Parameters

| Id               | Type      | Default   | Levels                  | Range                  |
|------------------|-----------|-----------|-------------------------|------------------------|
| alignment        | character | lambda    | lambda, fraction        | -                      |
| alpha            | numeric   | 1         | -                       | \([0, 1]\)             |
| big              | numeric   | 9.9e+35   | -                       | \((-\infty, \infty)\)  |
| devmax           | numeric   | 0.999     | -                       | \([0, 1]\)             |
| dfmax            | integer   | -         | -                       | \([0, \infty)\)        |
| eps              | numeric   | 1e-06     | -                       | \([0, 1]\)             |
| epsnr            | numeric   | 1e-08     | -                       | \([0, 1]\)             |
| exact            | logical   | FALSE     | TRUE, FALSE             | -                      |
| exclude          | untyped   | -         | -                       | -                      |
| exmx             | numeric   | 250       | -                       | \((-\infty, \infty)\)  |
| fdev             | numeric   | 1e-05     | -                       | \([0, 1]\)             |
| gamma            | untyped   | -         | -                       | -                      |
| grouped          | logical   | TRUE      | TRUE, FALSE             | -                      |
| intercept        | logical   | TRUE      | TRUE, FALSE             | -                      |
| keep             | logical   | FALSE     | TRUE, FALSE             | -                      |
| lambda           | untyped   | -         | -                       | -                      |
| lambda.min.ratio | numeric   | -         | -                       | \([0, 1]\)             |
| lower.limits     | untyped   | -Inf      | -                       | -                      |
| maxit            | integer   | 100000    | -                       | \([1, \infty)\)        |
| mnlam            | integer   | 5         | -                       | \([1, \infty)\)        |
| mxit             | integer   | 100       | -                       | \([1, \infty)\)        |
| mxitnr           | integer   | 25        | -                       | \([1, \infty)\)        |
| newoffset        | untyped   | -         | -                       | -                      |
| nlambda          | integer   | 100       | -                       | \([1, \infty)\)        |
| offset           | untyped   | NULL      | -                       | -                      |
| parallel         | logical   | FALSE     | TRUE, FALSE             | -                      |
| penalty.factor   | untyped   | -         | -                       | -                      |
| pmax             | integer   | -         | -                       | \([0, \infty)\)        |
| pmin             | numeric   | 1e-09     | -                       | \([0, 1]\)             |
| prec             | numeric   | 1e-10     | -                       | \((-\infty, \infty)\)  |
| predict.gamma    | numeric   | gamma.1se | -                       | \((-\infty, \infty)\)  |
| relax            | logical   | FALSE     | TRUE, FALSE             | -                      |
| s                | numeric   | 0.01      | -                       | \([0, \infty)\)        |
| standardize      | logical   | TRUE      | TRUE, FALSE             | -                      |
| thresh           | numeric   | 1e-07     | -                       | \([0, \infty)\)        |
| trace.it         | integer   | 0         | -                       | \([0, 1]\)             |
| type.logistic    | character | Newton    | Newton, modified.Newton | -                      |
| type.multinomial | character | ungrouped | ungrouped, grouped      | -                      |
| upper.limits     | untyped   | Inf       | -                       | -                      |
| stype            | integer   | 2         | -                       | \([1, 2]\)             |
| ctype            | integer   | -         | -                       | \([1, 2]\)             |

References

Friedman J, Hastie T, Tibshirani R (2010). “Regularization Paths for Generalized Linear Models via Coordinate Descent.” Journal of Statistical Software, 33(1), 1--22. doi:10.18637/jss.v033.i01 .

Author

be-marc

Super classes

mlr3::Learner -> mlr3proba::LearnerSurv -> LearnerSurvGlmnet

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerSurvGlmnet$new()

Method selected_features()

Returns the set of selected features as reported by glmnet::predict.glmnet() with type set to "nonzero".

Usage

LearnerSurvGlmnet$selected_features(lambda = NULL)

Arguments

lambda

(numeric(1))
Custom lambda; defaults to the active lambda as determined by the parameter set.

Returns

(character()) of feature names.
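For instance (a sketch; the task and lambda value are illustrative assumptions, and the task is restricted to numeric features since glmnet does not handle factors):

```r
library(mlr3)
library(mlr3proba)
library(mlr3extralearners)

task = tsk("rats")$select(c("litter", "rx"))
learner = lrn("surv.glmnet", lambda = 0.05)
learner$train(task)

# names of features with nonzero coefficients at the active lambda
learner$selected_features()
```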


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerSurvGlmnet$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

learner = mlr3::lrn("surv.glmnet")
print(learner)
#> <LearnerSurvGlmnet:surv.glmnet>: Regularized Generalized Linear Model
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3proba, mlr3extralearners, glmnet
#> * Predict Types:  [crank], distr, lp
#> * Feature Types: logical, integer, numeric
#> * Properties: selected_features, weights

# available parameters:
learner$param_set$ids()
#>  [1] "alignment"        "alpha"            "big"              "devmax"          
#>  [5] "dfmax"            "eps"              "epsnr"            "exact"           
#>  [9] "exclude"          "exmx"             "fdev"             "gamma"           
#> [13] "grouped"          "intercept"        "keep"             "lambda"          
#> [17] "lambda.min.ratio" "lower.limits"     "maxit"            "mnlam"           
#> [21] "mxit"             "mxitnr"           "newoffset"        "nlambda"         
#> [25] "offset"           "parallel"         "penalty.factor"   "pmax"            
#> [29] "pmin"             "prec"             "predict.gamma"    "relax"           
#> [33] "s"                "standardize"      "thresh"           "trace.it"        
#> [37] "type.logistic"    "type.multinomial" "upper.limits"     "stype"           
#> [41] "ctype"
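Continuing the example, the learner can be trained and evaluated along these lines (a sketch; the task and lambda value are illustrative, and the task is restricted to numeric features since glmnet does not handle factors):

```r
library(mlr3proba)  # registers survival tasks and measures

task = mlr3::tsk("rats")$select(c("litter", "rx"))
learner$param_set$values$lambda = 0.01
learner$train(task)
prediction = learner$predict(task)

# concordance index of the linear predictors
prediction$score(mlr3::msr("surv.cindex"))
```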