Fit a generalized additive regression model using a boosting algorithm. Calls mboost::gamboost() from mboost.

Dictionary

This Learner can be instantiated via lrn():

lrn("regr.gamboost")

Meta Information

  • Task type: “regr”

  • Predict Types: “response”

  • Feature Types: “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, mboost

Parameters

| Id            | Type      | Default        | Levels                                                                  | Range                  |
|---------------|-----------|----------------|-------------------------------------------------------------------------|------------------------|
| baselearner   | character | bbs            | bbs, bols, btree                                                        | -                      |
| dfbase        | integer   | 4              |                                                                         | \((-\infty, \infty)\)  |
| offset        | numeric   | NULL           |                                                                         | \((-\infty, \infty)\)  |
| family        | character | Gaussian       | Gaussian, Laplace, Huber, Poisson, GammaReg, NBinomial, Hurdle, custom  | -                      |
| custom.family | untyped   | -              |                                                                         | -                      |
| nuirange      | untyped   | c(0, 100)      |                                                                         | -                      |
| d             | numeric   | NULL           |                                                                         | \((-\infty, \infty)\)  |
| mstop         | integer   | 100            |                                                                         | \((-\infty, \infty)\)  |
| nu            | numeric   | 0.1            |                                                                         | \((-\infty, \infty)\)  |
| risk          | character | inbag          | inbag, oobag, none                                                      | -                      |
| oobweights    | untyped   | NULL           |                                                                         | -                      |
| trace         | logical   | FALSE          | TRUE, FALSE                                                             | -                      |
| stopintern    | untyped   | FALSE          |                                                                         | -                      |
| na.action     | untyped   | stats::na.omit |                                                                         | -                      |
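
For illustration only, a sketch of setting some of these hyperparameters, either at construction or later through the parameter set; the chosen values are arbitrary examples, not recommendations:

library(mlr3)
library(mlr3extralearners)

# Set hyperparameters at construction time
learner = lrn("regr.gamboost", baselearner = "bols", family = "Laplace", mstop = 200)

# ... or update them afterwards via the parameter set
learner$param_set$set_values(nu = 0.05, trace = TRUE)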

References

Bühlmann, Peter, Yu, Bin (2003). “Boosting with the \(L_2\) loss: regression and classification.” Journal of the American Statistical Association, 98(462), 324–339.

See also

Author

be-marc

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrGAMBoost
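
A small sketch illustrating this inheritance chain; the printed class vector is what R6 reports and may contain further classes:

learner = lrn("regr.gamboost")
class(learner)
#> e.g. "LearnerRegrGAMBoost" "LearnerRegr" "Learner" "R6"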

Methods

Method new()

Create a LearnerRegrGAMBoost object.

Usage

LearnerRegrGAMBoost$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrGAMBoost$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("regr.gamboost", baselearner = "bols")
print(learner)
#> <LearnerRegrGAMBoost:regr.gamboost>: Boosted Generalized Additive Model
#> * Model: -
#> * Parameters: baselearner=bols
#> * Packages: mlr3, mlr3extralearners, mboost
#> * Predict Types:  [response]
#> * Feature Types: integer, numeric, factor, ordered
#> * Properties: weights
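
A minimal train/predict sketch continuing the example above; it assumes the built-in mtcars regression task and the regr.rmse measure shipped with mlr3, and the hyperparameter values are illustrative only.

library(mlr3)
library(mlr3extralearners)

# Illustrative settings: linear base-learners and 200 boosting iterations
learner = lrn("regr.gamboost", baselearner = "bols", mstop = 200)

# Train on one part of the built-in mtcars task and predict on the rest
task = tsk("mtcars")
ids = partition(task)
learner$train(task, row_ids = ids$train)
prediction = learner$predict(task, row_ids = ids$test)

# Evaluate with root mean squared error
prediction$score(msr("regr.rmse"))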