Gradient Boosting Classification Algorithm. Calls gbm::gbm() from package gbm.

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.gbm")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, gbm

Parameters

Id                 Type       Default    Levels                                       Range
distribution       character  bernoulli  bernoulli, adaboost, huberized, multinomial  -
n.trees            integer    100        -                                            [1, ∞)
interaction.depth  integer    1          -                                            [1, ∞)
n.minobsinnode     integer    10         -                                            [1, ∞)
shrinkage          numeric    0.001      -                                            [0, ∞)
bag.fraction       numeric    0.5        -                                            [0, 1]
train.fraction     numeric    1          -                                            [0, 1]
cv.folds           integer    0          -                                            (-∞, ∞)
keep.data          logical    FALSE      TRUE, FALSE                                  -
verbose            logical    FALSE      TRUE, FALSE                                  -
n.cores            integer    1          -                                            (-∞, ∞)
var.monotone       untyped    -          -                                            -

Initial parameter values

  • keep.data is initialized to FALSE to save memory.

  • n.cores is initialized to 1 to avoid conflicts with parallelization through future.
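As a sketch of how these defaults can be overridden (assuming mlr3 and mlr3extralearners are installed), hyperparameters from the table above can be set either at construction time or afterwards through the learner's parameter set:

```r
library(mlr3)
library(mlr3extralearners)

# Set hyperparameters at construction time ...
learner = lrn("classif.gbm",
  n.trees = 500,
  interaction.depth = 3,
  shrinkage = 0.01
)

# ... or modify them afterwards via the parameter set
learner$param_set$values$bag.fraction = 0.7
```

Values set this way are passed through to gbm::gbm() when the learner is trained.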

References

Friedman JH (2002). “Stochastic gradient boosting.” Computational Statistics & Data Analysis, 38(4), 367–378.

Author

be-marc

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifGBM

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifGBM$new()

Method importance()

The importance scores are extracted by gbm::relative.influence() from the model.

Usage

LearnerClassifGBM$importance()

Returns

Named numeric().


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifGBM$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("classif.gbm")
print(learner)
#> 
#> ── <LearnerClassifGBM> (classif.gbm): Gradient Boosting ────────────────────────
#> • Model: -
#> • Parameters: keep.data=FALSE, n.cores=1
#> • Packages: mlr3, mlr3extralearners, and gbm
#> • Predict Types: [response] and prob
#> • Feature Types: integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: importance, missings, twoclass, and weights
#> • Other settings: use_weights = 'use'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
#> Distribution not specified, assuming bernoulli ...

print(learner$model)
#> gbm::gbm(formula = f, data = data, keep.data = FALSE, n.cores = 1L)
#> A gradient boosted model with bernoulli loss function.
#> 100 iterations were performed.
#> There were 60 predictors of which 43 had non-zero influence.
print(learner$importance())
#>        V11        V49        V51        V12         V9        V28        V52 
#> 18.8330130  8.2492767  8.0056954  7.9432729  6.7662709  6.1958739  5.7695880 
#>        V46        V23        V45        V39        V27        V13        V36 
#>  5.5641673  5.4425149  4.6191125  3.9993273  3.7629493  3.6652325  3.3487356 
#>        V21        V37        V24        V48        V20         V4        V15 
#>  3.0851064  2.8577834  2.2280447  2.0740328  1.8931009  1.7973259  1.7747578 
#>         V1        V47        V44        V32        V58        V54        V30 
#>  1.5827274  1.4285331  1.3621758  1.3296313  1.1319884  1.1289674  1.1077087 
#>        V16        V29        V22         V7        V57        V34        V26 
#>  1.0174672  0.9788610  0.9670330  0.8488506  0.8086092  0.6616984  0.6117263 
#>         V5        V10        V19        V59        V33        V25        V17 
#>  0.5809424  0.5628604  0.4512116  0.4311000  0.4178590  0.4010263  0.3407965 
#>         V8        V14        V18         V2         V3        V31        V35 
#>  0.3378300  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V38        V40        V41        V42        V43        V50        V53 
#>  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V55        V56         V6        V60 
#>  0.0000000  0.0000000  0.0000000  0.0000000 

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2463768
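Since this learner also supports the "prob" predict type, the example above can be adapted to return class probabilities instead of hard labels. A minimal sketch, assuming mlr3 and mlr3extralearners are installed:

```r
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")

# Request probability predictions instead of the default response labels
learner = lrn("classif.gbm", predict_type = "prob")
learner$train(task)

# The prediction now carries a probability matrix with one column per class
prediction = learner$predict(task)
head(prediction$prob)
```

The two columns of prediction$prob sum to one for each row, and the "response" column is still available as the class with the higher probability.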