Gradient Boosting Classification Algorithm. Calls gbm::gbm() from package gbm.

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.gbm")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, gbm

Parameters

Id                 Type       Default    Levels                                       Range
distribution       character  bernoulli  bernoulli, adaboost, huberized, multinomial  -
n.trees            integer    100        -                                            [1, ∞)
interaction.depth  integer    1          -                                            [1, ∞)
n.minobsinnode     integer    10         -                                            [1, ∞)
shrinkage          numeric    0.001      -                                            [0, ∞)
bag.fraction       numeric    0.5        -                                            [0, 1]
train.fraction     numeric    1          -                                            [0, 1]
cv.folds           integer    0          -                                            (-∞, ∞)
keep.data          logical    FALSE      TRUE, FALSE                                  -
verbose            logical    FALSE      TRUE, FALSE                                  -
n.cores            integer    1          -                                            (-∞, ∞)
var.monotone       untyped    -          -                                            -
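
Hyperparameters can be passed directly to lrn() when constructing the learner. The values below are purely illustrative, not tuned recommendations:

learner = lrn("classif.gbm",
  n.trees = 500,
  interaction.depth = 3,
  shrinkage = 0.1
)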

Initial parameter values

  • keep.data is initialized to FALSE to save memory.

  • n.cores is initialized to 1 to avoid conflicts with parallelization through future.
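
These initial values can be overridden after construction through the learner's parameter set. A minimal sketch (set_values() assumes a reasonably recent paradox version; assigning to learner$param_set$values works as well):

learner = lrn("classif.gbm")
learner$param_set$set_values(keep.data = TRUE, n.cores = 2)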

References

Friedman, Jerome H. (2002). “Stochastic gradient boosting.” Computational Statistics & Data Analysis, 38(4), 367–378.

Author

be-marc

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifGBM

Methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifGBM$new()

Method importance()

The importance scores are extracted by gbm::relative.influence() from the model.

Usage

LearnerClassifGBM$importance()

Returns

Named numeric().


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifGBM$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
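
A deep clone produces an independent copy whose settings can be changed without affecting the original, for example:

learner = lrn("classif.gbm")
learner2 = learner$clone(deep = TRUE)
learner2$param_set$set_values(n.trees = 200)  # 'learner' is unaffected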

Examples

# Define the Learner
learner = lrn("classif.gbm")
print(learner)
#> 
#> ── <LearnerClassifGBM> (classif.gbm): Gradient Boosting ────────────────────────
#> • Model: -
#> • Parameters: keep.data=FALSE, n.cores=1
#> • Packages: mlr3, mlr3extralearners, and gbm
#> • Predict Types: [response] and prob
#> • Feature Types: integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: importance, missings, twoclass, and weights
#> • Other settings: use_weights = 'use'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
#> Distribution not specified, assuming bernoulli ...
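
# gbm emits this message because no distribution was given; with a two-class
# target it falls back to "bernoulli". The distribution can also be set
# explicitly, e.g. lrn("classif.gbm", distribution = "bernoulli").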

print(learner$model)
#> gbm::gbm(formula = f, data = data, keep.data = FALSE, n.cores = 1L)
#> A gradient boosted model with bernoulli loss function.
#> 100 iterations were performed.
#> There were 60 predictors of which 43 had non-zero influence.
print(learner$importance())
#>        V10        V11        V48        V52        V12        V57        V36 
#> 19.7097062 12.8056587  8.3553172  7.4724039  7.2462522  5.8946235  5.5098608 
#>        V45         V4        V21         V8        V47        V46        V13 
#>  5.1627623  4.3492610  4.0596331  3.9137548  3.7085400  3.4124520  3.3633818 
#>         V5        V26        V22        V55        V27        V15        V23 
#>  3.0059740  2.8983465  2.5760175  2.3334755  2.1112127  2.0823628  1.8071046 
#>        V16        V51         V1         V9        V20        V49         V7 
#>  1.7039959  1.6850348  1.4105187  1.3994288  1.3548494  1.2970369  1.1756207 
#>        V17        V60        V28        V35        V37        V31        V44 
#>  1.1077672  1.0223043  0.9670103  0.9542359  0.9522755  0.8738898  0.7883649 
#>        V24         V3        V58        V32        V43        V54        V25 
#>  0.7594512  0.7026979  0.6187861  0.5863976  0.5277795  0.5107163  0.4428077 
#>        V30        V14        V18        V19         V2        V29        V33 
#>  0.3893974  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V34        V38        V39        V40        V41        V42        V50 
#>  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V53        V56        V59         V6 
#>  0.0000000  0.0000000  0.0000000  0.0000000 

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2173913
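
# $score() accepts any mlr3 measure; for example, classification accuracy
# (output omitted):
predictions$score(msr("classif.acc"))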