
Gradient Boosting Classification Algorithm. Calls gbm::gbm() from package gbm.

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.gbm")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, gbm
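
Since both "response" and "prob" predict types are supported, the learner can be switched to predicted class probabilities after construction. A minimal sketch using only the packages listed above:

# Request predicted class probabilities instead of hard class labels
learner = mlr3::lrn("classif.gbm")
learner$predict_type = "prob"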

Parameters

Id                 Type       Default    Levels                                       Range
distribution       character  bernoulli  bernoulli, adaboost, huberized, multinomial  -
n.trees            integer    100        -                                            \([1, \infty)\)
interaction.depth  integer    1          -                                            \([1, \infty)\)
n.minobsinnode     integer    10         -                                            \([1, \infty)\)
shrinkage          numeric    0.001      -                                            \([0, \infty)\)
bag.fraction       numeric    0.5        -                                            \([0, 1]\)
train.fraction     numeric    1          -                                            \([0, 1]\)
cv.folds           integer    0          -                                            \((-\infty, \infty)\)
keep.data          logical    FALSE      TRUE, FALSE                                  -
verbose            logical    FALSE      TRUE, FALSE                                  -
n.cores            integer    1          -                                            \((-\infty, \infty)\)
var.monotone       untyped    -          -                                            -

Initial parameter values

  • keep.data is initialized to FALSE to save memory.

  • n.cores is initialized to 1 to avoid conflicts with parallelization through future.
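
Hyperparameters can be set at construction or changed afterwards through the learner's parameter set; the initialized values above can be overridden the same way. A minimal sketch (the specific values are illustrative, not tuned recommendations):

# Set boosting hyperparameters at construction ...
learner = mlr3::lrn("classif.gbm", n.trees = 500, interaction.depth = 3, shrinkage = 0.1)

# ... or override the initialized defaults later via the parameter set
learner$param_set$values$keep.data = TRUE
learner$param_set$values$n.cores = 2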

References

Friedman, Jerome H (2002). “Stochastic gradient boosting.” Computational Statistics & Data Analysis, 38(4), 367–378.

Author

be-marc

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifGBM

Methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifGBM$new()

Method importance()

The importance scores are extracted by gbm::relative.influence() from the model.

Usage

LearnerClassifGBM$importance()

Returns

Named numeric().
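
For comparison, roughly the same scores can be obtained directly from the fitted model; a sketch assuming the learner has already been trained (as in the Examples below):

# importance() wraps gbm::relative.influence() on the stored model
raw = gbm::relative.influence(learner$model, n.trees = learner$model$n.trees)
sort(raw, decreasing = TRUE)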


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifGBM$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
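
Cloning is useful when comparing several configurations of the same learner; a minimal sketch:

# Deep-clone so changes to the copy do not affect the original learner
learner2 = learner$clone(deep = TRUE)
learner2$param_set$values$n.trees = 500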

Examples

# Define the Learner
learner = mlr3::lrn("classif.gbm")
print(learner)
#> <LearnerClassifGBM:classif.gbm>: Gradient Boosting
#> * Model: -
#> * Parameters: keep.data=FALSE, n.cores=1
#> * Packages: mlr3, mlr3extralearners, gbm
#> * Predict Types:  [response], prob
#> * Feature Types: integer, numeric, factor, ordered
#> * Properties: importance, missings, twoclass, weights

# Define a Task
task = mlr3::tsk("sonar")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
#> Distribution not specified, assuming bernoulli ...

print(learner$model)
#> gbm::gbm(formula = f, data = data, keep.data = FALSE, n.cores = 1L)
#> A gradient boosted model with bernoulli loss function.
#> 100 iterations were performed.
#> There were 60 predictors of which 43 had non-zero influence.
print(learner$importance())
#>        V12        V48        V36        V27        V45        V51        V16 
#> 16.7397716 11.8165545 10.6624316  7.9966398  7.7652813  6.7988762  6.0229066 
#>        V11        V31        V49        V13        V52        V40        V50 
#>  5.6529177  4.7801222  4.3438827  3.8044212  3.3517559  2.9061396  2.7726882 
#>         V8         V4        V37        V17         V9        V10        V57 
#>  2.6853690  2.6668349  2.6189212  2.4142035  2.0955166  1.8057338  1.7415331 
#>        V55        V21        V43        V28         V1        V39        V19 
#>  1.6741420  1.5020687  1.3687778  1.3683704  1.2865753  1.2225118  1.1211917 
#>        V15        V54        V23        V59        V29        V18        V14 
#>  1.1151965  0.8873069  0.8392636  0.7658783  0.7566078  0.6574688  0.5859203 
#>        V42        V22        V60        V58         V3        V44         V6 
#>  0.5542861  0.5319201  0.4895127  0.4870231  0.4562240  0.4210477  0.4127275 
#>        V20         V2        V24        V25        V26        V30        V32 
#>  0.3607090  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V33        V34        V35        V38        V41        V46        V47 
#>  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>         V5        V53        V56         V7 
#>  0.0000000  0.0000000  0.0000000  0.0000000 

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.1014493
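
predictions$score() uses the default measure for classification, the classification error (classif.ce). Any other mlr3 measure can be passed explicitly; a short sketch (output omitted, results depend on the random partition):

# Score with classification accuracy instead of the default error rate
predictions$score(mlr3::msr("classif.acc"))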