
Classification gradient boosting learner. Calls bst::bst() from package bst.

Note

Only classification-appropriate loss functions are available for the family parameter.

Initial parameter values

  • Learner = "ls": Uses linear least squares as the base learner.

  • xval = 0: Disables internal cross-validation during training.

  • maxdepth = 1: Trees used as base learners are stumps of depth 1.
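These initial values can be overridden at construction time. A minimal sketch (assumes mlr3 and mlr3extralearners are installed and the parameter names listed on this page):

```r
library(mlr3)
library(mlr3extralearners)

# Override the initial values, e.g. boost regression trees of depth 2
learner = lrn("classif.bst", Learner = "tree", maxdepth = 2, xval = 0)

# Inspect the currently set parameter values
learner$param_set$values
```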

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.bst")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “numeric”

  • Required Packages: mlr3, mlr3extralearners, bst, rpart

Parameters

| Id           | Type      | Default  | Levels                                                                           | Range    |
|--------------|-----------|----------|----------------------------------------------------------------------------------|----------|
| center       | logical   | FALSE    | TRUE, FALSE                                                                      | -        |
| coefir       | untyped   | NULL     | -                                                                                | -        |
| cost         | numeric   | 0.5      | -                                                                                | [0, 1]   |
| cp           | numeric   | 0.01     | -                                                                                | [0, 1]   |
| df           | integer   | 4        | -                                                                                | [1, ∞)   |
| family       | character | hinge    | hinge, hinge2, binom, thingeDC, tbinomDC, binomdDC, loss, clossR, clossRMM, clossMM | -     |
| f.init       | untyped   | NULL     | -                                                                                | -        |
| fk           | untyped   | NULL     | -                                                                                | -        |
| intercept    | logical   | TRUE     | TRUE, FALSE                                                                      | -        |
| iter         | integer   | 1        | -                                                                                | [1, ∞)   |
| Learner      | character | ls       | ls, sm, tree                                                                     | -        |
| maxdepth     | integer   | 1        | -                                                                                | [1, 30]  |
| maxsurrogate | integer   | 5        | -                                                                                | [0, ∞)   |
| minbucket    | integer   | -        | -                                                                                | [1, ∞)   |
| minsplit     | integer   | 20       | -                                                                                | [1, ∞)   |
| mstop        | integer   | 50       | -                                                                                | [1, ∞)   |
| numsample    | integer   | 50       | -                                                                                | [1, ∞)   |
| nu           | numeric   | 0.1      | -                                                                                | [0, 1]   |
| q            | numeric   | -        | -                                                                                | [0, 1]   |
| qh           | numeric   | -        | -                                                                                | [0, 1]   |
| s            | numeric   | -        | -                                                                                | [0, ∞)   |
| sh           | numeric   | -        | -                                                                                | [0, ∞)   |
| start        | logical   | FALSE    | TRUE, FALSE                                                                      | -        |
| surrogatestyle | integer | 0        | -                                                                                | [0, 1]   |
| threshold    | character | adaptive | adaptive, fixed                                                                  | -        |
| trace        | logical   | FALSE    | TRUE, FALSE                                                                      | -        |
| trun         | logical   | FALSE    | TRUE, FALSE                                                                      | -        |
| twinboost    | logical   | FALSE    | TRUE, FALSE                                                                      | -        |
| twintype     | integer   | 1        | -                                                                                | [1, 2]   |
| xselect.init | untyped   | NULL     | -                                                                                | -        |
| xval         | integer   | 10       | -                                                                                | [0, ∞)   |
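Parameters can be set either at construction or afterwards via the learner's param_set. A sketch combining a classification loss from the family levels above with probability predictions (assumes mlr3 and mlr3extralearners are installed):

```r
library(mlr3)
library(mlr3extralearners)

# Request probability predictions and the binomial loss
learner = lrn("classif.bst", family = "binom", predict_type = "prob")

# Parameters can also be changed after construction
learner$param_set$values$mstop = 100
```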

Author

annanzrv

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifBst

Methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifBst$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifBst$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("classif.bst")
print(learner)
#> 
#> ── <LearnerClassifBst> (classif.bst): Gradient Boosting ────────────────────────
#> • Model: -
#> • Parameters: Learner=ls, maxdepth=1, xval=0
#> • Packages: mlr3, mlr3extralearners, bst, and rpart
#> • Predict Types: [response] and prob
#> • Feature Types: numeric
#> • Encapsulation: none (fallback: -)
#> • Properties: twoclass
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> 
#> 	 Models Fitted with Gradient Boosting
#> 
#> Call:
#> bst::bst(x = data[, features, with = FALSE], y = data[[target]],     ctrl = ctrl, control.tree = ctrl_tree, learner = pars$Learner)
#> 
#> [1] "gaussian"
#> 
#> Base learner:  ls 
#> Number of boosting iterations: mstop = 50 
#> Step size:  0.1 
#> Offset:  0.1079137 
#> 
#> Coefficients: 
#>          V1         V10         V11         V12         V13         V14 
#>  0.00000000  0.00000000  0.21302619  0.00000000  0.00000000  0.00000000 
#>         V15         V16         V17         V18         V19          V2 
#>  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000 
#>         V20         V21         V22         V23         V24         V25 
#>  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000 
#>         V26         V27         V28         V29          V3         V30 
#>  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000 
#>         V31         V32         V33         V34         V35         V36 
#> -0.01902207  0.00000000  0.00000000  0.00000000  0.00000000 -0.64652471 
#>         V37         V38         V39          V4         V40         V41 
#>  0.00000000  0.00000000  0.00000000  0.71946154  0.00000000  0.00000000 
#>         V42         V43         V44         V45         V46         V47 
#>  0.00000000  0.00000000  0.00000000  0.81240595  0.00000000  0.00000000 
#>         V48         V49          V5         V50         V51         V52 
#>  0.00000000  1.18970252  0.00000000  0.00000000  0.00000000  0.00000000 
#>         V53         V54         V55         V56         V57         V58 
#>  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000 
#>         V59          V6         V60          V7          V8          V9 
#>  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000  0.00000000 
#> attr(,"offset")
#> [1] 0.1079137
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2898551
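Called without arguments, $score() uses the default measure for the task type, here the classification error (classif.ce). Other measures can be passed explicitly, e.g. classification accuracy:

```r
# Score the same predictions with a different measure
predictions$score(msr("classif.acc"))
```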