Classification gradient boosting learner. Calls bst::bst() from package bst.

Note

Only classification-appropriate loss functions are available for the family parameter.

Initial parameter values

  • Learner = "ls": linear least squares is used as the base learner.

  • xval = 0: cross-validation within rpart is disabled.

  • maxdepth = 1: base trees are restricted to depth 1 (stumps).
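These initial values can be overridden when constructing the learner. A minimal sketch (assuming mlr3 and mlr3extralearners are installed; the chosen values are illustrative, not recommendations):

```r
library(mlr3)
library(mlr3extralearners)

# Override the initial parameter values at construction time
learner = lrn("classif.bst",
  Learner  = "tree",  # tree base learner instead of "ls"
  xval     = 0,       # keep rpart's internal cross-validation disabled
  maxdepth = 2        # allow slightly deeper base trees
)
learner$param_set$values
```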

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.bst")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “numeric”

  • Required Packages: mlr3, mlr3extralearners, bst, rpart

Parameters

Id              Type       Default   Levels                                                                                  Range
center          logical    FALSE     TRUE, FALSE                                                                             -
coefir          untyped    NULL      -                                                                                       -
cost            numeric    0.5       -                                                                                       \([0, 1]\)
cp              numeric    0.01      -                                                                                       \([0, 1]\)
df              integer    4         -                                                                                       \([1, \infty)\)
family          character  hinge     hinge, hinge2, binom, thingeDC, tbinomDC, binomdDC, loss, clossR, clossRMM, clossMM     -
f.init          untyped    NULL      -                                                                                       -
fk              untyped    NULL      -                                                                                       -
intercept       logical    TRUE      TRUE, FALSE                                                                             -
iter            integer    1         -                                                                                       \([1, \infty)\)
Learner         character  ls        ls, sm, tree                                                                            -
maxdepth        integer    1         -                                                                                       \([1, 30]\)
maxsurrogate    integer    5         -                                                                                       \([0, \infty)\)
minbucket       integer    -         -                                                                                       \([1, \infty)\)
minsplit        integer    20        -                                                                                       \([1, \infty)\)
mstop           integer    50        -                                                                                       \([1, \infty)\)
numsample       integer    50        -                                                                                       \([1, \infty)\)
nu              numeric    0.1       -                                                                                       \([0, 1]\)
q               numeric    -         -                                                                                       \([0, 1]\)
qh              numeric    -         -                                                                                       \([0, 1]\)
s               numeric    -         -                                                                                       \([0, \infty)\)
sh              numeric    -         -                                                                                       \([0, \infty)\)
start           logical    FALSE     TRUE, FALSE                                                                             -
surrogatestyle  integer    0         -                                                                                       \([0, 1]\)
threshold       character  adaptive  adaptive, fixed                                                                         -
trace           logical    FALSE     TRUE, FALSE                                                                             -
trun            logical    FALSE     TRUE, FALSE                                                                             -
twinboost       logical    FALSE     TRUE, FALSE                                                                             -
twintype        integer    1         -                                                                                       \([1, 2]\)
xselect.init    untyped    NULL      -                                                                                       -
xval            integer    10        -                                                                                       \([0, \infty)\)
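Parameters can also be changed on an already constructed learner via its parameter set. An illustrative sketch (the specific values are examples only, not recommendations):

```r
learner = lrn("classif.bst")

# Assign hyperparameters directly in the learner's parameter set
learner$param_set$values$family = "hinge"  # one of the classification losses above
learner$param_set$values$mstop  = 100      # more boosting iterations
learner$param_set$values$nu     = 0.05     # smaller step size
```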

Author

annanzrv

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifBst

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifBst$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifBst$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("classif.bst")
print(learner)
#> 
#> ── <LearnerClassifBst> (classif.bst): Gradient Boosting ────────────────────────
#> • Model: -
#> • Parameters: Learner=ls, maxdepth=1, xval=0
#> • Packages: mlr3, mlr3extralearners, bst, and rpart
#> • Predict Types: [response] and prob
#> • Feature Types: numeric
#> • Encapsulation: none (fallback: -)
#> • Properties: twoclass
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> 
#> 	 Models Fitted with Gradient Boosting
#> 
#> Call:
#> bst::bst(x = data[, features, with = FALSE], y = data[[target]],     ctrl = ctrl, control.tree = ctrl_tree, learner = pars$Learner)
#> 
#> [1] "gaussian"
#> 
#> Base learner:  ls 
#> Number of boosting iterations: mstop = 50 
#> Step size:  0.1 
#> Offset:  0.03597122 
#> 
#> Coefficients: 
#>         V1        V10        V11        V12        V13        V14        V15 
#>  0.0000000  0.0000000  0.2030958  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V16        V17        V18        V19         V2        V20        V21 
#>  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V22        V23        V24        V25        V26        V27        V28 
#>  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V29         V3        V30        V31        V32        V33        V34 
#>  0.0000000  0.0000000  0.0000000 -0.0413024  0.0000000  0.0000000  0.0000000 
#>        V35        V36        V37        V38        V39         V4        V40 
#>  0.0000000 -0.6140020  0.0000000  0.0000000  0.0000000  0.8660938  0.0000000 
#>        V41        V42        V43        V44        V45        V46        V47 
#>  0.0000000  0.0000000  0.0000000  0.0000000  0.7859369  0.0000000  0.0000000 
#>        V48        V49         V5        V50        V51        V52        V53 
#>  0.0000000  1.1591990  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V54        V55        V56        V57        V58        V59         V6 
#>  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000  0.0000000 
#>        V60         V7         V8         V9 
#>  0.0000000  0.0000000  0.0000000  0.0000000 
#> attr(,"offset")
#> [1] 0.03597122
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2608696
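
A single train/test split, as above, gives a noisy error estimate. A sketch of estimating generalization error with cross-validation instead (the fold count is chosen arbitrarily):

```r
# Evaluate the learner with 5-fold cross-validation on the same task
resampling = rsmp("cv", folds = 5)
rr = resample(task, learner, resampling)

# Aggregate the misclassification error across folds
rr$aggregate(msr("classif.ce"))
```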