Regression gradient boosting learner. Calls bst::bst() from package bst.

Initial parameter values

  • Learner = "ls": The base learner type defaults to linear least squares.

  • xval = 0: Cross-validation is disabled (the upstream default is 10).

  • maxdepth = 1: Trees are restricted to a maximum depth of 1 (stumps).
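
These initial values can be overridden when the learner is constructed. A minimal sketch, assuming mlr3 and mlr3extralearners are attached and using only parameters from the table below:

# tree base learner with depth-2 trees and more boosting iterations
learner = lrn("regr.bst", Learner = "tree", maxdepth = 2, mstop = 100)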

Dictionary

This Learner can be instantiated via lrn():

lrn("regr.bst")
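
Alternatively, the learner can be retrieved from the mlr_learners dictionary, the standard mlr3 equivalent of lrn():

# look up the learner in the dictionary of learners
mlr_learners$get("regr.bst")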

Meta Information

  • Task type: "regr"

  • Predict Types: "response"

  • Feature Types: numeric

  • Required Packages: mlr3, mlr3extralearners, bst, rpart

Parameters

| Id             | Type      | Default  | Levels                                                           | Range           |
|----------------|-----------|----------|------------------------------------------------------------------|-----------------|
| center         | logical   | FALSE    | TRUE, FALSE                                                      | -               |
| coefir         | untyped   | NULL     | -                                                                | -               |
| cost           | numeric   | 0.5      | -                                                                | \([0, 1]\)      |
| cp             | numeric   | 0.01     | -                                                                | \([0, 1]\)      |
| df             | integer   | 4        | -                                                                | \([1, \infty)\) |
| family         | character | gaussian | gaussian, laplace, huber, rhuberDC, thingeDC, tbinomDC, binomdDC | -               |
| f.init         | untyped   | NULL     | -                                                                | -               |
| fk             | untyped   | NULL     | -                                                                | -               |
| intercept      | logical   | TRUE     | TRUE, FALSE                                                      | -               |
| iter           | integer   | 1        | -                                                                | \([1, \infty)\) |
| Learner        | character | ls       | ls, sm, tree                                                     | -               |
| maxdepth       | integer   | 1        | -                                                                | \([1, 30]\)     |
| maxsurrogate   | integer   | 5        | -                                                                | \([0, \infty)\) |
| minbucket      | integer   | -        | -                                                                | \([1, \infty)\) |
| minsplit       | integer   | 20       | -                                                                | \([1, \infty)\) |
| mstop          | integer   | 50       | -                                                                | \([1, \infty)\) |
| numsample      | integer   | 50       | -                                                                | \([1, \infty)\) |
| nu             | numeric   | 0.1      | -                                                                | \([0, 1]\)      |
| q              | numeric   | -        | -                                                                | \([0, 1]\)      |
| qh             | numeric   | -        | -                                                                | \([0, 1]\)      |
| s              | numeric   | -        | -                                                                | \([0, \infty)\) |
| sh             | numeric   | -        | -                                                                | \([0, \infty)\) |
| start          | logical   | FALSE    | TRUE, FALSE                                                      | -               |
| surrogatestyle | integer   | 0        | -                                                                | \([0, 1]\)      |
| threshold      | character | adaptive | adaptive, fixed                                                  | -               |
| trace          | logical   | FALSE    | TRUE, FALSE                                                      | -               |
| trun           | logical   | FALSE    | TRUE, FALSE                                                      | -               |
| twinboost      | logical   | FALSE    | TRUE, FALSE                                                      | -               |
| twintype       | integer   | 1        | -                                                                | \([1, 2]\)      |
| xselect.init   | untyped   | NULL     | -                                                                | -               |
| xval           | integer   | 10       | -                                                                | \([0, \infty)\) |
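
Hyperparameters can be inspected and updated after construction via the learner's param_set; a short sketch using the standard mlr3/paradox accessors:

learner = lrn("regr.bst")
learner$param_set$values                              # the initial values listed above
learner$param_set$set_values(mstop = 100, nu = 0.05)  # more iterations, smaller step size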

Author

annanzrv

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrBst

Methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerRegrBst$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrBst$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
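
For example (assuming a learner constructed as in the examples below), a deep clone yields a fully independent copy:

# changes to the clone do not affect the original learner
learner2 = learner$clone(deep = TRUE)
learner2$param_set$set_values(mstop = 100)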

Examples

# Define the Learner
learner = lrn("regr.bst")
print(learner)
#> 
#> ── <LearnerRegrBst> (regr.bst): Gradient Boosting ──────────────────────────────
#> • Model: -
#> • Parameters: Learner=ls, maxdepth=1, xval=0
#> • Packages: mlr3, mlr3extralearners, bst, and rpart
#> • Predict Types: [response]
#> • Feature Types: numeric
#> • Encapsulation: none (fallback: -)
#> • Properties:
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("mtcars")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> 
#> 	 Models Fitted with Gradient Boosting
#> 
#> Call:
#> bst::bst(x = data[, features, with = FALSE], y = data[[target]],     ctrl = ctrl, control.tree = ctrl_tree, learner = pars$Learner)
#> 
#> [1] "gaussian"
#> 
#> Base learner:  ls 
#> Number of boosting iterations: mstop = 50 
#> Step size:  0.1 
#> Offset:  20.3619 
#> 
#> Coefficients: 
#>           am         carb          cyl         disp         drat         gear 
#>  3.373246439 -0.298168952  0.000000000 -0.009836173  0.000000000  0.000000000 
#>           hp         qsec           vs           wt 
#>  0.000000000  0.000000000  4.570354486  0.000000000 
#> attr(,"offset")
#> [1] 20.3619
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#> 7.544578
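
By default, $score() uses the default regression measure (regr.mse, shown above). Other mlr3 measures can be passed explicitly, for example:

# evaluate with root mean squared error and mean absolute error instead
predictions$score(msr("regr.rmse"))
predictions$score(msr("regr.mae"))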