Regression BART (Bayesian Additive Regression Trees) Learner
mlr_learners_regr.bart.Rd
Bayesian Additive Regression Trees (BART) fit a sum-of-trees model and are similar in spirit to gradient boosting algorithms.
Calls dbarts::bart() from package dbarts.
Dictionary
This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("regr.bart")
lrn("regr.bart")
Meta Information
Task type: “regr”
Predict Types: “response”
Feature Types: “integer”, “numeric”, “factor”, “ordered”
Required Packages: mlr3, mlr3extralearners, dbarts
Parameters
Id | Type | Default | Levels | Range
ntree | integer | 200 | - | \([1, \infty)\)
sigest | untyped | - | - | -
sigdf | integer | 3 | - | \([1, \infty)\)
sigquant | numeric | 0.9 | - | \([0, 1]\)
k | numeric | 2 | - | \([0, \infty)\)
power | numeric | 2 | - | \([0, \infty)\)
base | numeric | 0.95 | - | \([0, 1]\)
ndpost | integer | 1000 | - | \([1, \infty)\)
nskip | integer | 100 | - | \([0, \infty)\)
printevery | integer | 100 | - | \([0, \infty)\)
keepevery | integer | 1 | - | \([1, \infty)\)
keeptrainfits | logical | TRUE | TRUE, FALSE | -
usequants | logical | FALSE | TRUE, FALSE | -
numcut | integer | 100 | - | \([1, \infty)\)
printcutoffs | integer | 0 | - | \((-\infty, \infty)\)
verbose | logical | FALSE | TRUE, FALSE | -
nthread | integer | 1 | - | \((-\infty, \infty)\)
keeptrees | logical | FALSE | TRUE, FALSE | -
keepcall | logical | TRUE | TRUE, FALSE | -
sampleronly | logical | FALSE | TRUE, FALSE | -
seed | integer | NA | - | \((-\infty, \infty)\)
proposalprobs | untyped | - | - | -
splitprobs | untyped | - | - | -
keepsampler | logical | - | TRUE, FALSE | -
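For example, parameters from this table can be set at construction time via lrn(); a minimal sketch with illustrative (not tuned) values:

library(mlr3)
library(mlr3extralearners)

learner = lrn(
  "regr.bart",
  ntree = 500,  # illustrative: more trees in the sum-of-trees ensemble
  k = 2,        # prior scale; larger values shrink leaf values more strongly
  nskip = 250   # burn-in iterations discarded before posterior draws
)
learner$param_set$values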
Custom mlr3 parameters
Parameter: offset
This parameter is removed because only dbarts::bart2() allows an offset during training; the offset parameter in dbarts:::predict.bart is therefore irrelevant for dbarts::bart().
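For reference, a minimal sketch of supplying an offset by calling dbarts::bart2() directly, outside of mlr3 (the data set and constant offset are purely illustrative):

library(dbarts)

fit = bart2(
  mpg ~ .,
  data = mtcars,
  offset = rep(0.5, nrow(mtcars)),  # illustrative constant training offset
  verbose = FALSE
)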
Parameters: nchain, combineChains, combinechains
These parameters are removed because the parallelization of multiple models is handled by the future package.
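A minimal sketch of this future-based parallelization, assuming a multisession backend is available; the resampling iterations below then run in parallel:

library(mlr3)
library(mlr3extralearners)

future::plan("multisession", workers = 2)

task = tsk("mtcars")
learner = lrn("regr.bart")
rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msr("regr.mse"))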
References
Sparapani, Rodney, Spanbauer, Charles, McCulloch, Robert (2021). “Nonparametric machine learning and efficient computation with Bayesian additive regression trees: the BART R package.” Journal of Statistical Software, 97, 1--66.
Chipman, Hugh A, George, Edward I, McCulloch, Robert E (2010). “BART: Bayesian additive regression trees.” The Annals of Applied Statistics, 4(1), 266--298.
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrBart
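The same chain can be inspected on a constructed learner; the output comment below is indicative:

class(mlr3::lrn("regr.bart"))
#> [1] "LearnerRegrBart" "LearnerRegr"     "Learner"         "R6"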
Examples
learner = mlr3::lrn("regr.bart")
print(learner)
#> <LearnerRegrBart:regr.bart>: Bayesian Additive Regression Trees
#> * Model: -
#> * Parameters: keeptrees=TRUE
#> * Packages: mlr3, mlr3extralearners, dbarts
#> * Predict Types: [response]
#> * Feature Types: integer, numeric, factor, ordered
#> * Properties: weights
# available parameters:
learner$param_set$ids()
#> [1] "ntree" "sigest" "sigdf" "sigquant"
#> [5] "k" "power" "base" "ndpost"
#> [9] "nskip" "printevery" "keepevery" "keeptrainfits"
#> [13] "usequants" "numcut" "printcutoffs" "verbose"
#> [17] "nthread" "keeptrees" "keepcall" "sampleronly"
#> [21] "seed" "proposalprobs" "splitprobs" "keepsampler"
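A minimal sketch of a full train/predict round trip, continuing the example above with the built-in "mtcars" task (the learner's keeptrees=TRUE default is what enables prediction):

task = mlr3::tsk("mtcars")
split = mlr3::partition(task)  # holdout split into train and test row ids
learner$train(task, row_ids = split$train)
prediction = learner$predict(task, row_ids = split$test)
prediction$score(mlr3::msr("regr.mse"))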