Bayesian Additive Regression Trees are similar to gradient boosting algorithms. The two-class classification problem is solved by 0-1 encoding of the target and thresholding the predicted posterior probability at p = 0.5 during prediction. Calls dbarts::bart() from package dbarts.
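
A minimal sketch of the two-class workflow described above, using the sonar example task that ships with mlr3 (the train/predict split is illustrative):

```r
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")  # built-in two-class example task
learner = lrn("classif.bart", predict_type = "prob")

learner$train(task, row_ids = 1:150)
pred = learner$predict(task, row_ids = 151:208)

# "response" is the class whose posterior probability exceeds p = 0.5
head(pred$prob)
```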

Dictionary

This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("classif.bart")
lrn("classif.bart")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, dbarts

Parameters

Id            | Type    | Default | Levels      | Range
------------- | ------- | ------- | ----------- | ---------------------
ntree         | integer | 200     | -           | \([1, \infty)\)
k             | numeric | 2       | -           | \([0, \infty)\)
power         | numeric | 2       | -           | \([0, \infty)\)
base          | numeric | 0.95    | -           | \([0, 1]\)
binaryOffset  | numeric | 0       | -           | \((-\infty, \infty)\)
ndpost        | integer | 1000    | -           | \([1, \infty)\)
nskip         | integer | 100     | -           | \([0, \infty)\)
printevery    | integer | 100     | -           | \([0, \infty)\)
keepevery     | integer | 1       | -           | \([1, \infty)\)
keeptrainfits | logical | TRUE    | TRUE, FALSE | -
usequants     | logical | FALSE   | TRUE, FALSE | -
numcut        | integer | 100     | -           | \([1, \infty)\)
printcutoffs  | integer | 0       | -           | \((-\infty, \infty)\)
verbose       | logical | FALSE   | TRUE, FALSE | -
nthread       | integer | 1       | -           | \((-\infty, \infty)\)
keepcall      | logical | TRUE    | TRUE, FALSE | -
sampleronly   | logical | FALSE   | TRUE, FALSE | -
seed          | integer | NA      | -           | \((-\infty, \infty)\)
proposalprobs | untyped | NULL    | -           | -
splitprobs    | untyped | NULL    | -           | -
keepsampler   | logical | -       | TRUE, FALSE | -
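
Hyperparameters from the table above can be set at construction or changed later through the parameter set; a short sketch (the values are illustrative, not tuned):

```r
library(mlr3)
library(mlr3extralearners)

# set values at construction ...
learner = lrn("classif.bart", ntree = 100, ndpost = 500)

# ... or modify them on an existing learner
learner$param_set$values$nskip = 250

learner$param_set$values
```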

Parameter Changes

  • Parameter: keeptrees

  • Original: FALSE

  • New: TRUE

  • Reason: Required for prediction

  • Parameter: offset

  • The parameter is removed because only dbarts::bart2() supports an offset during training; the offset argument of dbarts:::predict.bart is therefore irrelevant for dbarts::bart().

  • Parameter: nchain, combineChains, combinechains

  • The parameters are removed, as parallelization across multiple models is handled by the future framework.

  • Parameter: sigest, sigdf, sigquant, keeptrees

  • These parameters apply to regression only.
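
Since the nchain-related parameters are removed, parallelization of multiple model fits is delegated to the future framework; a sketch of parallel resampling under that assumption (backend and fold count are illustrative):

```r
library(mlr3)
library(mlr3extralearners)
library(future)

# parallel backend is configured via future, not via learner parameters
plan("multisession", workers = 2)

task = tsk("sonar")
learner = lrn("classif.bart", verbose = FALSE)

rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.ce"))
```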

References

Sparapani, Rodney, Spanbauer, Charles, McCulloch, Robert (2021). “Nonparametric machine learning and efficient computation with bayesian additive regression trees: the BART R package.” Journal of Statistical Software, 97, 1--66.

Chipman, A H, George, I E, McCulloch, E R (2010). “BART: Bayesian additive regression trees.” The Annals of Applied Statistics, 4(1), 266--298.

See also

Author

ck37

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifBart

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifBart$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifBart$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

learner = mlr3::lrn("classif.bart")
print(learner)
#> <LearnerClassifBart:classif.bart>: Bayesian Additive Regression Trees
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3extralearners, dbarts
#> * Predict Types:  [response], prob
#> * Feature Types: integer, numeric, factor, ordered
#> * Properties: twoclass, weights

# available parameters:
learner$param_set$ids()
#>  [1] "ntree"         "k"             "power"         "base"         
#>  [5] "binaryOffset"  "ndpost"        "nskip"         "printevery"   
#>  [9] "keepevery"     "keeptrainfits" "usequants"     "numcut"       
#> [13] "printcutoffs"  "verbose"       "nthread"       "keepcall"     
#> [17] "sampleronly"   "seed"          "proposalprobs" "splitprobs"   
#> [21] "keepsampler"