Bayesian treed linear model regression. Calls tgp::btlm() from tgp.

Factor features are converted to dummy variables (reference encoding) before fitting. If factors are present, basemax is set to the number of non-factor features so that tree proposals account for the numeric part of the design.
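
The following is a minimal conceptual sketch of this pre-processing, assuming a data.frame X of features; the helper name encode_for_btlm is hypothetical and not part of the learner's API.

# Conceptual sketch only: numeric columns first, factors reference-encoded,
# basemax set to the number of numeric columns.
encode_for_btlm = function(X) {
  is_fct = vapply(X, is.factor, logical(1))
  num_part = X[!is_fct]
  fct_part = if (any(is_fct)) {
    # model.matrix() with an intercept uses reference (treatment) contrasts;
    # dropping the intercept column keeps only the indicator columns
    model.matrix(~ ., data = X[is_fct])[, -1, drop = FALSE]
  } else {
    NULL
  }
  list(
    X = cbind(as.matrix(num_part), fct_part),
    basemax = sum(!is_fct)
  )
}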

Initial parameter values

  • verb is initialized to 0, which suppresses progress output during fitting.

  • pred.n is initialized to FALSE to skip computing predictions at the training inputs.
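
Both values are only initial settings and can be overridden when the learner is constructed, for example:

# re-enable tgp's progress output and predictions at the training inputs
learner = lrn("regr.btlm", verb = 1, pred.n = TRUE)
learner$param_set$values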

Dictionary

This Learner can be instantiated via lrn():

lrn("regr.btlm")

Meta Information

  • Task type: “regr”

  • Predict Types: “response”, “se”

  • Feature Types: “integer”, “numeric”, “factor”

  • Required Packages: mlr3, mlr3extralearners, tgp
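
Because "se" is among the supported predict types, standard-error predictions can be requested by setting the predict type, either at construction or on an existing learner:

learner = lrn("regr.btlm", predict_type = "se")
# or, on an existing learner:
learner$predict_type = "se"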

Parameters

Id      Type       Default              Levels                                Range
bprior  character  bflat                b0, b0not, bflat, bmle, bmznot, bmzt  -
BTE     untyped    c(2000L, 7000L, 2L)                                        -
Ds2x    logical    FALSE                TRUE, FALSE                           -
improv  logical    FALSE                TRUE, FALSE                           -
itemps  untyped    NULL                                                       -
krige   logical    TRUE                 TRUE, FALSE                           -
m0r1    logical    TRUE                 TRUE, FALSE                           -
meanfn  character  linear               constant, linear                      -
pred.n  logical    -                    TRUE, FALSE                           -
R       integer    1                                                          [1, ∞)
trace   logical    FALSE                TRUE, FALSE                           -
tree    untyped    c(0.5, 2)                                                  -
verb    integer    -                                                          [0, 4]
zcov    logical    FALSE                TRUE, FALSE                           -
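
Hyperparameters from this table can be set via lrn() or on the learner's param_set; the values below are only an illustration:

# use the MLE beta prior and two restarts of the MCMC rounds
learner = lrn("regr.btlm", bprior = "bmle", R = 2)
# equivalently, after construction:
learner$param_set$set_values(bprior = "bmle", R = 2)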

References

Gramacy RB (2007). “tgp: An R Package for Bayesian Nonstationary, Semiparametric Nonlinear Regression and Design by Treed Gaussian Process Models.” Journal of Statistical Software, 19(9), 1–46. doi:10.18637/jss.v019.i09.

Gramacy RB, Taddy M (2010). “Categorical Inputs, Sensitivity Analysis, Optimization and Importance Tempering with tgp Version 2, an R Package for Treed Gaussian Process Models.” Journal of Statistical Software, 33(6), 1–48. doi:10.18637/jss.v033.i06.

Author

awinterstetter

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrBtlm

Methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerRegrBtlm$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrBtlm$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
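
For instance, a deep clone produces an independent copy of the learner, including nested R6 objects such as its parameter set:

learner = lrn("regr.btlm")
# independent copy; changing its parameters leaves `learner` untouched
learner2 = learner$clone(deep = TRUE)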

Examples

# Define the Learner
learner = lrn("regr.btlm")
print(learner)
#> 
#> ── <LearnerRegrBtlm> (regr.btlm): Bayesian Treed Linear Model ──────────────────
#> • Model: -
#> • Parameters: pred.n=FALSE, verb=0
#> • Packages: mlr3, mlr3extralearners, and tgp
#> • Predict Types: [response] and se
#> • Feature Types: integer, numeric, and factor
#> • Encapsulation: none (fallback: -)
#> • Properties:
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("mtcars")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> $model
#> 
#> This is a 'tgp' class object.  
#> It is basically a list with the following entries:
#> 
#>  [1] X        n        d        Z        nn       Xsplit   BTE      R       
#>  [9] linburn  g        dparams  itemps   bimprov  ess      gpcs     response
#> [17] improv   parts    trees    posts    params   m0r1    
#> 
#> See ?btgp for an explanation of the individual entries.  
#> See plot.tgp and tgp.trees for help with visualization.
#> 
#> The $trace field, if it exists, is of class 'tgptraces' 
#> and has its own print statement
#> 
#> 
#> $factor_levels
#> list()
#> 
#> $column_names
#>  [1] "am"   "carb" "cyl"  "disp" "drat" "gear" "hp"   "qsec" "vs"   "wt"  
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#> 31.32846
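
The same prediction object can be scored with other regression measures as well, for example:

# root mean squared error and mean absolute error
predictions$score(msrs(c("regr.rmse", "regr.mae")))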