Random forest for regression. Calls randomForest::randomForest() from package randomForest.

Dictionary

This Learner can be instantiated via lrn():

lrn("regr.randomForest")

Meta Information

  • Task type: “regr”

  • Predict Types: “response”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, randomForest

Parameters

Id           Type       Default  Levels                 Range
ntree        integer    500      -                      \([1, \infty)\)
mtry         integer    -        -                      \([1, \infty)\)
replace      logical    TRUE     TRUE, FALSE            -
strata       untyped    -        -                      -
sampsize     untyped    -        -                      -
nodesize     integer    5        -                      \([1, \infty)\)
maxnodes     integer    -        -                      \([1, \infty)\)
importance   character  FALSE    mse, nodepurity, none  -
localImp     logical    FALSE    TRUE, FALSE            -
proximity    logical    FALSE    TRUE, FALSE            -
oob.prox     logical    -        TRUE, FALSE            -
norm.votes   logical    TRUE     TRUE, FALSE            -
do.trace     logical    FALSE    TRUE, FALSE            -
keep.forest  logical    TRUE     TRUE, FALSE            -
keep.inbag   logical    FALSE    TRUE, FALSE            -
predict.all  logical    FALSE    TRUE, FALSE            -
nodes        logical    FALSE    TRUE, FALSE            -
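
The hyperparameters above can be set when the learner is constructed or changed later through its parameter set. A minimal sketch, assuming mlr3 and mlr3extralearners are loaded; the values ntree = 1000, mtry = 4 and nodesize = 3 are purely illustrative:

# Set hyperparameters at construction time
learner = mlr3::lrn("regr.randomForest", ntree = 1000, mtry = 4, nodesize = 3)

# ...or update them afterwards via the parameter set
learner$param_set$values$importance = "nodepurity"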

References

Breiman, Leo (2001). “Random Forests.” Machine Learning, 45(1), 5–32. ISSN 1573-0565, doi:10.1023/A:1010933404324.

Author

pat-s

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrRandomForest

Methods

Method new()

Creates a new instance of this R6 class.


Method importance()

The importance scores are extracted from the slot importance. Parameter 'importance' must be set to either "mse" or "nodepurity".

Usage

LearnerRegrRandomForest$importance()

Returns

Named numeric().


Method oob_error()

OOB errors are extracted from the model slot mse.

Usage

LearnerRegrRandomForest$oob_error()

Returns

numeric(1).
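
A minimal usage sketch, assuming mlr3 and mlr3extralearners are loaded (the "mtcars" task is chosen only for illustration); the learner must be trained before the OOB error is available:

learner = mlr3::lrn("regr.randomForest")
learner$train(mlr3::tsk("mtcars"))
learner$oob_error()  # OOB mean squared error from the model's mse slot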


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrRandomForest$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = mlr3::lrn("regr.randomForest", importance = "mse")
print(learner)
#> <LearnerRegrRandomForest:regr.randomForest>: Random Forest
#> * Model: -
#> * Parameters: importance=mse
#> * Packages: mlr3, mlr3extralearners, randomForest
#> * Predict Types:  [response]
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: importance, oob_error, weights

# Define a Task
task = mlr3::tsk("mtcars")
# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> 
#> Call:
#>  randomForest(formula = formula, data = data, importance = TRUE) 
#>                Type of random forest: regression
#>                      Number of trees: 500
#> No. of variables tried at each split: 3
#> 
#>           Mean of squared residuals: 6.883324
#>                     % Var explained: 71.56
print(learner$importance())
#>         disp          cyl           wt           hp           vs         carb 
#>  6.025737416  5.416409006  4.510035699  4.097977334  1.105169787  1.055724134 
#>         qsec         gear           am         drat 
#>  0.655218498  0.205851019  0.071407624 -0.005183683 

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#>  10.9549
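
# For illustration, the predictions can also be scored with an explicitly
# chosen measure, e.g. root mean squared error via msr() from mlr3
predictions$score(mlr3::msr("regr.rmse"))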