
Random forest for regression. Calls randomForest::randomForest() from package randomForest.

Dictionary

This Learner can be instantiated via lrn():

lrn("regr.randomForest")

Meta Information

  • Task type: “regr”

  • Predict Types: “response”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, randomForest

Parameters

Id           Type       Default  Levels                 Range
ntree        integer    500      -                      \([1, \infty)\)
mtry         integer    -        -                      \([1, \infty)\)
replace      logical    TRUE     TRUE, FALSE            -
strata       untyped    -        -                      -
sampsize     untyped    -        -                      -
nodesize     integer    5        -                      \([1, \infty)\)
maxnodes     integer    -        -                      \([1, \infty)\)
importance   character  FALSE    mse, nodepurity, none  -
localImp     logical    FALSE    TRUE, FALSE            -
proximity    logical    FALSE    TRUE, FALSE            -
oob.prox     logical    -        TRUE, FALSE            -
norm.votes   logical    TRUE     TRUE, FALSE            -
do.trace     logical    FALSE    TRUE, FALSE            -
keep.forest  logical    TRUE     TRUE, FALSE            -
keep.inbag   logical    FALSE    TRUE, FALSE            -
predict.all  logical    FALSE    TRUE, FALSE            -
nodes        logical    FALSE    TRUE, FALSE            -
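
These hyperparameters can also be changed after construction through the learner's param_set. A minimal sketch; the values below are illustrative only:

learner = mlr3::lrn("regr.randomForest")

# update individual hyperparameters on the existing learner
learner$param_set$values$ntree = 1000
learner$param_set$values$nodesize = 10
learner$param_set$values$importance = "mse"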

References

Breiman, Leo (2001). “Random Forests.” Machine Learning, 45(1), 5–32. ISSN 1573-0565, doi:10.1023/A:1010933404324.

See also

Author

pat-s

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrRandomForest

Methods



Method new()

Creates a new instance of this R6 class.

Usage

LearnerRegrRandomForest$new()


Method importance()

The importance scores are extracted from the model slot importance. The parameter importance must be set to either "mse" or "nodepurity" for the scores to be available.

Usage

LearnerRegrRandomForest$importance()

Returns

Named numeric().
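
A brief usage sketch, assuming a learner trained with importance = "mse" as in the Examples below:

imp = learner$importance()
# inspect the three features with the highest importance scores
head(sort(imp, decreasing = TRUE), 3)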


Method oob_error()

OOB errors are extracted from the model slot mse.

Usage

LearnerRegrRandomForest$oob_error()

Returns

numeric(1).
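
A brief usage sketch, assuming the learner has already been trained as in the Examples below:

# out-of-bag error, extracted from the fitted model's mse slot
learner$oob_error()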


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrRandomForest$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
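
A brief usage sketch; with deep = TRUE the copy also clones R6 fields such as the parameter set, so it can be modified independently of the original:

# create an independent copy of the learner
learner2 = learner$clone(deep = TRUE)
# modifying the copy does not affect the original learner
learner2$param_set$values$ntree = 250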

Examples

# Define the Learner
learner = mlr3::lrn("regr.randomForest", importance = "mse")
print(learner)
#> <LearnerRegrRandomForest:regr.randomForest>: Random Forest
#> * Model: -
#> * Parameters: importance=mse
#> * Packages: mlr3, mlr3extralearners, randomForest
#> * Predict Types:  [response]
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: importance, oob_error, weights

# Define a Task
task = mlr3::tsk("mtcars")
# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> 
#> Call:
#>  randomForest(formula = formula, data = data, importance = TRUE) 
#>                Type of random forest: regression
#>                      Number of trees: 500
#> No. of variables tried at each split: 3
#> 
#>           Mean of squared residuals: 7.698618
#>                     % Var explained: 76.87
print(learner$importance())
#>        wt        hp      disp       cyl      drat        vs      qsec      gear 
#> 9.8382158 7.6920089 6.3367211 4.7241004 3.3964722 0.6863920 0.4944054 0.4261850 
#>      carb        am 
#> 0.2696248 0.2211775 

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#> 6.088367
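
Other regression measures can be passed to $score() instead of the default; for illustration, root mean squared error:

# score the same predictions with RMSE instead of the default MSE
predictions$score(mlr3::msr("regr.rmse"))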