Classification Random Forest Learner
Source: R/learner_randomForest_classif_randomForest.R
Random forest for classification.
Calls randomForest::randomForest() from package randomForest.
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”
Required Packages: mlr3, mlr3extralearners, randomForest
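The meta information above can also be read off a constructed learner object; a minimal sketch using the standard mlr3 Learner fields:

library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.randomForest")
learner$predict_types   # "response" and "prob"
learner$feature_types   # feature types the learner can handle
learner$packages        # packages required for training and prediction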
Parameters
| Id | Type | Default | Levels | Range |
|----|------|---------|--------|-------|
| ntree | integer | 500 | - | \([1, \infty)\) |
| mtry | integer | - | - | \([1, \infty)\) |
| replace | logical | TRUE | TRUE, FALSE | - |
| classwt | untyped | NULL | - | - |
| cutoff | untyped | - | - | - |
| strata | untyped | - | - | - |
| sampsize | untyped | - | - | - |
| nodesize | integer | 1 | - | \([1, \infty)\) |
| maxnodes | integer | - | - | \([1, \infty)\) |
| importance | character | FALSE | accuracy, gini, none | - |
| localImp | logical | FALSE | TRUE, FALSE | - |
| proximity | logical | FALSE | TRUE, FALSE | - |
| oob.prox | logical | - | TRUE, FALSE | - |
| norm.votes | logical | TRUE | TRUE, FALSE | - |
| do.trace | logical | FALSE | TRUE, FALSE | - |
| keep.forest | logical | TRUE | TRUE, FALSE | - |
| keep.inbag | logical | FALSE | TRUE, FALSE | - |
| predict.all | logical | FALSE | TRUE, FALSE | - |
| nodes | logical | FALSE | TRUE, FALSE | - |
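Hyperparameters can be set at construction or changed later through the learner's param_set; a minimal sketch (the particular values below are purely illustrative, not tuned recommendations):

library(mlr3)
library(mlr3extralearners)

# set values at construction ...
learner = lrn("classif.randomForest", ntree = 1000, mtry = 5, importance = "gini")

# ... or update an existing learner via its ParamSet
learner$param_set$set_values(nodesize = 5, maxnodes = 30)
learner$param_set$values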
References
Breiman, Leo (2001). “Random Forests.” Machine Learning, 45(1), 5–32. ISSN 1573-0565, doi:10.1023/A:1010933404324.
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifRandomForest
Methods
Inherited methods
mlr3::Learner$base_learner(), mlr3::Learner$configure(), mlr3::Learner$encapsulate(), mlr3::Learner$format(), mlr3::Learner$help(), mlr3::Learner$predict(), mlr3::Learner$predict_newdata(), mlr3::Learner$print(), mlr3::Learner$reset(), mlr3::Learner$selected_features(), mlr3::Learner$train(), mlr3::LearnerClassif$predict_newdata_fast()
Method importance()
The importance scores are extracted from the slot importance of the fitted model.
The parameter importance must be set to "accuracy" or "gini" for these scores to be available.
Returns
Named numeric().
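A minimal usage sketch for this method (the sonar task and the "gini" setting are just illustrative choices):

library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.randomForest", importance = "gini")
learner$train(tsk("sonar"))

# named numeric vector of importance scores, sorted and truncated for readability
head(sort(learner$importance(), decreasing = TRUE), 10)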
Examples
# Define the Learner
learner = lrn("classif.randomForest", importance = "accuracy")
print(learner)
#>
#> ── <LearnerClassifRandomForest> (classif.randomForest): Random Forest ──────────
#> • Model: -
#> • Parameters: importance=accuracy
#> • Packages: mlr3, mlr3extralearners, and randomForest
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: importance, multiclass, oob_error, twoclass, and weights
#> • Other settings: use_weights = 'use'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#>
#> Call:
#> randomForest(formula = formula, data = data, classwt = classwt, cutoff = cutoff, importance = TRUE)
#> Type of random forest: classification
#> Number of trees: 500
#> No. of variables tried at each split: 7
#>
#> OOB estimate of error rate: 13.67%
#> Confusion matrix:
#> M R class.error
#> M 55 10 0.1538462
#> R 9 65 0.1216216
print(learner$importance())
#> V11 V12 V51 V37 V36
#> 2.961915e-02 2.038097e-02 1.359420e-02 1.309458e-02 1.119992e-02
#> V10 V35 V21 V5 V9
#> 1.038850e-02 8.786844e-03 8.302010e-03 8.182046e-03 7.411557e-03
#> V20 V27 V31 V52 V6
#> 7.399631e-03 5.809263e-03 5.620521e-03 4.302922e-03 4.134273e-03
#> V28 V13 V17 V49 V8
#> 4.126226e-03 3.975124e-03 3.885517e-03 3.746687e-03 3.185719e-03
#> V16 V47 V32 V44 V34
#> 3.036852e-03 2.891373e-03 2.704739e-03 2.649754e-03 2.374604e-03
#> V46 V18 V48 V29 V45
#> 2.294651e-03 2.190581e-03 2.142964e-03 1.964688e-03 1.879002e-03
#> V19 V23 V33 V26 V15
#> 1.857487e-03 1.804511e-03 1.758691e-03 1.749341e-03 1.503312e-03
#> V24 V59 V4 V30 V25
#> 1.478331e-03 1.304623e-03 1.289170e-03 1.225222e-03 1.199366e-03
#> V39 V22 V58 V40 V60
#> 1.186566e-03 1.146700e-03 1.001019e-03 7.777476e-04 6.524044e-04
#> V38 V42 V41 V3 V2
#> 6.124710e-04 6.101710e-04 5.921023e-04 5.651946e-04 5.394374e-04
#> V14 V54 V1 V53 V7
#> 5.210223e-04 4.563725e-04 3.344990e-04 1.959320e-04 8.978948e-05
#> V43 V57 V56 V55 V50
#> 8.653027e-05 7.855835e-05 -9.553726e-05 -4.602891e-04 -4.648362e-04
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.2028986
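Since the learner also supports the "prob" predict type, probability-based measures can be scored as well. A hedged follow-up sketch continuing the example above (task, ids, and learner as defined there; measure ids come from mlr3's built-in measures):

learner$predict_type = "prob"
learner$train(task, row_ids = ids$train)
predictions = learner$predict(task, row_ids = ids$test)
predictions$score(msrs(c("classif.ce", "classif.auc")))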