Oblique Random Forest Classifier
Source: R/learner_aorsf_classif_aorsf.R
Accelerated oblique random classification forest.
Calls aorsf::orsf() from package aorsf.
Note that although the learner has the property "missings" and can in principle deal with missing values, this behaviour has to be configured via the parameter na_action.
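As a minimal sketch, missing-value handling can be enabled when constructing the learner. The level "impute_meanmode" is assumed from aorsf::orsf(); check the learner's param_set for the levels available in your aorsf version.
# Sketch: configure missing-value handling via na_action
# ("impute_meanmode" is an assumed level from aorsf::orsf();
# inspect learner$param_set for the levels actually available).
library(mlr3)
library(mlr3extralearners)
learner = lrn("classif.aorsf", na_action = "impute_meanmode")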
Initial parameter values
n_thread: This parameter is initialized to 1 (default is 0) to avoid conflicts with the mlr3 parallelization.
pred_simplify: This parameter has to be TRUE, otherwise the response is NA in the prediction.
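As a brief sketch, the initial values can be inspected and overridden through the learner's parameter set; the value 4 for n_thread below is only an illustrative choice.
learner = lrn("classif.aorsf")
learner$param_set$values$n_thread  # 1, set by the learner at construction
learner$configure(n_thread = 4)    # override, e.g. to let aorsf use 4 threads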
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifObliqueRandomForest
Methods
Inherited methods
mlr3::Learner$base_learner()
mlr3::Learner$configure()
mlr3::Learner$encapsulate()
mlr3::Learner$format()
mlr3::Learner$help()
mlr3::Learner$predict()
mlr3::Learner$predict_newdata()
mlr3::Learner$print()
mlr3::Learner$reset()
mlr3::Learner$selected_features()
mlr3::Learner$train()
mlr3::LearnerClassif$predict_newdata_fast()
Method oob_error()
The out-of-bag (OOB) concordance error, extracted from the model slot eval_oobag$stat_values.
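As a usage sketch (assuming a trained learner, as in the example below), the OOB error can be queried directly from the learner.
# Sketch: query the OOB error after training; errors if the model is untrained.
learner$oob_error()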
Examples
# Define the Learner
learner = lrn("classif.aorsf", importance = "anova")
print(learner)
#>
#> ── <LearnerClassifObliqueRandomForest> (classif.aorsf): Oblique Random Forest Cl
#> • Model: -
#> • Parameters: importance=anova, n_thread=1
#> • Packages: mlr3, mlr3extralearners, and aorsf
#> • Predict Types: [response] and prob
#> • Feature Types: integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: importance, missings, multiclass, oob_error, twoclass, and
#> weights
#> • Other settings: use_weights = 'use'
# Define a Task
task = tsk("breast_cancer")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> ---------- Oblique random classification forest
#>
#> Linear combinations: Logistic regression
#> N observations: 458
#> N classes: 2
#> N trees: 500
#> N predictors total: 9
#> N predictors per node: 3
#> Average leaves per tree: 3.224
#> Min observations in leaf: 5
#> OOB stat value: 0.99
#> OOB stat type: AUC-ROC
#> Variable importance: anova
#>
#> -----------------------------------------
print(learner$importance())
#> bare_nuclei cl_thickness bl_cromatin cell_shape normal_nucleoli
#> 0.5589354 0.5093168 0.5045992 0.4427481 0.4207077
#> cell_size marg_adhesion mitoses epith_c_size
#> 0.4178344 0.3569588 0.2880795 0.2866324
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.02222222
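As a possible extension (not part of the original example output), the learner also supports probability predictions, which can be scored with AUC on this two-class task.
# Sketch: probability predictions scored with AUC (illustrative, output omitted).
learner_prob = lrn("classif.aorsf", predict_type = "prob")
learner_prob$train(task, row_ids = ids$train)
learner_prob$predict(task, row_ids = ids$test)$score(msr("classif.auc"))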