Oblique Random Forest Classifier
Source: R/learner_aorsf_classif_aorsf.R
mlr_learners_classif.aorsf.Rd

Accelerated oblique random classification forest.
Calls aorsf::orsf() from aorsf.
Note that although the learner has the property "missings" and can in
principle deal with missing values, this behaviour has to be configured via
the parameter na_action.
Initial parameter values
n_thread: This parameter is initialized to 1 (default is 0) to avoid conflicts with the mlr3 parallelization.
pred_simplify: This parameter has to be TRUE, otherwise the response is NA in the prediction.
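These initial values can be overridden at construction time. A minimal sketch, assuming mlr3 and mlr3extralearners are installed; the specific parameter values are illustrative:

```r
library(mlr3)
library(mlr3extralearners)

# Override the initialized n_thread and configure missing-value handling.
learner = lrn("classif.aorsf",
  n_thread = 2,                  # instead of the mlr3-side initial value of 1
  na_action = "impute_meanmode"  # impute missing values instead of failing
)

# Inspect the currently set parameter values.
learner$param_set$values
```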
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “integer”, “numeric”, “factor”, “ordered”
Required Packages: mlr3, mlr3extralearners, aorsf
Parameters
| Id | Type | Default | Levels | Range |
| --- | --- | --- | --- | --- |
| attach_data | logical | TRUE | TRUE, FALSE | - |
| epsilon | numeric | 1e-09 | - | \([0, \infty)\) |
| importance | character | anova | none, anova, negate, permute | - |
| importance_max_pvalue | numeric | 0.01 | - | \([1e-04, 0.9999]\) |
| leaf_min_events | integer | 1 | - | \([1, \infty)\) |
| leaf_min_obs | integer | 5 | - | \([1, \infty)\) |
| max_iter | integer | 20 | - | \([1, \infty)\) |
| method | character | glm | glm, net, pca, random | - |
| mtry | integer | NULL | - | \([1, \infty)\) |
| mtry_ratio | numeric | - | - | \([0, 1]\) |
| n_retry | integer | 3 | - | \([0, \infty)\) |
| n_split | integer | 5 | - | \([1, \infty)\) |
| n_thread | integer | - | - | \([0, \infty)\) |
| n_tree | integer | 500 | - | \([1, \infty)\) |
| na_action | character | fail | fail, impute_meanmode | - |
| net_mix | numeric | 0.5 | - | \((-\infty, \infty)\) |
| oobag | logical | FALSE | TRUE, FALSE | - |
| oobag_eval_every | integer | NULL | - | \([1, \infty)\) |
| oobag_fun | untyped | NULL | - | - |
| oobag_pred_type | character | prob | none, leaf, prob, class | - |
| pred_aggregate | logical | TRUE | TRUE, FALSE | - |
| sample_fraction | numeric | 0.632 | - | \([0, 1]\) |
| sample_with_replacement | logical | TRUE | TRUE, FALSE | - |
| scale_x | logical | FALSE | TRUE, FALSE | - |
| split_min_events | integer | 5 | - | \([1, \infty)\) |
| split_min_obs | integer | 10 | - | \([1, \infty)\) |
| split_min_stat | numeric | NULL | - | \([0, \infty)\) |
| split_rule | character | gini | gini, cstat | - |
| target_df | integer | NULL | - | \([1, \infty)\) |
| tree_seeds | integer | NULL | - | \([1, \infty)\) |
| verbose_progress | logical | FALSE | TRUE, FALSE | - |
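As a hedged sketch of how parameters from this table can be set (assumes mlr3 and mlr3extralearners are installed; the chosen values are illustrative, not recommendations):

```r
library(mlr3)
library(mlr3extralearners)

# Set parameters at construction time ...
learner = lrn("classif.aorsf", n_tree = 1000, split_rule = "cstat")

# ... or update them afterwards via the parameter set.
learner$param_set$values$leaf_min_obs = 10
```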
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifObliqueRandomForest
Methods
Inherited methods
mlr3::Learner$base_learner()
mlr3::Learner$configure()
mlr3::Learner$encapsulate()
mlr3::Learner$format()
mlr3::Learner$help()
mlr3::Learner$predict()
mlr3::Learner$predict_newdata()
mlr3::Learner$print()
mlr3::Learner$reset()
mlr3::Learner$selected_features()
mlr3::Learner$train()
mlr3::LearnerClassif$predict_newdata_fast()
Method oob_error()
OOB concordance error, extracted from the model slot eval_oobag$stat_values.
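A minimal sketch of retrieving this value after training (assumes mlr3, mlr3extralearners, and aorsf are installed):

```r
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
learner = lrn("classif.aorsf")
learner$train(task)

# Returns the OOB concordance error extracted from the fitted model.
learner$oob_error()
```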
Examples
# Define the Learner
learner = lrn("classif.aorsf")
print(learner)
#>
#> ── <LearnerClassifObliqueRandomForest> (classif.aorsf): Oblique Random Forest Cl
#> • Model: -
#> • Parameters: n_thread=1
#> • Packages: mlr3, mlr3extralearners, and aorsf
#> • Predict Types: [response] and prob
#> • Feature Types: integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: importance, missings, multiclass, oob_error, twoclass, and
#> weights
#> • Other settings: use_weights = 'use', predict_raw = 'FALSE'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> ---------- Oblique random classification forest
#>
#> Linear combinations: Logistic regression
#> N observations: 139
#> N classes: 2
#> N trees: 500
#> N predictors total: 60
#> N predictors per node: 8
#> Average leaves per tree: 4.73
#> Min observations in leaf: 5
#> OOB stat value: 0.92
#> OOB stat type: AUC-ROC
#> Variable importance: anova
#>
#> -----------------------------------------
print(learner$importance())
#> V11 V52 V12 V10 V22 V36
#> 0.436464088 0.427230047 0.357843137 0.355769231 0.320754717 0.312195122
#> V21 V49 V9 V45 V51 V23
#> 0.296650718 0.295774648 0.289855072 0.289719626 0.266990291 0.244635193
#> V13 V1 V48 V46 V37 V44
#> 0.242857143 0.236111111 0.229357798 0.225490196 0.222797927 0.205479452
#> V47 V20 V29 V4 V24 V35
#> 0.204878049 0.192893401 0.172043011 0.157635468 0.146788991 0.146739130
#> V19 V43 V28 V2 V18 V57
#> 0.144230769 0.142156863 0.142105263 0.137931034 0.120418848 0.117647059
#> V16 V31 V40 V34 V41 V14
#> 0.112903226 0.109375000 0.109004739 0.100961538 0.091891892 0.091370558
#> V53 V30 V17 V15 V8 V25
#> 0.089473684 0.084577114 0.079096045 0.075757576 0.071090047 0.070652174
#> V27 V5 V32 V54 V60 V33
#> 0.069444444 0.066298343 0.065573770 0.064356436 0.061224490 0.061032864
#> V38 V42 V50 V26 V3 V7
#> 0.056994819 0.052631579 0.051282051 0.042857143 0.039603960 0.037234043
#> V6 V39 V58 V55 V56 V59
#> 0.035000000 0.026200873 0.025510204 0.019704433 0.009615385 0.000000000
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.2753623
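As a hedged variant of the example above, the learner also supports the "prob" predict type, which allows scoring with probability-based measures such as AUC (continuing the session above; the exact score will vary with the random partition):

```r
# Request probability predictions instead of hard class labels.
learner_prob = lrn("classif.aorsf", predict_type = "prob")
learner_prob$train(task, row_ids = ids$train)
predictions = learner_prob$predict(task, row_ids = ids$test)

# Score with AUC instead of classification error.
predictions$score(msr("classif.auc"))
```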