Classification Decision Tree Learner
Source: R/learner_evtree_classif_evtree.R
mlr_learners_classif.evtree.Rd
Evolutionary learning of globally optimal classification trees.
Calls evtree::evtree() from package evtree.
Initial parameter values
The parameters pmutatemajor, pmutateminor, pcrossover, psplit, and pprune are scaled internally to sum to 100.
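The internal rescaling means only the relative sizes of these operator probabilities matter. A minimal sketch (the parameter names are taken from the list above; the chosen values are illustrative, not recommended defaults):

```r
library(mlr3)
library(mlr3extralearners)

# Equal weights for all five evolutionary operators; since the values
# are rescaled to sum to 100, each effectively becomes 20.
learner = lrn("classif.evtree",
  pmutatemajor = 1, pmutateminor = 1,
  pcrossover   = 1, psplit       = 1, pprune = 1
)
```

Passing c(20, 20, 20, 20, 20) directly would yield the same configuration after rescaling.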
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifEvtree
Methods
Inherited methods
mlr3::Learner$base_learner(), mlr3::Learner$configure(), mlr3::Learner$encapsulate(), mlr3::Learner$format(), mlr3::Learner$help(), mlr3::Learner$predict(), mlr3::Learner$predict_newdata(), mlr3::Learner$print(), mlr3::Learner$reset(), mlr3::Learner$selected_features(), mlr3::Learner$train(), mlr3::LearnerClassif$predict_newdata_fast()
Examples
# Define the Learner
learner = lrn("classif.evtree")
print(learner)
#>
#> ── <LearnerClassifEvtree> (classif.evtree): Evolutionary learning of globally op
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and evtree
#> • Predict Types: [response] and prob
#> • Feature Types: integer, numeric, and factor
#> • Encapsulation: none (fallback: -)
#> • Properties: multiclass, twoclass, and weights
#> • Other settings: use_weights = 'use'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#>
#> Model formula:
#> Class ~ V1 + V10 + V11 + V12 + V13 + V14 + V15 + V16 + V17 +
#> V18 + V19 + V2 + V20 + V21 + V22 + V23 + V24 + V25 + V26 +
#> V27 + V28 + V29 + V3 + V30 + V31 + V32 + V33 + V34 + V35 +
#> V36 + V37 + V38 + V39 + V4 + V40 + V41 + V42 + V43 + V44 +
#> V45 + V46 + V47 + V48 + V49 + V5 + V50 + V51 + V52 + V53 +
#> V54 + V55 + V56 + V57 + V58 + V59 + V6 + V60 + V7 + V8 +
#> V9
#>
#> Fitted party:
#> [1] root
#> | [2] V40 < 0.6121
#> | | [3] V37 < 0.4786
#> | | | [4] V11 < 0.1995
#> | | | | [5] V54 < 0.0172: R (n = 31, err = 3.2%)
#> | | | | [6] V54 >= 0.0172: M (n = 8, err = 12.5%)
#> | | | [7] V11 >= 0.1995: M (n = 52, err = 11.5%)
#> | | [8] V37 >= 0.4786
#> | | | [9] V45 < 0.2662: R (n = 24, err = 4.2%)
#> | | | [10] V45 >= 0.2662: M (n = 15, err = 6.7%)
#> | [11] V40 >= 0.6121: R (n = 9, err = 11.1%)
#>
#> Number of inner nodes: 5
#> Number of terminal nodes: 6
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.3478261
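score() defaults to classification error (classif.ce) for classification tasks; other measures can be requested explicitly. A short sketch using standard mlr3 measures (assumes the learner was trained and predictions obtained as above):

```r
# Score the same predictions with accuracy instead of
# classification error; both are built-in mlr3 measures.
predictions$score(msr("classif.acc"))

# Probability-based measures such as AUC require the learner to be
# constructed with predict_type = "prob" before training, e.g.:
# learner = lrn("classif.evtree", predict_type = "prob")
```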