Classification Decision Tree Learner
Source: R/learner_evtree_classif_evtree.R
mlr_learners_classif.evtree.Rd

Evolutionary learning of globally optimal classification trees.
Calls evtree::evtree() from package evtree.
Initial parameter values
The parameters pmutatemajor, pmutateminor, pcrossover, psplit, and pprune are scaled internally to sum to 100.
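As a sketch of how this plays out in practice (assuming the learner is provided by mlr3extralearners, as the source file path suggests), the operator probabilities can be set on construction; the values need not sum to 100 themselves, since they are rescaled internally before being passed to evtree:

```r
library(mlr3)
library(mlr3extralearners)  # assumed to provide classif.evtree

# Evolutionary operator probabilities; rescaled internally to sum to 100,
# so e.g. 2/2/4/1/1 is equivalent to 20/20/40/10/10
learner = lrn("classif.evtree",
  pmutatemajor = 2,
  pmutateminor = 2,
  pcrossover   = 4,
  psplit       = 1,
  pprune       = 1
)

# Inspect the values stored in the parameter set
learner$param_set$values
```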
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifEvtree
Methods
Inherited methods
mlr3::Learner$base_learner()
mlr3::Learner$configure()
mlr3::Learner$encapsulate()
mlr3::Learner$format()
mlr3::Learner$help()
mlr3::Learner$predict()
mlr3::Learner$predict_newdata()
mlr3::Learner$print()
mlr3::Learner$reset()
mlr3::Learner$selected_features()
mlr3::Learner$train()
mlr3::LearnerClassif$predict_newdata_fast()
Examples
# Define the Learner
learner = lrn("classif.evtree")
print(learner)
#>
#> ── <LearnerClassifEvtree> (classif.evtree): Evolutionary learning of globally op
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and evtree
#> • Predict Types: [response] and prob
#> • Feature Types: integer, numeric, and factor
#> • Encapsulation: none (fallback: -)
#> • Properties: multiclass, twoclass, and weights
#> • Other settings: use_weights = 'use'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#>
#> Model formula:
#> Class ~ V1 + V10 + V11 + V12 + V13 + V14 + V15 + V16 + V17 +
#> V18 + V19 + V2 + V20 + V21 + V22 + V23 + V24 + V25 + V26 +
#> V27 + V28 + V29 + V3 + V30 + V31 + V32 + V33 + V34 + V35 +
#> V36 + V37 + V38 + V39 + V4 + V40 + V41 + V42 + V43 + V44 +
#> V45 + V46 + V47 + V48 + V49 + V5 + V50 + V51 + V52 + V53 +
#> V54 + V55 + V56 + V57 + V58 + V59 + V6 + V60 + V7 + V8 +
#> V9
#>
#> Fitted party:
#> [1] root
#> | [2] V37 < 0.5501
#> | | [3] V4 < 0.0608
#> | | | [4] V9 < 0.1038: R (n = 27, err = 0.0%)
#> | | | [5] V9 >= 0.1038
#> | | | | [6] V49 < 0.0289: R (n = 16, err = 12.5%)
#> | | | | [7] V49 >= 0.0289: M (n = 33, err = 9.1%)
#> | | [8] V4 >= 0.0608: M (n = 30, err = 13.3%)
#> | [9] V37 >= 0.5501
#> | | [10] V46 < 0.2047: R (n = 21, err = 9.5%)
#> | | [11] V46 >= 0.2047: M (n = 12, err = 16.7%)
#>
#> Number of inner nodes: 5
#> Number of terminal nodes: 6
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.2898551
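Since the learner lists prob among its predict types, the example above can be varied to request class probabilities instead of the default response; this is a sketch reusing the sonar task (again assuming mlr3extralearners provides the learner):

```r
library(mlr3)
library(mlr3extralearners)  # assumed to provide classif.evtree

task = tsk("sonar")
ids = partition(task)

# Request probability predictions instead of the default "response"
learner = lrn("classif.evtree", predict_type = "prob")
learner$train(task, row_ids = ids$train)
predictions = learner$predict(task, row_ids = ids$test)

# Per-class probability matrix for the test rows
head(predictions$prob)

# score() defaults to classif.ce; other measures can be passed explicitly
predictions$score(msr("classif.acc"))
```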