A classification partition tree in which a significance test is used to determine the univariate splits. Calls partykit::ctree() from package partykit.

## Dictionary

This Learner can be instantiated via lrn():

lrn("classif.ctree")

## Meta Information

• Task type: “classif”

• Predict Types: “response”, “prob”

• Feature Types: “integer”, “numeric”, “factor”, “ordered”

• Required Packages: mlr3, mlr3extralearners, partykit, sandwich, coin

## Parameters

| Id | Type | Default | Levels | Range |
|---|---|---|---|---|
| teststat | character | quadratic | quadratic, maximum | - |
| splitstat | character | quadratic | quadratic, maximum | - |
| splittest | logical | FALSE | TRUE, FALSE | - |
| testtype | character | Bonferroni | Bonferroni, MonteCarlo, Univariate, Teststatistic | - |
| nmax | untyped | - | - | - |
| alpha | numeric | 0.05 | - | $[0, 1]$ |
| mincriterion | numeric | 0.95 | - | $[0, 1]$ |
| logmincriterion | numeric | - | - | $(-\infty, \infty)$ |
| minsplit | integer | 20 | - | $[1, \infty)$ |
| minbucket | integer | 7 | - | $[1, \infty)$ |
| minprob | numeric | 0.01 | - | $[0, 1]$ |
| stump | logical | FALSE | TRUE, FALSE | - |
| lookahead | logical | FALSE | TRUE, FALSE | - |
| MIA | logical | FALSE | TRUE, FALSE | - |
| nresample | integer | 9999 | - | $[1, \infty)$ |
| tol | numeric | - | - | $[0, \infty)$ |
| maxsurrogate | integer | 0 | - | $[0, \infty)$ |
| numsurrogate | logical | FALSE | TRUE, FALSE | - |
| mtry | integer | Inf | - | $[0, \infty)$ |
| maxdepth | integer | Inf | - | $[0, \infty)$ |
| multiway | logical | FALSE | TRUE, FALSE | - |
| splittry | integer | 2 | - | $[0, \infty)$ |
| intersplit | logical | FALSE | TRUE, FALSE | - |
| majority | logical | FALSE | TRUE, FALSE | - |
| caseweights | logical | FALSE | TRUE, FALSE | - |
| maxvar | integer | - | - | $[1, \infty)$ |
| applyfun | untyped | - | - | - |
| cores | integer | NULL | - | $(-\infty, \infty)$ |
| saveinfo | logical | TRUE | TRUE, FALSE | - |
| update | logical | FALSE | TRUE, FALSE | - |
| splitflavour | character | ctree | ctree, exhaustive | - |
| offset | untyped | - | - | - |
| cluster | untyped | - | - | - |
| scores | untyped | - | - | - |
| doFit | logical | TRUE | TRUE, FALSE | - |
| maxpts | integer | 25000 | - | $(-\infty, \infty)$ |
| abseps | numeric | 0.001 | - | $[0, \infty)$ |
| releps | numeric | 0 | - | $[0, \infty)$ |
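Hyperparameters can be set when the learner is constructed or changed afterwards via its parameter set. A minimal sketch (the specific values here are illustrative, not recommended defaults):

```r
library(mlr3)

# Instantiate with a stricter significance level and a depth limit
learner = lrn("classif.ctree", alpha = 0.01, maxdepth = 3)

# Parameters can also be changed after construction
learner$param_set$values$minbucket = 10
```

Lower values of `alpha` make the significance test more conservative, which typically yields smaller trees.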

## References

Hothorn T, Zeileis A (2015). “partykit: A Modular Toolkit for Recursive Partytioning in R.” Journal of Machine Learning Research, 16(118), 3905-3909. http://jmlr.org/papers/v16/hothorn15a.html.

Hothorn T, Hornik K, Zeileis A (2006). “Unbiased Recursive Partitioning: A Conditional Inference Framework.” Journal of Computational and Graphical Statistics, 15(3), 651–674. doi:10.1198/106186006X133933, https://doi.org/10.1198/106186006X133933.

## Author

sumny

## Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifCTree

## Methods

### Public methods

### Method new()

Creates a new instance of this R6 class.

#### Arguments

deep: Whether to make a deep clone.

## Examples

# Define the Learner
learner = mlr3::lrn("classif.ctree")
print(learner)
#> <LearnerClassifCTree:classif.ctree>: Conditional Inference Tree
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3extralearners, partykit, sandwich, coin
#> * Predict Types:  [response], prob
#> * Feature Types: integer, numeric, factor, ordered
#> * Properties: multiclass, twoclass, weights

# Define a Task
task = mlr3::tsk("sonar")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> 
#> Model formula:
#> Class ~ V1 + V10 + V11 + V12 + V13 + V14 + V15 + V16 + V17 + 
#>     V18 + V19 + V2 + V20 + V21 + V22 + V23 + V24 + V25 + V26 + 
#>     V27 + V28 + V29 + V3 + V30 + V31 + V32 + V33 + V34 + V35 + 
#>     V36 + V37 + V38 + V39 + V4 + V40 + V41 + V42 + V43 + V44 + 
#>     V45 + V46 + V47 + V48 + V49 + V5 + V50 + V51 + V52 + V53 + 
#>     V54 + V55 + V56 + V57 + V58 + V59 + V6 + V60 + V7 + V8 + 
#>     V9
#> 
#> Fitted party:
#> [1] root
#> |   [2] V11 <= 0.1675
#> |   |   [3] V4 <= 0.0505: R (n = 42, err = 4.8%)
#> |   |   [4] V4 > 0.0505: M (n = 7, err = 28.6%)
#> |   [5] V11 > 0.1675
#> |   |   [6] V17 <= 0.9039: M (n = 82, err = 19.5%)
#> |   |   [7] V17 > 0.9039: R (n = 8, err = 12.5%)
#> 
#> Number of inner nodes:    3
#> Number of terminal nodes: 4

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce
#>  0.3043478
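Because the learner also supports the "prob" predict type, class probabilities can be requested instead of hard labels. A minimal sketch, reusing the task and split from above (the output is omitted since it depends on the random partition):

```r
# Switch to probability predictions and retrain
learner$predict_type = "prob"
learner$train(task, row_ids = ids$train)
predictions = learner$predict(task, row_ids = ids$test)

# Score with a probability-based measure, e.g. log loss
predictions$score(mlr3::msr("classif.logloss"))
```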