Classification tree with logistic regression models at the leaves. Calls RWeka::LMT() from package RWeka.

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.LMT")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob” (see the probability sketch after this list)

  • Feature Types: “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, RWeka
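
As noted above, the learner supports the “prob” predict type. A minimal sketch (assuming the RWeka and mlr3extralearners packages are installed and loaded) of requesting class probabilities:

learner = mlr3::lrn("classif.LMT", predict_type = "prob")
task = mlr3::tsk("sonar")
learner$train(task)

# The prediction object now carries a probability matrix with one column per class
head(learner$predict(task)$prob)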

Parameters

Id                              Type     Default  Levels       Range
subset                          untyped  -                     -
na.action                       untyped  -                     -
B                               logical  FALSE    TRUE, FALSE  -
R                               logical  FALSE    TRUE, FALSE  -
C                               logical  FALSE    TRUE, FALSE  -
P                               logical  FALSE    TRUE, FALSE  -
I                               integer  -                     \([1, \infty)\)
M                               integer  15                    \([1, \infty)\)
W                               numeric  0                     \([0, 1]\)
A                               logical  FALSE    TRUE, FALSE  -
doNotMakeSplitPointActualValue  logical  FALSE    TRUE, FALSE  -
output_debug_info               logical  FALSE    TRUE, FALSE  -
do_not_check_capabilities       logical  FALSE    TRUE, FALSE  -
num_decimal_places              integer  2                     \([1, \infty)\)
batch_size                      integer  100                   \([1, \infty)\)
options                         untyped  NULL                  -

Custom mlr3 parameters

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size

  • Reason for change: these control arguments are renamed because their original Weka ids contain dashes, an irregular naming pattern for mlr3 parameter ids. A brief sketch of setting parameters through the renamed ids follows.
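
For illustration, a sketch of setting hyperparameters through the renamed ids; the values below are arbitrary and only demonstrate the interface:

learner = mlr3::lrn("classif.LMT",
  M = 30,                  # minimum number of instances at which a node is considered for splitting (Weka -M)
  num_decimal_places = 4   # renamed from Weka's num-decimal-places
)

# Inspect the values that will be passed on to RWeka::LMT()
learner$param_set$values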

References

Landwehr, Niels, Hall, Mark, Frank, Eibe (2005). “Logistic model trees.” Machine learning, 59(1), 161–205.

See also

Author

henrifnk

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLMT

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifLMT$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLMT$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = mlr3::lrn("classif.LMT")
print(learner)
#> <LearnerClassifLMT:classif.LMT>: Tree-based Model
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3extralearners, RWeka
#> * Predict Types:  [response], prob
#> * Feature Types: integer, numeric, factor, ordered
#> * Properties: multiclass, twoclass

# Define a Task
task = mlr3::tsk("sonar")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Logistic model tree 
#> ------------------
#> : LM_1:53/53 (139)
#> 
#> Number of Leaves  : 	1
#> 
#> Size of the Tree : 	1
#> LM_1:
#> Class M :
#> -2.94 + 
#> [V1] * 5.75 +
#> [V10] * -1.34 +
#> [V11] * 3.44 +
#> [V12] * 1.85 +
#> [V15] * -0.37 +
#> [V16] * -0.76 +
#> [V2] * 3.56 +
#> [V21] * 1.15 +
#> [V24] * 1.18 +
#> [V28] * -0.39 +
#> [V3] * -7.59 +
#> [V30] * 0.84 +
#> [V31] * -1.17 +
#> [V36] * -1.52 +
#> [V37] * -0.43 +
#> [V38] * 0.52 +
#> [V39] * 0.49 +
#> [V4] * 7.19 +
#> [V40] * -1.61 +
#> [V44] * 2.91 +
#> [V45] * 2.24 +
#> [V48] * 4.53 +
#> [V49] * 10.64 +
#> [V50] * -21.57 +
#> [V51] * 24.93 +
#> [V52] * 21.72 +
#> [V54] * 30.72 +
#> [V55] * -18.47 +
#> [V57] * -24.76 +
#> [V59] * 31.81 +
#> [V6] * -1.4 +
#> [V7] * -2.31 +
#> [V8] * -2.4 +
#> [V9] * 1.62
#> 
#> Class R :
#> 2.94 + 
#> [V1] * -5.75 +
#> [V10] * 1.34 +
#> [V11] * -3.44 +
#> [V12] * -1.85 +
#> [V15] * 0.37 +
#> [V16] * 0.76 +
#> [V2] * -3.56 +
#> [V21] * -1.15 +
#> [V24] * -1.18 +
#> [V28] * 0.39 +
#> [V3] * 7.59 +
#> [V30] * -0.84 +
#> [V31] * 1.17 +
#> [V36] * 1.52 +
#> [V37] * 0.43 +
#> [V38] * -0.52 +
#> [V39] * -0.49 +
#> [V4] * -7.19 +
#> [V40] * 1.61 +
#> [V44] * -2.91 +
#> [V45] * -2.24 +
#> [V48] * -4.53 +
#> [V49] * -10.64 +
#> [V50] * 21.57 +
#> [V51] * -24.93 +
#> [V52] * -21.72 +
#> [V54] * -30.72 +
#> [V55] * 18.47 +
#> [V57] * 24.76 +
#> [V59] * -31.81 +
#> [V6] * 1.4  +
#> [V7] * 2.31 +
#> [V8] * 2.4  +
#> [V9] * -1.62
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>   0.173913
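
As a follow-up sketch, the same learner can also be evaluated with cross-validation instead of a single train/test split; the aggregated error will differ from the value above depending on the folds:

# 3-fold cross-validation of the LMT learner on the sonar task
resampling = mlr3::rsmp("cv", folds = 3)
rr = mlr3::resample(task, learner, resampling)
rr$aggregate(mlr3::msr("classif.ce"))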