Multinomial Logistic Regression model with a ridge estimator. Calls RWeka::Logistic() from RWeka.

Custom mlr3 parameters

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size

  • Reason for change: the ids of the control arguments listed above were changed because their original ids contain irregular patterns (dashes, which are replaced with underscores).
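The renamed ids are what you pass when configuring the learner through mlr3. A minimal sketch, assuming mlr3 and mlr3extralearners are installed:

```r
library(mlr3)
library(mlr3extralearners)

# Use the mlr3 ids (underscores), not RWeka's original dashed ids
learner = lrn("classif.logistic",
  output_debug_info = FALSE,
  num_decimal_places = 4
)
```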

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.logistic")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, RWeka

Parameters

Id                          Type     Default  Levels       Range
subset                      untyped  -                     -
na.action                   untyped  -                     -
C                           logical  FALSE    TRUE, FALSE  -
R                           numeric  -                     (-∞, ∞)
M                           integer  -1                    (-∞, ∞)
output_debug_info           logical  FALSE    TRUE, FALSE  -
do_not_check_capabilities   logical  FALSE    TRUE, FALSE  -
num_decimal_places          integer  2                     [1, ∞)
batch_size                  integer  100                   [1, ∞)
options                     untyped  NULL                  -
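Parameters can be set at construction or later via the parameter set. In Weka's Logistic classifier, `R` corresponds to the ridge penalty (`-R`) and `M` to the maximum number of iterations (`-M`, where -1 means iterate until convergence); a sketch, assuming mlr3 and mlr3extralearners are installed:

```r
library(mlr3)
library(mlr3extralearners)

# Set a ridge penalty and an iteration cap at construction time
learner = lrn("classif.logistic", R = 1e-4, M = 200)

# Equivalently, update values on an existing learner
learner$param_set$set_values(R = 1e-3)
learner$param_set$values
```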

References

le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.

Author

damirpolat

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic

Active bindings

marshaled

(logical(1))
Whether the learner has been marshaled.

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifLogistic$new()


Method marshal()

Marshal the learner's model.

Usage

LearnerClassifLogistic$marshal(...)

Arguments

...

(any)
Additional arguments passed to mlr3::marshal_model().


Method unmarshal()

Unmarshal the learner's model.

Usage

LearnerClassifLogistic$unmarshal(...)

Arguments

...

(any)
Additional arguments passed to mlr3::unmarshal_model().
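
Marshaling converts the fitted Weka model into a serializable form, which is needed when the trained learner is sent across R processes (e.g. with parallel resampling); unmarshaling restores it for prediction. A minimal sketch, assuming mlr3 and mlr3extralearners are installed:

```r
library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.logistic")
learner$train(tsk("sonar"))

# Marshal before serializing the learner, unmarshal before predicting again
learner$marshal()
learner$marshaled   # TRUE while the model is in marshaled form
learner$unmarshal()
```

In everyday use this is handled automatically by mlr3's parallelization machinery; calling these methods by hand is only needed when serializing the learner yourself.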


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLogistic$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("classif.logistic")
print(learner)
#> 
#> ── <LearnerClassifLogistic> (classif.logistic): Multinomial Logistic Regression 
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and RWeka
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: marshal, missings, multiclass, and twoclass
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                        -319.7405
#> V10                        143.3096
#> V11                       -115.5473
#> V12                        359.1172
#> V13                        -59.5483
#> V14                       -163.0889
#> V15                         69.1769
#> V16                       -156.1123
#> V17                          78.473
#> V18                        133.4094
#> V19                       -151.5679
#> V2                        -289.6295
#> V20                        255.9344
#> V21                        -45.6872
#> V22                       -109.1203
#> V23                          87.523
#> V24                         84.2039
#> V25                       -149.1048
#> V26                         22.1749
#> V27                         88.3255
#> V28                        -55.7696
#> V29                         13.8322
#> V3                         520.3769
#> V30                        224.5219
#> V31                       -296.2414
#> V32                        139.2094
#> V33                         54.9288
#> V34                       -162.1537
#> V35                         60.2933
#> V36                        -41.2519
#> V37                        -14.6447
#> V38                       -106.3384
#> V39                        171.5428
#> V4                        -154.2462
#> V40                       -101.3427
#> V41                         68.2614
#> V42                         23.1633
#> V43                         40.0839
#> V44                         -1.6503
#> V45                        -88.4136
#> V46                        104.1644
#> V47                          7.1541
#> V48                       -204.7146
#> V49                       1142.6128
#> V5                        -328.8177
#> V50                      -2079.7725
#> V51                       4579.0345
#> V52                       -307.0909
#> V53                      -1830.1418
#> V54                       1113.6991
#> V55                      -3757.0179
#> V56                       1624.0653
#> V57                      -2101.5198
#> V58                       6103.8443
#> V59                       -799.8802
#> V6                         -123.138
#> V60                       2669.3503
#> V7                        -277.0438
#> V8                        -259.7176
#> V9                         190.5272
#> Intercept                  -120.381
#> 
#> 
#> Odds Ratios...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                                0
#> V10           1.7321543514070125E62
#> V11                               0
#> V12             9.1749201545192E155
#> V13                               0
#> V14                               0
#> V15           1.1044579515276084E30
#> V16                               0
#> V17            1.203300760179293E34
#> V18            8.688595791903211E57
#> V19                               0
#> V2                                0
#> V20          1.4154390456799912E111
#> V21                               0
#> V22                               0
#> V23            1.025101834012129E38
#> V24           3.7093507938871767E36
#> V25                               0
#> V26                 4270129898.9739
#> V27           2.2871110725518256E38
#> V28                               0
#> V29                    1016787.3553
#> V3            9.927384581480468E225
#> V30           3.2257447898307643E97
#> V31                               0
#> V32           2.8700098440661738E60
#> V33             7.16621468708653E23
#> V34                               0
#> V35           1.5313198806700384E26
#> V36                               0
#> V37                               0
#> V38                               0
#> V39           3.1629352239566695E74
#> V4                                0
#> V40                               0
#> V41            4.421289656227932E29
#> V42           1.1473853828555153E10
#> V43          2.55984258125421856E17
#> V44                           0.192
#> V45                               0
#> V46           1.7299888103127722E45
#> V47                        1279.401
#> V48                               0
#> V49                        Infinity
#> V5                                0
#> V50                               0
#> V51                        Infinity
#> V52                               0
#> V53                               0
#> V54                        Infinity
#> V55                               0
#> V56                        Infinity
#> V57                               0
#> V58                        Infinity
#> V59                               0
#> V6                                0
#> V60                        Infinity
#> V7                                0
#> V8                                0
#> V9              5.55793014673588E82
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.3768116
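
Since the learner supports the "prob" predict type, probability predictions can be requested as well. A sketch continuing the example above (`task` and `ids` as defined there):

```r
# Request probability predictions and score with AUC instead of
# classification error
learner = lrn("classif.logistic", predict_type = "prob")
learner$train(task, row_ids = ids$train)
learner$predict(task, row_ids = ids$test)$score(msr("classif.auc"))
```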