
Multinomial logistic regression model with a ridge estimator. Calls RWeka::Logistic() from package RWeka.

Custom mlr3 parameters

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size

  • Reason for change: the ids of these control arguments were changed because the original ids contain hyphens, an irregular pattern that is not valid for mlr3 parameter ids.
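As an illustrative sketch (assuming mlr3 and mlr3extralearners are installed and loaded), the renamed control arguments are set through their mlr3 ids; the learner maps them back to the original hyphenated Weka flags internally:

```r
library(mlr3)
library(mlr3extralearners)

# set the renamed control arguments via their mlr3 ids
# (values chosen purely for illustration)
learner = lrn("classif.logistic",
  num_decimal_places = 4,
  batch_size = 50
)

learner$param_set$values
```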

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.logistic")
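Equivalently, lrn() is sugar for a lookup in the mlr_learners dictionary, so the learner can also be retrieved directly:

```r
library(mlr3)
library(mlr3extralearners)

# retrieve the learner from the dictionary instead of using lrn()
learner = mlr3::mlr_learners$get("classif.logistic")
learner$id
```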

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, RWeka

Parameters

Id                         Type     Default  Levels       Range
subset                     untyped  -        -            -
na.action                  untyped  -        -            -
C                          logical  FALSE    TRUE, FALSE  -
R                          numeric  -        -            \((-\infty, \infty)\)
M                          integer  -1       -            \((-\infty, \infty)\)
output_debug_info          logical  FALSE    TRUE, FALSE  -
do_not_check_capabilities  logical  FALSE    TRUE, FALSE  -
num_decimal_places         integer  2        -            \([1, \infty)\)
batch_size                 integer  100      -            \([1, \infty)\)
options                    untyped  NULL     -            -
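In Weka's Logistic, R is the ridge penalty and M the maximum number of iterations (-1 means iterate until convergence). A hedged sketch of a regularized fit with an iteration cap, using purely illustrative values:

```r
library(mlr3)
library(mlr3extralearners)

# stronger ridge penalty and capped iterations (illustrative values)
learner = lrn("classif.logistic", R = 1e-4, M = 200)
learner$param_set$values
```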

References

le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.

See also

Author

damirpolat

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifLogistic$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLogistic$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
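A small usage sketch: a deep clone yields an independent copy, so modifying the copy's hyperparameters leaves the original untouched.

```r
library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.logistic")
copy = learner$clone(deep = TRUE)

# changes to the deep copy do not propagate back to the original
copy$param_set$values$batch_size = 50
learner$param_set$values
```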

Examples

# Define the Learner
learner = mlr3::lrn("classif.logistic")
print(learner)
#> <LearnerClassifLogistic:classif.logistic>: Multinomial Logistic Regression
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, RWeka
#> * Predict Types:  [response], prob
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: missings, multiclass, twoclass

# Define a Task
task = mlr3::tsk("sonar")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                         126.6623
#> V10                         16.4023
#> V11                        -25.6143
#> V12                        255.1191
#> V13                        -59.8847
#> V14                        -44.6748
#> V15                        155.8188
#> V16                       -147.8675
#> V17                        -39.4575
#> V18                         69.0097
#> V19                        -22.2105
#> V2                         297.7799
#> V20                        124.0989
#> V21                          13.314
#> V22                        -89.1604
#> V23                        -82.6749
#> V24                        235.5432
#> V25                         -3.7101
#> V26                        -176.012
#> V27                        122.9785
#> V28                        -34.5803
#> V29                        -70.8677
#> V3                       -1266.1626
#> V30                        169.9365
#> V31                       -183.9388
#> V32                         77.1596
#> V33                         14.2359
#> V34                       -186.0841
#> V35                        232.0078
#> V36                        -85.5062
#> V37                        -157.258
#> V38                         96.8776
#> V39                          6.3471
#> V4                         526.8573
#> V40                        -34.7045
#> V41                        -48.3228
#> V42                         94.9773
#> V43                        -144.529
#> V44                        156.4525
#> V45                          37.536
#> V46                         84.3824
#> V47                        151.7934
#> V48                       -130.1998
#> V49                        811.7201
#> V5                         -74.8193
#> V50                       -912.1905
#> V51                       -962.0994
#> V52                        676.2085
#> V53                       2214.2212
#> V54                       1573.4411
#> V55                      -2539.9879
#> V56                      -1617.2445
#> V57                        -87.7868
#> V58                        107.2792
#> V59                       1912.5582
#> V6                         250.4293
#> V60                        -963.335
#> V7                        -624.0902
#> V8                        -161.3361
#> V9                         265.6141
#> Intercept                  -56.6901
#> 
#> 
#> Odds Ratios...
#>                               Class
#> Variable                          M
#> ===================================
#> V1            1.0203581867611569E55
#> V10                   13287228.0996
#> V11                               0
#> V12           6.263578810260428E110
#> V13                               0
#> V14                               0
#> V15           4.6906518866336226E67
#> V16                               0
#> V17                               0
#> V18            9.344363338221749E29
#> V19                               0
#> V2           2.1093959033123522E129
#> V20            7.860470500855482E53
#> V21                     605637.1107
#> V22                               0
#> V23                               0
#> V24          1.9730267287681956E102
#> V25                          0.0245
#> V26                               0
#> V27           2.5638344983929824E53
#> V28                               0
#> V29                               0
#> V3                                0
#> V30            6.345827116536759E73
#> V31                               0
#> V32            3.235921106937759E33
#> V33                    1522506.2455
#> V34                               0
#> V35          5.7503765995980766E100
#> V36                               0
#> V37                               0
#> V38           1.1841102746078983E42
#> V39                        570.8376
#> V4            6.474467644874807E228
#> V40                               0
#> V41                               0
#> V42            1.770609285712916E41
#> V43                               0
#> V44            8.840239240906072E67
#> V45           2.0029006323039576E16
#> V46           4.4343661762691566E36
#> V47            8.376401502867353E65
#> V48                               0
#> V49                        Infinity
#> V5                                0
#> V50                               0
#> V51                               0
#> V52          4.7166850291763065E293
#> V53                        Infinity
#> V54                        Infinity
#> V55                               0
#> V56                               0
#> V57                               0
#> V58           3.8974508639124203E46
#> V59                        Infinity
#> V6            5.754993891974163E108
#> V60                               0
#> V7                                0
#> V8                                0
#> V9            2.263256348549002E115
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.3768116