Multinomial Logistic Regression model with a ridge estimator. Calls RWeka::Logistic() from RWeka.

Custom mlr3 parameters

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size

  • Reason for change: the ids of these control arguments were changed because their original ids contain dashes, an irregular pattern for mlr3 parameter names.
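The renamed ids are set through the usual mlr3 interface and are translated back to Weka's dashed control arguments internally. A minimal sketch, assuming mlr3 and a package registering this learner (such as mlr3extralearners) are installed:

```r
library(mlr3)
library(mlr3extralearners)

# Use the mlr3-style ids; they map back to Weka's
# "batch-size" and "num-decimal-places" control arguments.
learner = lrn("classif.logistic",
  batch_size = 50,
  num_decimal_places = 4
)
learner$param_set$values
```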

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.logistic")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, RWeka

Parameters

Id                          Type      Default   Levels        Range
subset                      untyped   -         -             -
na.action                   untyped   -         -             -
C                           logical   FALSE     TRUE, FALSE   -
R                           numeric   -         -             \((-\infty, \infty)\)
M                           integer   -1        -             \((-\infty, \infty)\)
output_debug_info           logical   FALSE     TRUE, FALSE   -
do_not_check_capabilities   logical   FALSE     TRUE, FALSE   -
num_decimal_places          integer   2         -             \([1, \infty)\)
batch_size                  integer   100       -             \([1, \infty)\)
options                     untyped   NULL      -             -
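For example, the ridge value R and the iteration limit M from the table above can be passed at construction. A sketch, assuming the learner is available via mlr3extralearners; the comments reflect Weka's documented meaning of these options:

```r
library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.logistic",
  R = 1e-4,  # ridge value in the log-likelihood
  M = 500    # maximum number of iterations (-1 means run until convergence)
)
learner$param_set$values
```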

References

le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.

See also

Author

damirpolat

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLogistic$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
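A deep clone copies the learner together with its mutable state, so changes to the copy do not propagate back to the original. A short sketch, assuming the learner is available via mlr3extralearners:

```r
library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.logistic")
learner2 = learner$clone(deep = TRUE)

# Modifying the clone leaves the original learner untouched
learner2$param_set$values$batch_size = 50
learner$param_set$values
```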

Examples

# Define the Learner
learner = mlr3::lrn("classif.logistic")
print(learner)
#> <LearnerClassifLogistic:classif.logistic>: Multinomial Logistic Regression
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, RWeka
#> * Predict Types:  [response], prob
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: missings, multiclass, twoclass

# Define a Task
task = mlr3::tsk("sonar")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                         733.5445
#> V10                         195.232
#> V11                       -364.0916
#> V12                        390.4501
#> V13                        -35.2359
#> V14                        -72.0029
#> V15                         33.4353
#> V16                       -137.2323
#> V17                         30.7406
#> V18                         41.3565
#> V19                        -39.1134
#> V2                         893.5245
#> V20                        170.4279
#> V21                       -189.8346
#> V22                         11.7356
#> V23                        139.5471
#> V24                         11.4528
#> V25                         10.2996
#> V26                       -114.7194
#> V27                         254.679
#> V28                       -320.3922
#> V29                         79.3297
#> V3                       -1942.0212
#> V30                        318.7298
#> V31                       -457.7951
#> V32                        158.2991
#> V33                         38.4505
#> V34                       -206.4537
#> V35                        239.3246
#> V36                        -58.5228
#> V37                       -283.5473
#> V38                        120.8432
#> V39                         57.7903
#> V4                        1156.7887
#> V40                        -49.7414
#> V41                         41.4745
#> V42                       -162.7426
#> V43                        304.6831
#> V44                       -270.2434
#> V45                         121.163
#> V46                         58.0014
#> V47                        270.6726
#> V48                        704.6193
#> V49                        627.9559
#> V5                        -333.9649
#> V50                      -4823.6422
#> V51                        -13.2875
#> V52                        1873.113
#> V53                       2621.2681
#> V54                       1051.0112
#> V55                        -46.4369
#> V56                       -1054.498
#> V57                       1570.7881
#> V58                       1242.2772
#> V59                       1400.8704
#> V6                         122.4493
#> V60                      -1557.7076
#> V7                        -191.2181
#> V8                        -493.9877
#> V9                         424.6157
#> Intercept                 -108.7556
#> 
#> 
#> Odds Ratios...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                         Infinity
#> V10            6.140336069602938E84
#> V11                               0
#> V12           3.718166273282378E169
#> V13                               0
#> V14                               0
#> V15             3.31703224473088E14
#> V16                               0
#> V17           2.2411177022159086E13
#> V18           9.1387309052609254E17
#> V19                               0
#> V2                         Infinity
#> V20           1.0372917197805367E74
#> V21                               0
#> V22                     124944.6764
#> V23           4.0230474256105404E60
#> V24                      94164.9364
#> V25                      29720.0404
#> V26                               0
#> V27          4.0334463460270255E110
#> V28                               0
#> V29           2.8342189878903416E34
#> V3                                0
#> V30          2.6460087024209776E138
#> V31                               0
#> V32            5.602950563511022E68
#> V33           4.9983402122977712E16
#> V34                               0
#> V35           8.656322083388165E103
#> V36                               0
#> V37                               0
#> V38           3.0307030239318224E52
#> V39           1.2531652365954075E25
#> V4                         Infinity
#> V40                               0
#> V41          1.02835755296345114E18
#> V42                               0
#> V43           2.099833344853066E132
#> V44                               0
#> V45           4.1727821995001807E52
#> V46           1.5477549609542595E25
#> V47           3.561283342482355E117
#> V48          1.0286681621613908E306
#> V49           5.221340756994553E272
#> V5                                0
#> V50                               0
#> V51                               0
#> V52                        Infinity
#> V53                        Infinity
#> V54                        Infinity
#> V55                               0
#> V56                               0
#> V57                        Infinity
#> V58                        Infinity
#> V59                        Infinity
#> V6            1.5103394615225896E53
#> V60                               0
#> V7                                0
#> V8                                0
#> V9            2.560074669168945E184
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2318841
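Since the learner also supports the "prob" predict type, predicted class probabilities can be requested and scored with a probability-based measure. A sketch continuing the example above (`task` and `ids` as defined there); `classif.auc` is a standard mlr3 measure for binary tasks:

```r
# Switch to probability predictions before training
learner$predict_type = "prob"
learner$train(task, row_ids = ids$train)

predictions = learner$predict(task, row_ids = ids$test)
predictions$score(mlr3::msr("classif.auc"))
```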