
Multinomial Logistic Regression model with a ridge estimator. Calls RWeka::Logistic() from package RWeka.

Custom mlr3 parameters

The ids of the following control arguments were changed because their original ids contain irregular patterns (hyphens). The renamed parameters are set via their new ids, as shown in the example after this list.

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size
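
For illustration, a minimal sketch of constructing the learner with the renamed parameters; the values are arbitrary and chosen only to show the new ids.

# Renamed control arguments are passed under their mlr3 ids,
# not the original hyphenated Weka ids (e.g. "batch-size").
# The values below are arbitrary illustrations.
learner = mlr3::lrn(
  "classif.logistic",
  output_debug_info = FALSE,
  do_not_check_capabilities = FALSE,
  num_decimal_places = 4,
  batch_size = 100
)
learner$param_set$values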

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.logistic")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, RWeka
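
As a short example, probability predictions (the "prob" predict type listed above) can be requested at construction or by switching an existing learner.

# Request class probabilities instead of the default "response"
learner = mlr3::lrn("classif.logistic", predict_type = "prob")
# or switch an existing learner
learner$predict_type = "prob"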

Parameters

Id                         Type     Default  Levels       Range
subset                     untyped  -        -            -
na.action                  untyped  -        -            -
C                          logical  FALSE    TRUE, FALSE  -
R                          numeric  -        -            \((-\infty, \infty)\)
M                          integer  -1       -            \((-\infty, \infty)\)
output_debug_info          logical  FALSE    TRUE, FALSE  -
do_not_check_capabilities  logical  FALSE    TRUE, FALSE  -
num_decimal_places         integer  2        -            \([1, \infty)\)
batch_size                 integer  100      -            \([1, \infty)\)
options                    untyped  NULL     -            -
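
As a sketch, hyperparameters from the table can be set at construction or changed later via the parameter set; the values below are arbitrary, and RWeka's Weka Option Wizard (WOW("Logistic")) documents the underlying Weka options.

# Set hyperparameters at construction (illustrative values only)
learner = mlr3::lrn("classif.logistic", R = 1e-8, M = -1)

# ... or modify them afterwards via the parameter set
learner$param_set$values$batch_size = 100
learner$param_set$values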

References

le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.

See also

Author

damirpolat

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifLogistic$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLogistic$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
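
For example, a deep clone produces an independent copy whose hyperparameters can be changed without affecting the original learner.

learner = mlr3::lrn("classif.logistic", batch_size = 100)
learner2 = learner$clone(deep = TRUE)
learner2$param_set$values$batch_size = 50
# the original is unaffected because the deep clone has its own parameter set
learner$param_set$values$batch_size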

Examples

# Define the Learner
learner = mlr3::lrn("classif.logistic")
print(learner)
#> <LearnerClassifLogistic:classif.logistic>: Multinomial Logistic Regression
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, RWeka
#> * Predict Types:  [response], prob
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: missings, multiclass, twoclass

# Define a Task
task = mlr3::tsk("sonar")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                         972.3861
#> V10                        290.8686
#> V11                        -95.3154
#> V12                        102.5803
#> V13                        263.9428
#> V14                       -240.7248
#> V15                        229.9665
#> V16                       -154.5266
#> V17                       -289.5627
#> V18                         360.923
#> V19                       -219.4057
#> V2                        -421.8133
#> V20                         74.6116
#> V21                          2.3785
#> V22                        101.2254
#> V23                          4.7451
#> V24                        131.0988
#> V25                        -30.1107
#> V26                        -129.719
#> V27                        104.5638
#> V28                        -82.5915
#> V29                         -31.745
#> V3                        -230.1082
#> V30                        253.0465
#> V31                        -229.719
#> V32                        141.1334
#> V33                        -30.9638
#> V34                         -1.8354
#> V35                         -4.7523
#> V36                        -53.6382
#> V37                         88.1524
#> V38                       -138.0823
#> V39                        191.1364
#> V4                        -120.3186
#> V40                       -198.8482
#> V41                         41.5236
#> V42                         11.7203
#> V43                        -98.2833
#> V44                        -25.3354
#> V45                        190.9621
#> V46                         -93.612
#> V47                        340.4143
#> V48                       -849.1848
#> V49                       2530.7623
#> V5                         345.6242
#> V50                      -3880.4774
#> V51                       2953.8419
#> V52                        208.7079
#> V53                       3050.5521
#> V54                        409.7566
#> V55                       -222.2284
#> V56                       3242.8867
#> V57                      -3007.8884
#> V58                       -576.6471
#> V59                      -2086.1146
#> V6                           2.2864
#> V60                        117.1887
#> V7                          14.1527
#> V8                         -303.784
#> V9                           5.6449
#> Intercept                 -205.3723
#> 
#> 
#> Odds Ratios...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                         Infinity
#> V10          2.1019185069640957E126
#> V11                               0
#> V12           3.5484851016804918E44
#> V13           4.254948080658715E114
#> V14                               0
#> V15            7.467453866369425E99
#> V16                               0
#> V17                               0
#> V18           5.583165644135464E156
#> V19                               0
#> V2                                0
#> V20           2.5317364364583453E32
#> V21                         10.7883
#> V22            9.154861878437914E43
#> V23                        115.0167
#> V24            8.619453284556147E56
#> V25                               0
#> V26                               0
#> V27           2.5792673427725075E45
#> V28                               0
#> V29                               0
#> V3                                0
#> V30           7.882980672076462E109
#> V31                               0
#> V32           1.9653521601549125E61
#> V33                               0
#> V34                          0.1596
#> V35                          0.0086
#> V36                               0
#> V37            1.923555615421264E38
#> V38                               0
#> V39           1.0220670304946964E83
#> V4                                0
#> V40                               0
#> V41          1.08010565203866419E18
#> V42                     123044.0927
#> V43                               0
#> V44                               0
#> V45            8.586185113878775E82
#> V46                               0
#> V47           6.919334143523433E147
#> V48                               0
#> V49                        Infinity
#> V5           1.2667058443792323E150
#> V50                               0
#> V51                        Infinity
#> V52             4.37210182496507E90
#> V53                        Infinity
#> V54           9.016130464311093E177
#> V55                               0
#> V56                        Infinity
#> V57                               0
#> V58                               0
#> V59                               0
#> V6                           9.8394
#> V60            7.841234848292064E50
#> V7                     1400997.9537
#> V8                                0
#> V9                         282.8375
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2028986