
Multinomial Logistic Regression model with a ridge estimator. Calls RWeka::Logistic() from package RWeka.

Custom mlr3 parameters

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size

  • Reason for change: the ids of these control arguments were renamed because their original ids contain hyphens, which do not conform to mlr3's parameter naming conventions.
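As a minimal sketch, the renamed parameters are set through lrn() under their mlr3 ids (underscores), not the original Weka ids (hyphens); the values below are purely illustrative:

```r
library(mlr3)
library(mlr3extralearners)

# Use the mlr3 ids: num_decimal_places, batch_size, output_debug_info, ...
learner = lrn("classif.logistic",
  num_decimal_places = 4,
  batch_size = 50
)

# Inspect the values that were set
learner$param_set$values
```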

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.logistic")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, RWeka

Parameters

Id                         Type     Default  Levels       Range
subset                     untyped  -        -            -
na.action                  untyped  -        -            -
C                          logical  FALSE    TRUE, FALSE  -
R                          numeric  -        -            \((-\infty, \infty)\)
M                          integer  -1       -            \((-\infty, \infty)\)
output_debug_info          logical  FALSE    TRUE, FALSE  -
do_not_check_capabilities  logical  FALSE    TRUE, FALSE  -
num_decimal_places         integer  2        -            \([1, \infty)\)
batch_size                 integer  100      -            \([1, \infty)\)
options                    untyped  NULL     -            -
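A sketch of tuning the two parameters most specific to this learner: R (the ridge penalty, corresponding to Weka's -R flag) and M (the maximum number of iterations, where the default -1 iterates until convergence). The particular values here are illustrative, not recommendations:

```r
library(mlr3)
library(mlr3extralearners)

# Increase the ridge penalty and cap the number of iterations
learner = lrn("classif.logistic", R = 1e-4, M = 200)

# Parameters can also be changed after construction
learner$param_set$set_values(M = 500)
learner$param_set$values
```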

References

le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.

See also

Author

damirpolat

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic

Active bindings

marshaled

(logical(1))
Whether the learner has been marshaled.

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.


Method marshal()

Marshal the learner's model.

Usage

LearnerClassifLogistic$marshal(...)

Arguments

...

(any)
Additional arguments passed to mlr3::marshal_model().


Method unmarshal()

Unmarshal the learner's model.

Usage

LearnerClassifLogistic$unmarshal(...)

Arguments

...

(any)
Additional arguments passed to mlr3::unmarshal_model().
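Because the fitted model wraps a Java object from RWeka, it must be marshaled into a serializable form before it can be saved to disk or sent to another R process (e.g. during parallel resampling). A minimal sketch of the round trip:

```r
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
learner = lrn("classif.logistic")
learner$train(task)

# Marshal before serializing; the active binding reports the state
learner$marshal()
stopifnot(learner$marshaled)

# Unmarshal to restore a usable model before predicting again
learner$unmarshal()
stopifnot(!learner$marshaled)
```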


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLogistic$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("classif.logistic")
print(learner)
#> 
#> ── <LearnerClassifLogistic> (classif.logistic): Multinomial Logistic Regression 
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and RWeka
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: marshal, missings, multiclass, and twoclass
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                        1059.0288
#> V10                       -108.7145
#> V11                        181.1356
#> V12                         98.8765
#> V13                         12.0521
#> V14                         114.772
#> V15                         95.4795
#> V16                       -194.6353
#> V17                         -0.4462
#> V18                        -23.9461
#> V19                           9.041
#> V2                          307.259
#> V20                         90.1667
#> V21                        -75.4513
#> V22                         69.8777
#> V23                        -12.3094
#> V24                         93.5123
#> V25                        -33.2063
#> V26                       -153.2656
#> V27                         187.776
#> V28                       -136.9516
#> V29                          3.9929
#> V3                       -1621.5706
#> V30                        250.2559
#> V31                        -329.786
#> V32                        187.4445
#> V33                        -15.3184
#> V34                       -113.9027
#> V35                        183.0809
#> V36                       -249.6733
#> V37                        -55.8583
#> V38                        171.2915
#> V39                        -75.8105
#> V4                         563.0691
#> V40                        -44.9666
#> V41                         251.302
#> V42                       -323.4047
#> V43                         56.6058
#> V44                          74.687
#> V45                        153.7397
#> V46                        -94.0778
#> V47                         87.3742
#> V48                        353.9604
#> V49                        801.2069
#> V5                        -166.4687
#> V50                      -2371.1235
#> V51                       -1877.373
#> V52                       1712.8352
#> V53                        311.4261
#> V54                       1795.5182
#> V55                      -1669.4541
#> V56                      -2665.2051
#> V57                        603.3326
#> V58                       1572.3711
#> V59                        887.7586
#> V6                          220.291
#> V60                       1143.8062
#> V7                        -288.8639
#> V8                        -238.0804
#> V9                          51.8931
#> Intercept                  -61.0794
#> 
#> 
#> Odds Ratios...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                         Infinity
#> V10                               0
#> V11            4.636727337684007E78
#> V12            8.739856012031617E42
#> V13                     171453.8082
#> V14             6.99604722008276E49
#> V15           2.9256424875544552E41
#> V16                               0
#> V17                          0.6401
#> V18                               0
#> V19                       8442.0536
#> V2            2.759941835344625E133
#> V20            1.441717160836593E39
#> V21                               0
#> V22           2.2257992505656803E30
#> V23                               0
#> V24           4.0915816772801257E40
#> V25                               0
#> V26                               0
#> V27           3.5487586462194502E81
#> V28                               0
#> V29                         54.2144
#> V3                                0
#> V30           4.838967435832729E108
#> V31                               0
#> V32            2.547363355652521E81
#> V33                               0
#> V34                               0
#> V35            3.243494393828782E79
#> V36                               0
#> V37                               0
#> V38           2.4600806376511764E74
#> V39                               0
#> V4             3.44984480955897E244
#> V40                               0
#> V41          1.3774063977737415E109
#> V42                               0
#> V43           3.8334013656729557E24
#> V44           2.7300295789076975E32
#> V45           5.8654594549147434E66
#> V46                               0
#> V47            8.833444557738186E37
#> V48          5.2847883764901934E153
#> V49                        Infinity
#> V5                                0
#> V50                               0
#> V51                               0
#> V52                        Infinity
#> V53           1.780918574973534E135
#> V54                        Infinity
#> V55                               0
#> V56                               0
#> V57          1.0568290416609578E262
#> V58                        Infinity
#> V59                        Infinity
#> V6            4.6898328462486743E95
#> V60                        Infinity
#> V7                                0
#> V8                                0
#> V9              3.44256245497909E22
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.3333333
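Since the learner supports the "prob" predict type, class probabilities can be requested instead of hard labels. A sketch continuing the example above (task and ids as defined there):

```r
# Request probability predictions instead of response labels
learner_prob = lrn("classif.logistic", predict_type = "prob")
learner_prob$train(task, row_ids = ids$train)

# One column per class level ("M", "R" for the sonar task)
head(learner_prob$predict(task, row_ids = ids$test)$prob)
```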