Multinomial Logistic Regression model with a ridge estimator. Calls RWeka::Logistic() from RWeka.

Custom mlr3 parameters

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size

  • Reason for change: the ids of the control arguments above were changed because their original Weka ids contain hyphens, which are not valid in R parameter names.
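As a minimal sketch of the renaming, the parameters are set with their mlr3-style ids (underscores), not Weka's hyphenated originals; the particular values below are arbitrary illustrations. Assumes mlr3extralearners and RWeka (with a working Java installation) are available:

```r
library(mlr3)
library(mlr3extralearners)  # provides classif.logistic via RWeka

# Use the mlr3 ids; Weka's original ids are shown in the comments
learner = lrn("classif.logistic",
  num_decimal_places = 4,  # original Weka id: num-decimal-places
  batch_size = 50          # original Weka id: batch-size
)
learner$param_set$values
```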

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.logistic")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, RWeka

Parameters

Id                         Type     Default  Levels       Range
subset                     untyped  -        -            -
na.action                  untyped  -        -            -
C                          logical  FALSE    TRUE, FALSE  -
R                          numeric  -        -            \((-\infty, \infty)\)
M                          integer  -1       -            \((-\infty, \infty)\)
output_debug_info          logical  FALSE    TRUE, FALSE  -
do_not_check_capabilities  logical  FALSE    TRUE, FALSE  -
num_decimal_places         integer  2        -            \([1, \infty)\)
batch_size                 integer  100      -            \([1, \infty)\)
options                    untyped  NULL     -            -
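A sketch of passing Weka's own fitting controls through these parameters: per Weka's documentation for its Logistic class, R is the ridge penalty and M the maximum number of iterations, with -1 meaning iterate until convergence (treat those descriptions as assumptions here, verified against Weka rather than this page). Assumes mlr3extralearners and RWeka are installed:

```r
library(mlr3)
library(mlr3extralearners)  # requires RWeka and Java

# R: ridge penalty; M: max iterations (-1 = until convergence)
learner = lrn("classif.logistic", R = 1e-4, M = -1)
learner$param_set$values
```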

References

le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.

Author

damirpolat

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic

Active bindings

marshaled

(logical(1))
Whether the learner has been marshaled.

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifLogistic$new()


Method marshal()

Marshal the learner's model.

Usage

LearnerClassifLogistic$marshal(...)

Arguments

...

(any)
Additional arguments passed to mlr3::marshal_model().


Method unmarshal()

Unmarshal the learner's model.

Usage

LearnerClassifLogistic$unmarshal(...)

Arguments

...

(any)
Additional arguments passed to mlr3::unmarshal_model().
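
A minimal round-trip sketch: marshaling converts the Java-backed Weka model into a serializable form (useful before saveRDS() or before sending the learner to a parallel worker), and unmarshaling restores the working model. Assumes mlr3extralearners and RWeka are installed:

```r
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
learner = lrn("classif.logistic")
learner$train(task)

learner$marshal()    # model becomes serializable
stopifnot(learner$marshaled)

learner$unmarshal()  # restore the working model
stopifnot(!learner$marshaled)
```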


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLogistic$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("classif.logistic")
print(learner)
#> 
#> ── <LearnerClassifLogistic> (classif.logistic): Multinomial Logistic Regression 
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and RWeka
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: marshal, missings, multiclass, and twoclass
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                         754.0393
#> V10                        -34.5664
#> V11                        -46.1694
#> V12                        221.8324
#> V13                        -32.9615
#> V14                        -110.494
#> V15                        342.3106
#> V16                       -323.7676
#> V17                        -38.7115
#> V18                         63.2236
#> V19                         19.0392
#> V2                         374.7173
#> V20                         58.0559
#> V21                       -131.5702
#> V22                        137.6453
#> V23                         -3.4297
#> V24                        -65.9397
#> V25                        169.1659
#> V26                       -197.8882
#> V27                          88.239
#> V28                         20.9816
#> V29                       -137.6524
#> V3                       -1084.0678
#> V30                        228.6977
#> V31                       -206.2764
#> V32                        -46.6927
#> V33                        122.4121
#> V34                        -218.699
#> V35                        201.4707
#> V36                       -158.1521
#> V37                         44.0858
#> V38                         18.8432
#> V39                         61.3163
#> V4                         199.7165
#> V40                       -148.1571
#> V41                         61.2143
#> V42                           -33.8
#> V43                          1.8187
#> V44                        -17.9925
#> V45                        340.5175
#> V46                       -126.0779
#> V47                        244.4778
#> V48                        -26.5862
#> V49                        346.9894
#> V5                         118.8147
#> V50                      -1820.8351
#> V51                       -935.1447
#> V52                       3032.4364
#> V53                        857.2196
#> V54                       1884.3235
#> V55                      -2977.5808
#> V56                       1458.5842
#> V57                        360.6232
#> V58                       1552.1643
#> V59                       -2041.729
#> V6                        -145.4042
#> V60                      -1115.1768
#> V7                         -298.316
#> V8                          49.6141
#> V9                         135.6714
#> Intercept                  -17.2596
#> 
#> 
#> Odds Ratios...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                         Infinity
#> V10                               0
#> V11                               0
#> V12           2.1907191156485557E96
#> V13                               0
#> V14                               0
#> V15          4.6090571992408665E148
#> V16                               0
#> V17                               0
#> V18           2.8685668255476666E27
#> V19                  185615544.2046
#> V2            5.465665112574876E162
#> V20            1.634431610803804E25
#> V21                               0
#> V22            6.006156857939386E59
#> V23                          0.0324
#> V24                               0
#> V25            2.936354419558556E73
#> V26                               0
#> V27           2.0975965191445965E38
#> V28                 1294779202.2421
#> V29                               0
#> V3                                0
#> V30           2.0996305741010197E99
#> V31                               0
#> V32                               0
#> V33           1.4550853541234666E53
#> V34                               0
#> V35            3.145066679670405E87
#> V36                               0
#> V37            1.400363381072458E19
#> V38                  152582338.7419
#> V39            4.259337509876384E26
#> V4             5.442185149792072E86
#> V40                               0
#> V41           3.8462075129166284E26
#> V42                               0
#> V43                          6.1639
#> V44                               0
#> V45            7.67138313993939E147
#> V46                               0
#> V47          1.4975401344004717E106
#> V48                               0
#> V49           4.961280686115532E150
#> V5             3.986298643514246E51
#> V50                               0
#> V51                               0
#> V52                        Infinity
#> V53                        Infinity
#> V54                        Infinity
#> V55                               0
#> V56                        Infinity
#> V57           4.136848124632892E156
#> V58                        Infinity
#> V59                               0
#> V6                                0
#> V60                               0
#> V7                                0
#> V8               3.5247699852006E21
#> V9             8.343038490551579E58
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.3188406
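
Since the learner supports the "prob" predict type, class probabilities can be requested instead of hard labels, and the predictions scored with a different measure; a self-contained sketch (the accuracy measure `classif.acc` is standard mlr3, not specific to this learner):

```r
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
ids = partition(task)

# Request probability predictions instead of hard labels
learner = lrn("classif.logistic", predict_type = "prob")
learner$train(task, row_ids = ids$train)

predictions = learner$predict(task, row_ids = ids$test)
head(predictions$prob)                 # per-class probabilities
predictions$score(msr("classif.acc"))  # accuracy instead of error
```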