
Multinomial Logistic Regression model with a ridge estimator. Calls RWeka::Logistic() from package RWeka.

Custom mlr3 parameters

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size

  • Reason for change: the ids of these control arguments were changed because their original ids contain an irregular pattern (hyphens), which is not valid in mlr3 parameter names.

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.logistic")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, RWeka
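Since "prob" is among the supported predict types, class probabilities can be requested at construction time. A minimal sketch, assuming mlr3extralearners (which provides this learner) plus RWeka and a Java runtime are installed:

```r
library(mlr3)
library(mlr3extralearners)  # provides classif.logistic; needs RWeka and Java

learner = lrn("classif.logistic", predict_type = "prob")
task = tsk("sonar")
split = partition(task)

learner$train(task, row_ids = split$train)
prediction = learner$predict(task, row_ids = split$test)
head(prediction$prob)  # one probability column per class
```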

Parameters

Id                          Type      Default   Levels        Range
subset                      untyped   -                       -
na.action                   untyped   -                       -
C                           logical   FALSE     TRUE, FALSE   -
R                           numeric   -                       \((-\infty, \infty)\)
M                           integer   -1                      \((-\infty, \infty)\)
output_debug_info           logical   FALSE     TRUE, FALSE   -
do_not_check_capabilities   logical   FALSE     TRUE, FALSE   -
num_decimal_places          integer   2                       \([1, \infty)\)
batch_size                  integer   100                     \([1, \infty)\)
options                     untyped   NULL                    -
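Hyperparameters from the table above can be passed directly to lrn(). A minimal sketch, assuming mlr3extralearners (which provides this learner) plus RWeka and a Java runtime are installed; the values below are purely illustrative, not recommendations:

```r
library(mlr3)
library(mlr3extralearners)  # provides classif.logistic; needs RWeka and Java

learner = lrn("classif.logistic",
  R = 1e-4,        # ridge penalty (illustrative value, not a recommendation)
  M = 500,         # maximum number of iterations (-1 = run until convergence)
  batch_size = 50  # preferred batch size for batch prediction
)
learner$param_set$values
```

The renamed custom parameters (e.g. batch_size for Weka's batch-size) are set the same way as any other hyperparameter.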

References

le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.

See also

Author

damirpolat

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifLogistic$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLogistic$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = mlr3::lrn("classif.logistic")
print(learner)
#> <LearnerClassifLogistic:classif.logistic>: Multinomial Logistic Regression
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, RWeka
#> * Predict Types:  [response], prob
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: missings, multiclass, twoclass

# Define a Task
task = mlr3::tsk("sonar")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                        -230.2221
#> V10                        -135.151
#> V11                          48.824
#> V12                        417.2917
#> V13                       -152.1915
#> V14                        182.7414
#> V15                       -169.1868
#> V16                         68.2488
#> V17                        -92.1824
#> V18                         45.0013
#> V19                        -65.0617
#> V2                        1457.7774
#> V20                         77.7137
#> V21                       -137.3261
#> V22                        156.4719
#> V23                         57.7409
#> V24                         66.5773
#> V25                         25.4364
#> V26                       -229.8009
#> V27                        104.1531
#> V28                          6.5144
#> V29                        -98.2244
#> V3                        -879.1599
#> V30                        354.8184
#> V31                       -438.4708
#> V32                        171.8025
#> V33                         84.6273
#> V34                       -300.9054
#> V35                        250.9485
#> V36                         17.2276
#> V37                       -245.9505
#> V38                         83.1282
#> V39                        -25.1224
#> V4                         200.4187
#> V40                         -1.3483
#> V41                        -64.2081
#> V42                       -104.7823
#> V43                         71.2164
#> V44                         39.5982
#> V45                         -22.403
#> V46                        -30.6689
#> V47                        602.3224
#> V48                        -20.3071
#> V49                        369.1374
#> V5                        -175.6671
#> V50                      -2120.9226
#> V51                        817.6114
#> V52                        891.4774
#> V53                       2657.2073
#> V54                       -451.4863
#> V55                         50.0191
#> V56                      -1322.0291
#> V57                       -409.7399
#> V58                         215.111
#> V59                       1783.5366
#> V6                         241.9771
#> V60                        1101.155
#> V7                        -265.1094
#> V8                         -343.138
#> V9                         332.3405
#> Intercept                  -102.581
#> 
#> 
#> Odds Ratios...
#>                               Class
#> Variable                          M
#> ===================================
#> V1                                0
#> V10                               0
#> V11           1.5995043727542203E21
#> V12          1.6883624222399055E181
#> V13                               0
#> V14           2.3099139338446317E79
#> V15                               0
#> V16            4.365833470820818E29
#> V17                               0
#> V18            3.498065809738885E19
#> V19                               0
#> V2                         Infinity
#> V20            5.631750650328867E33
#> V21                               0
#> V22            9.013669133223818E67
#> V23           1.1927850060905918E25
#> V24            8.206093358480686E28
#> V25           1.1140144013306358E11
#> V26                               0
#> V27           1.7104669322396844E45
#> V28                        674.8011
#> V29                               0
#> V3                                0
#> V30          1.2464098212107336E154
#> V31                               0
#> V32           4.1009983204527753E74
#> V33            5.664794378502789E36
#> V34                               0
#> V35           9.672783471834669E108
#> V36                   30329599.8345
#> V37                               0
#> V38           1.2650266291222363E36
#> V39                               0
#> V4            1.0983025008485249E87
#> V40                          0.2597
#> V41                               0
#> V42                               0
#> V43            8.490009106706466E30
#> V44          1.57506850820602656E17
#> V45                               0
#> V46                               0
#> V47          3.8486874692045793E261
#> V48                               0
#> V49          2.0623265929459407E160
#> V5                                0
#> V50                               0
#> V51                        Infinity
#> V52                        Infinity
#> V53                        Infinity
#> V54                               0
#> V55             5.28462809468674E21
#> V56                               0
#> V57                               0
#> V58           2.6395170780400906E93
#> V59                        Infinity
#> V6           1.2283710795012565E105
#> V60                        Infinity
#> V7                                0
#> V8                                0
#> V9           2.1559485542623307E144
#> 


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2318841
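Beyond a single train/test split, the learner can be evaluated by resampling. A sketch using 3-fold cross-validation, under the same package assumptions as the example above (mlr3extralearners, RWeka, Java):

```r
library(mlr3)
library(mlr3extralearners)  # provides classif.logistic; needs RWeka and Java

task = tsk("sonar")
learner = lrn("classif.logistic", predict_type = "prob")

# 3-fold cross-validation; aggregate error rate and AUC over the folds
rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msrs(c("classif.ce", "classif.auc")))
```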