Classification Logistic Regression Learner
Source: R/learner_RWeka_classif_logistic.R
mlr_learners_classif.logistic.Rd
Multinomial Logistic Regression model with a ridge estimator.
Calls RWeka::Logistic() from RWeka.
Custom mlr3 parameters
output_debug_info: original id output-debug-info
do_not_check_capabilities: original id do-not-check-capabilities
num_decimal_places: original id num-decimal-places
batch_size: original id batch-size
Reason for change: the ids of these control arguments were changed because the original ids contain an irregular pattern (hyphens, which are not valid in mlr3 parameter ids).
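The renamed ids are used directly when constructing the learner. A minimal sketch (assumes mlr3 and mlr3extralearners are installed and loadable):

```r
library(mlr3)
library(mlr3extralearners)  # provides classif.logistic

# Use the mlr3 ids (with underscores), not RWeka's hyphenated originals
learner = lrn("classif.logistic",
  output_debug_info = TRUE,
  num_decimal_places = 4
)
```

The values are translated back to the hyphenated Weka control arguments internally when `RWeka::Logistic()` is called.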
Parameters
| Id | Type | Default | Levels | Range |
|----|------|---------|--------|-------|
| subset | untyped | - | - | - |
| na.action | untyped | - | - | - |
| C | logical | FALSE | TRUE, FALSE | - |
| R | numeric | - | - | \((-\infty, \infty)\) |
| M | integer | -1 | - | \((-\infty, \infty)\) |
| output_debug_info | logical | FALSE | TRUE, FALSE | - |
| do_not_check_capabilities | logical | FALSE | TRUE, FALSE | - |
| num_decimal_places | integer | 2 | - | \([1, \infty)\) |
| batch_size | integer | 100 | - | \([1, \infty)\) |
| options | untyped | NULL | - | - |
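The key model parameters can be set at construction. In Weka's Logistic, R is the ridge value in the log-likelihood and M caps the number of iterations (-1 means iterate until convergence); a brief sketch, assuming mlr3 and mlr3extralearners are available:

```r
library(mlr3)
library(mlr3extralearners)

# R is the ridge penalty; M limits the number of iterations
learner = lrn("classif.logistic", R = 1e-4, M = 200)

# Inspect the configured hyperparameter values
learner$param_set$values
```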
References
le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic
Methods
Inherited methods
mlr3::Learner$base_learner(), mlr3::Learner$configure(), mlr3::Learner$encapsulate(), mlr3::Learner$format(), mlr3::Learner$help(), mlr3::Learner$predict(), mlr3::Learner$predict_newdata(), mlr3::Learner$print(), mlr3::Learner$reset(), mlr3::Learner$selected_features(), mlr3::Learner$train(), mlr3::LearnerClassif$predict_newdata_fast()
Method marshal()
Marshal the learner's model.
Arguments
... (any)
Additional arguments passed to mlr3::marshal_model().
Method unmarshal()
Unmarshal the learner's model.
Arguments
... (any)
Additional arguments passed to mlr3::unmarshal_model().
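A sketch of how these methods fit together: because the fitted RWeka model wraps a Java object reference, it must be marshaled before serialization (e.g. with saveRDS() or when used with parallelization backends) and unmarshaled before further use. This assumes a learner that has already been trained:

```r
library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.logistic")
learner$train(tsk("sonar"))

learner$marshal()    # convert the model into a serializable form
learner$unmarshal()  # restore the RWeka model so it can predict again
```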
Examples
# Define the Learner
learner = lrn("classif.logistic")
print(learner)
#>
#> ── <LearnerClassifLogistic> (classif.logistic): Multinomial Logistic Regression
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and RWeka
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: marshal, missings, multiclass, and twoclass
#> • Other settings: use_weights = 'error', predict_raw = 'FALSE'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#> Class
#> Variable M
#> ===================================
#> V1 2067.7019
#> V10 -291.2326
#> V11 209.7573
#> V12 245.9979
#> V13 -9.2206
#> V14 -159.0966
#> V15 65.5133
#> V16 28.1285
#> V17 -197.4535
#> V18 82.9586
#> V19 -21.4458
#> V2 -1920.8563
#> V20 118.5339
#> V21 -46.7056
#> V22 102.2084
#> V23 92.066
#> V24 -156.0515
#> V25 -61.8203
#> V26 104.1321
#> V27 -35.69
#> V28 33.5919
#> V29 -8.0104
#> V3 -246.7515
#> V30 111.6377
#> V31 -375.882
#> V32 356.7225
#> V33 -130.038
#> V34 -19.8856
#> V35 110.7636
#> V36 -80.2201
#> V37 -121.397
#> V38 -25.1399
#> V39 236.6145
#> V4 -180.5716
#> V40 -198.7167
#> V41 227.1095
#> V42 -153.4725
#> V43 64.5606
#> V44 -2.9194
#> V45 -103.8519
#> V46 99.9295
#> V47 3.8511
#> V48 482.0341
#> V49 859.8417
#> V5 125.579
#> V50 -3845.968
#> V51 3035.6374
#> V52 2453.5904
#> V53 476.6862
#> V54 -2277.5747
#> V55 796.7569
#> V56 -3477.9445
#> V57 367.5416
#> V58 2024.5115
#> V59 892.5338
#> V6 238.6971
#> V60 95.9574
#> V7 -506.9291
#> V8 31.0192
#> V9 165.3635
#> Intercept -79.1692
#>
#>
#> Odds Ratios...
#> Class
#> Variable M
#> ===================================
#> V1 Infinity
#> V10 0
#> V11 1.2486034808277316E91
#> V12 6.847330124487915E106
#> V13 0.0001
#> V14 0
#> V15 2.831672173542526E28
#> V16 1.6446473314763813E12
#> V17 0
#> V18 1.0677145466436496E36
#> V19 0
#> V2 0
#> V20 3.0104322694273176E51
#> V21 0
#> V22 2.446594874404362E44
#> V23 9.632659072382424E39
#> V24 0
#> V25 0
#> V26 1.6748511761098056E45
#> V27 0
#> V28 3.8793557075785206E14
#> V29 0.0003
#> V3 0
#> V30 3.045293863402155E48
#> V31 0
#> V32 8.367861302207982E154
#> V33 0
#> V34 0
#> V35 1.2706329224693956E48
#> V36 0
#> V37 0
#> V38 0
#> V39 5.759224034674328E102
#> V4 0
#> V40 0
#> V41 4.2894618752604016E98
#> V42 0
#> V43 1.0921874854389412E28
#> V44 0.054
#> V45 0
#> V46 2.50515993463591E43
#> V47 47.043
#> V48 2.211855504657181E209
#> V49 Infinity
#> V5 3.453669345675381E54
#> V50 0
#> V51 Infinity
#> V52 Infinity
#> V53 1.0524307841516362E207
#> V54 0
#> V55 Infinity
#> V56 0
#> V57 4.18114719017909E159
#> V58 Infinity
#> V59 Infinity
#> V6 4.6221540400152805E103
#> V60 4.71789372262155E41
#> V7 0
#> V8 2.961076463776754E13
#> V9 6.553155724722379E71
#>
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.3043478