Classification Logistic Regression Learner
Source: R/learner_RWeka_classif_logistic.R
mlr_learners_classif.logistic.Rd

Multinomial Logistic Regression model with a ridge estimator.
Calls RWeka::Logistic() from RWeka.
Custom mlr3 parameters
output_debug_info: original id output-debug-info
do_not_check_capabilities: original id do-not-check-capabilities
num_decimal_places: original id num-decimal-places
batch_size: original id batch-size
Reason for change: The ids of these control arguments were changed because their original ids contain hyphens, an irregular pattern for mlr3 parameter names.
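For illustration, a minimal sketch of constructing the learner with the renamed parameters (the values are arbitrary examples, assuming mlr3extralearners and RWeka are installed):

library(mlr3)
library(mlr3extralearners)

# the renamed ids are used on the mlr3 side and mapped back to
# Weka's hyphenated control arguments internally
learner = lrn("classif.logistic",
  output_debug_info = FALSE,
  num_decimal_places = 4,
  batch_size = 100
)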
Parameters
| Id | Type | Default | Levels | Range |
| --- | --- | --- | --- | --- |
| subset | untyped | - | - | - |
| na.action | untyped | - | - | - |
| C | logical | FALSE | TRUE, FALSE | - |
| R | numeric | - | - | \((-\infty, \infty)\) |
| M | integer | -1 | - | \((-\infty, \infty)\) |
| output_debug_info | logical | FALSE | TRUE, FALSE | - |
| do_not_check_capabilities | logical | FALSE | TRUE, FALSE | - |
| num_decimal_places | integer | 2 | - | \([1, \infty)\) |
| batch_size | integer | 100 | - | \([1, \infty)\) |
| options | untyped | NULL | - | - |
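As an illustrative sketch (values chosen arbitrarily), the ridge parameter R and the maximum number of iterations M can be set at construction or afterwards via the parameter set (assuming a paradox version that provides set_values()):

learner = lrn("classif.logistic", R = 1e-4)
# or after construction
learner$param_set$set_values(M = 500)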
References
le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic
Methods
Inherited methods
mlr3::Learner$base_learner()
mlr3::Learner$configure()
mlr3::Learner$encapsulate()
mlr3::Learner$format()
mlr3::Learner$help()
mlr3::Learner$predict()
mlr3::Learner$predict_newdata()
mlr3::Learner$print()
mlr3::Learner$reset()
mlr3::Learner$selected_features()
mlr3::Learner$train()
mlr3::LearnerClassif$predict_newdata_fast()
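For instance, the inherited predict_newdata() can be used to predict on a plain data.frame after training; a minimal sketch, assuming the learner has already been trained on the sonar task shown in the Examples below:

# take a few rows of the task's feature data as "new" observations
newdata = task$data(rows = 1:5, cols = task$feature_names)
learner$predict_newdata(newdata)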
Method marshal()
Marshal the learner's model.
Arguments
... (any)
Additional arguments passed to mlr3::marshal_model().
Method unmarshal()
Unmarshal the learner's model.
Arguments
... (any)
Additional arguments passed to mlr3::unmarshal_model().
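A minimal sketch of how marshaling is typically used, e.g. to make the fitted Weka model serializable before writing it to disk (the file name is illustrative):

learner$train(task)
learner$marshal()                  # convert the model into a serializable form
saveRDS(learner, "logistic.rds")   # illustrative file name
learner = readRDS("logistic.rds")
learner$unmarshal()                # restore the model before predicting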
Examples
# Define the Learner
learner = lrn("classif.logistic")
print(learner)
#>
#> ── <LearnerClassifLogistic> (classif.logistic): Multinomial Logistic Regression
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and RWeka
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: marshal, missings, multiclass, and twoclass
#> • Other settings: use_weights = 'error'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#> Class
#> Variable M
#> ===================================
#> V1 1232.0287
#> V10 25.4029
#> V11 -159.9746
#> V12 323.0673
#> V13 110.8439
#> V14 -325.7168
#> V15 130.9279
#> V16 -199.6705
#> V17 63.5731
#> V18 -69.1805
#> V19 -43.4496
#> V2 -18.3807
#> V20 208.7359
#> V21 -113.8461
#> V22 -33.0447
#> V23 179.0744
#> V24 -17.9253
#> V25 -86.8181
#> V26 45.7317
#> V27 -66.5662
#> V28 -40.8248
#> V29 87.7109
#> V3 -988.5348
#> V30 139.3134
#> V31 -145.1779
#> V32 21.7819
#> V33 55.2479
#> V34 -160.2003
#> V35 78.8094
#> V36 -56.7109
#> V37 -94.7205
#> V38 91.5543
#> V39 -56.092
#> V4 391.5958
#> V40 88.7719
#> V41 12.1419
#> V42 31.2454
#> V43 -187.948
#> V44 217.3965
#> V45 -268.6784
#> V46 335.6118
#> V47 23.2577
#> V48 -284.3528
#> V49 1018.2124
#> V5 77.0269
#> V50 -2154.4302
#> V51 737.3331
#> V52 3632.4399
#> V53 -1136.3167
#> V54 2153.7548
#> V55 -3073.4461
#> V56 2651.9097
#> V57 -1218.4665
#> V58 2026.2734
#> V59 -992.828
#> V6 498.6106
#> V60 1642.1887
#> V7 -257.2177
#> V8 -356.2561
#> V9 244.9068
#> Intercept -106.5824
#>
#>
#> Odds Ratios...
#> Class
#> Variable M
#> ===================================
#> V1 Infinity
#> V10 1.0773542817310812E11
#> V11 0
#> V12 2.0247156216558968E140
#> V13 1.3768242217213094E48
#> V14 0
#> V15 7.265724981295653E56
#> V16 0
#> V17 4.0687256932123374E27
#> V18 0
#> V19 0
#> V2 0
#> V20 4.496057264855042E90
#> V21 0
#> V22 0
#> V23 5.9024812821178E77
#> V24 0
#> V25 0
#> V26 7.261176171070131E19
#> V27 0
#> V28 0
#> V29 1.2370259230365187E38
#> V3 0
#> V30 3.1844163551378414E60
#> V31 0
#> V32 2882555925.4922
#> V33 9.859223361324271E23
#> V34 0
#> V35 1.6844883791558554E34
#> V36 0
#> V37 0
#> V38 5.774760231551128E39
#> V39 0
#> V4 1.1692001010925913E170
#> V40 3.574032160314959E38
#> V41 187565.7155
#> V42 3.712809884183518E13
#> V43 0
#> V44 2.594821192841296E94
#> V45 0
#> V46 5.68008799586112E145
#> V47 1.260902901354616E10
#> V48 0
#> V49 Infinity
#> V5 2.833863163825333E33
#> V50 0
#> V51 Infinity
#> V52 Infinity
#> V53 0
#> V54 Infinity
#> V55 0
#> V56 Infinity
#> V57 0
#> V58 Infinity
#> V59 0
#> V6 3.498079157194344E216
#> V60 Infinity
#> V7 0
#> V8 0
#> V9 2.2997469478028706E106
#>
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.2318841
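As a further illustrative sketch (not part of the generated output above), the learner can also be evaluated with resampling; the 3-fold cross-validation is an arbitrary choice:

# Evaluate with 3-fold cross-validation (illustrative)
rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.ce"))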