Classification Logistic Regression Learner

Source: R/learner_RWeka_classif_logistic.R

mlr_learners_classif.logistic.Rd

Multinomial Logistic Regression model with a ridge estimator.
Calls RWeka::Logistic() from RWeka.
Custom mlr3 parameters

- output_debug_info: original id: output-debug-info
- do_not_check_capabilities: original id: do-not-check-capabilities
- num_decimal_places: original id: num-decimal-places
- batch_size: original id: batch-size

Reason for change: the ids of these control arguments were changed because their original ids contain irregular patterns (dashes instead of underscores).
Parameters
| Id | Type | Default | Levels | Range |
|---|---|---|---|---|
| subset | untyped | - | - | - |
| na.action | untyped | - | - | - |
| C | logical | FALSE | TRUE, FALSE | - |
| R | numeric | - | - | \((-\infty, \infty)\) |
| M | integer | -1 | - | \((-\infty, \infty)\) |
| output_debug_info | logical | FALSE | TRUE, FALSE | - |
| do_not_check_capabilities | logical | FALSE | TRUE, FALSE | - |
| num_decimal_places | integer | 2 | - | \([1, \infty)\) |
| batch_size | integer | 100 | - | \([1, \infty)\) |
| options | untyped | NULL | - | - |
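Hyperparameters from the table can be set at construction time. A minimal sketch, assuming the learner is provided by mlr3extralearners and the RWeka package is installed; the value of `R` below is an arbitrary illustration, not a recommended default:

```r
library(mlr3)
library(mlr3extralearners)

# Construct the learner with a custom ridge parameter and
# output precision. Note that num_decimal_places uses the
# renamed mlr3 id, not Weka's original num-decimal-places.
learner = lrn("classif.logistic",
  R = 1e-4,               # ridge value for the log-likelihood (illustrative)
  num_decimal_places = 4  # decimals shown in the printed model
)

# Inspect the parameter values that were set
learner$param_set$values
```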
References
le Cessie, S., van Houwelingen, J.C. (1992). “Ridge Estimators in Logistic Regression.” Applied Statistics, 41(1), 191-201.
See also
- as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
- Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
- mlr3learners for a selection of recommended learners.
- mlr3cluster for unsupervised clustering learners.
- mlr3pipelines to combine learners with pre- and postprocessing steps.
- mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogistic
Methods
Inherited methods
- mlr3::Learner$base_learner()
- mlr3::Learner$configure()
- mlr3::Learner$encapsulate()
- mlr3::Learner$format()
- mlr3::Learner$help()
- mlr3::Learner$predict()
- mlr3::Learner$predict_newdata()
- mlr3::Learner$print()
- mlr3::Learner$reset()
- mlr3::Learner$selected_features()
- mlr3::Learner$train()
- mlr3::LearnerClassif$predict_newdata_fast()
Method marshal()
Marshal the learner's model.
Arguments
... (any)
Additional arguments passed to mlr3::marshal_model().
Method unmarshal()
Unmarshal the learner's model.
Arguments
... (any)
Additional arguments passed to mlr3::unmarshal_model().
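Because the fitted Weka model holds references that do not survive ordinary R serialization, the model should be marshaled before saving and unmarshaled after loading. A minimal sketch, assuming `learner` has already been trained on a task:

```r
# Marshal the trained model so it can be safely serialized
learner$marshal()
saveRDS(learner, "logistic_learner.rds")

# Later: restore the learner and unmarshal before predicting again
learner = readRDS("logistic_learner.rds")
learner$unmarshal()
```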
Examples
# Define the Learner
learner = lrn("classif.logistic")
print(learner)
#>
#> ── <LearnerClassifLogistic> (classif.logistic): Multinomial Logistic Regression
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and RWeka
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: marshal, missings, multiclass, and twoclass
#> • Other settings: use_weights = 'error', predict_raw = 'FALSE'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> Logistic Regression with ridge parameter of 1.0E-8
#> Coefficients...
#> Class
#> Variable M
#> ===================================
#> V1 -15.0173
#> V10 35.3765
#> V11 59.4036
#> V12 157.4884
#> V13 83.4928
#> V14 -29.8788
#> V15 93.7312
#> V16 -248.6971
#> V17 100.6152
#> V18 -86.0169
#> V19 25.4271
#> V2 277.1969
#> V20 -34.2741
#> V21 156.9219
#> V22 -67.2525
#> V23 22.1292
#> V24 207.9149
#> V25 -221.1716
#> V26 -78.4167
#> V27 156.4099
#> V28 -40.9321
#> V29 -151.8363
#> V3 -863.6913
#> V30 262.4607
#> V31 -182.6415
#> V32 36.9662
#> V33 23.8477
#> V34 9.558
#> V35 -62.8719
#> V36 38.3125
#> V37 -170.8255
#> V38 32.7974
#> V39 132.6521
#> V4 48.2027
#> V40 -194.6409
#> V41 293.1782
#> V42 -344.6577
#> V43 143.1297
#> V44 120.6617
#> V45 -143.7411
#> V46 303.5421
#> V47 91.6424
#> V48 221.8321
#> V49 522.5231
#> V5 235.6034
#> V50 -3657.6755
#> V51 1834.2913
#> V52 1050.0561
#> V53 1674.7927
#> V54 -1115.9107
#> V55 -2391.1744
#> V56 1411.501
#> V57 -1972.4708
#> V58 2232.4711
#> V59 -52.846
#> V6 251.2167
#> V60 99.2253
#> V7 -457.5583
#> V8 164.9278
#> V9 -97.8668
#> Intercept -56.9787
#>
#>
#> Odds Ratios...
#> Class
#> Variable M
#> ===================================
#> V1 0
#> V10 2.3110785144760005E15
#> V11 6.290143393908881E25
#> V12 2.4907567062076115E68
#> V13 1.8215564740024585E36
#> V14 0
#> V15 5.092463832246585E40
#> V16 0
#> V17 4.973090236934481E43
#> V18 0
#> V19 1.1036659509247987E11
#> V2 2.4271863398688175E120
#> V20 0
#> V21 1.4135759740952378E68
#> V22 0
#> V23 4079444345.5893
#> V24 1.978392336313887E90
#> V25 0
#> V26 0
#> V27 8.471364227565564E67
#> V28 0
#> V29 0
#> V3 0
#> V30 9.665524590858832E113
#> V31 0
#> V32 1.1329556204673134E16
#> V33 2.274657936719465E10
#> V34 14157.7733
#> V35 0
#> V36 4.3543362134123736E16
#> V37 0
#> V38 1.7528353295965162E14
#> V39 4.074647311372766E57
#> V4 8.593516656936685E20
#> V40 0
#> V41 2.11682355935115E127
#> V42 0
#> V43 1.4468572634048977E62
#> V44 2.5276588635717647E52
#> V45 0
#> V46 6.709177313171527E131
#> V47 6.30671338480046E39
#> V48 2.189987896283361E96
#> V49 8.489832503542987E226
#> V5 2.095264947190743E102
#> V50 0
#> V51 Infinity
#> V52 Infinity
#> V53 Infinity
#> V54 0
#> V55 0
#> V56 Infinity
#> V57 0
#> V58 Infinity
#> V59 0
#> V6 1.2648547608169624E109
#> V60 1.2388147829819234E43
#> V7 0
#> V8 4.2389128981265793E71
#> V9 0
#>
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.2318841
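The default classification measure is the classification error (classif.ce). Other measures can be passed explicitly; a short sketch, reusing the `predictions` object from the example above:

```r
# Score the same predictions with classification accuracy instead
# of the default classification error
predictions$score(msr("classif.acc"))
```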