Adaptive best-subset selection for classification.
Calls abess::abess() from abess.
Parameters
| Id | Type | Default | Levels | Range |
|---|---|---|---|---|
| family | character | - | binomial, multinomial, ordinal | - |
| tune.path | character | sequence | sequence, gsection | - |
| tune.type | character | gic | gic, aic, bic, ebic, cv | - |
| normalize | integer | NULL | - | \((-\infty, \infty)\) |
| support.size | untyped | NULL | - | - |
| c.max | integer | 2 | - | \([1, \infty)\) |
| gs.range | untyped | NULL | - | - |
| lambda | numeric | 0 | - | \([0, \infty)\) |
| always.include | untyped | NULL | - | - |
| group.index | untyped | NULL | - | - |
| init.active.set | untyped | NULL | - | - |
| splicing.type | integer | 2 | - | \([1, 2]\) |
| max.splicing.iter | integer | 20 | - | \([1, \infty)\) |
| screening.num | integer | NULL | - | \([0, \infty)\) |
| important.search | integer | NULL | - | \([0, \infty)\) |
| warm.start | logical | TRUE | TRUE, FALSE | - |
| nfolds | integer | 5 | - | \((-\infty, \infty)\) |
| foldid | untyped | NULL | - | - |
| cov.update | logical | FALSE | TRUE, FALSE | - |
| newton | character | exact | exact, approx | - |
| newton.thresh | numeric | 1e-06 | - | \([0, \infty)\) |
| max.newton.iter | integer | NULL | - | \([1, \infty)\) |
| early.stop | logical | FALSE | TRUE, FALSE | - |
| ic.scale | numeric | 1 | - | \([0, \infty)\) |
| num.threads | integer | 0 | - | \([0, \infty)\) |
| seed | integer | 0 | - | \((-\infty, \infty)\) |
Initial parameter values
- `num.threads`: This parameter is initialized to 1 (default is 0) to avoid conflicts with the mlr3 parallelization.
- `family`: Depends on the task type. If the parameter `family` is `NULL`, it is set to `"binomial"` for binary classification tasks and to `"multinomial"` for multiclass classification problems.
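These initial values can be overridden when constructing the learner. A minimal sketch, assuming mlr3 and mlr3extralearners are loaded and the abess package is installed:

```r
library(mlr3)
library(mlr3extralearners)

# Construct the learner with cross-validation-based tuning of the support size
learner = lrn("classif.abess",
  tune.type = "cv",  # select the support size by cross-validation instead of GIC
  nfolds = 10,       # number of folds used for tuning
  num.threads = 1    # single-threaded, so it does not clash with mlr3 parallelization
)

# Inspect the parameter values that will be passed to abess::abess()
print(learner$param_set$values)
```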
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifAbess
Methods
Inherited methods
Method selected_features()
Extracts the names of the selected features from the model via abess::extract().
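A short sketch of calling this method on a trained learner, assuming mlr3 and mlr3extralearners are loaded:

```r
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
learner = lrn("classif.abess")
learner$train(task)

# Names of the features chosen by best-subset selection
learner$selected_features()
```

The method requires a trained model; calling it before `$train()` raises an error.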
Examples
# Define the Learner
learner = lrn("classif.abess")
print(learner)
#>
#> ── <LearnerClassifAbess> (classif.abess): Fast Best Subset Selection for Classif
#> • Model: -
#> • Parameters: num.threads=1
#> • Packages: mlr3 and abess
#> • Predict Types: [response] and prob
#> • Feature Types: integer and numeric
#> • Encapsulation: none (fallback: -)
#> • Properties: multiclass, selected_features, twoclass, and weights
#> • Other settings: use_weights = 'use'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> Call:
#> abess.default(x = x, y = y, family = "binomial", num.threads = 1L)
#>
#> support.size dev GIC
#> 1 0 96.31508 192.6302
#> 2 1 77.77817 162.0919
#> 3 2 71.52497 156.1211
#> 4 3 64.58873 148.7842
#> 5 4 60.32146 146.7852
#> 6 5 57.22205 147.1220
#> 7 6 54.25590 147.7253
#> 8 7 50.42646 146.6020
#> 9 8 47.21606 146.7168
#> 10 9 44.17528 147.1708
#> 11 10 39.75582 144.8674
#> 12 11 37.07486 146.0411
#> 13 12 35.28734 149.0017
#> 14 13 34.09136 153.1453
#> 15 14 32.93729 157.3727
#> 16 15 31.79613 161.6260
#> 17 16 30.60256 165.7744
#> 18 17 21.79038 154.6857
#> 19 18 18.43008 154.5006
#> 20 19 15.67258 155.5212
#> 21 20 13.77575 158.2631
#> 22 21 12.02102 161.2892
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.2753623
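Since the learner also supports the "prob" predict type, class probabilities can be requested instead of hard labels. A sketch, assuming the same packages as above:

```r
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
learner = lrn("classif.abess", predict_type = "prob")
ids = partition(task)
learner$train(task, row_ids = ids$train)
predictions = learner$predict(task, row_ids = ids$test)

# Per-class probability matrix for the test rows
head(predictions$prob)

# Probability predictions allow probability-based measures such as AUC
predictions$score(msr("classif.auc"))
```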