Adaptive best-subset selection for classification. Calls abess::abess() from package abess.

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.abess")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “integer”, “numeric”

  • Required Packages: mlr3, abess

Parameters

Id                 | Type      | Default  | Levels                         | Range
-------------------|-----------|----------|--------------------------------|----------------------
family             | character | -        | binomial, multinomial, ordinal | -
tune.path          | character | sequence | sequence, gsection             | -
tune.type          | character | gic      | gic, aic, bic, ebic, cv        | -
normalize          | integer   | NULL     | -                              | \((-\infty, \infty)\)
support.size       | untyped   | NULL     | -                              | -
c.max              | integer   | 2        | -                              | \([1, \infty)\)
gs.range           | untyped   | NULL     | -                              | -
lambda             | numeric   | 0        | -                              | \([0, \infty)\)
always.include     | untyped   | NULL     | -                              | -
group.index        | untyped   | NULL     | -                              | -
init.active.set    | untyped   | NULL     | -                              | -
splicing.type      | integer   | 2        | -                              | \([1, 2]\)
max.splicing.iter  | integer   | 20       | -                              | \([1, \infty)\)
screening.num      | integer   | NULL     | -                              | \([0, \infty)\)
important.search   | integer   | NULL     | -                              | \([0, \infty)\)
warm.start         | logical   | TRUE     | TRUE, FALSE                    | -
nfolds             | integer   | 5        | -                              | \((-\infty, \infty)\)
foldid             | untyped   | NULL     | -                              | -
cov.update         | logical   | FALSE    | TRUE, FALSE                    | -
newton             | character | exact    | exact, approx                  | -
newton.thresh      | numeric   | 1e-06    | -                              | \([0, \infty)\)
max.newton.iter    | integer   | NULL     | -                              | \([1, \infty)\)
early.stop         | logical   | FALSE    | TRUE, FALSE                    | -
ic.scale           | numeric   | 1        | -                              | \([0, \infty)\)
num.threads        | integer   | 0        | -                              | \([0, \infty)\)
seed               | integer   | 0        | -                              | \((-\infty, \infty)\)
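
Hyperparameters can be passed to lrn() at construction or changed later through the learner's param_set. A minimal sketch; the specific values are illustrative assumptions, not recommendations:

# set hyperparameters at construction: tune the support size via cross-validation
learner = mlr3::lrn("classif.abess", tune.type = "cv", nfolds = 10)

# change a hyperparameter afterwards through the parameter set
learner$param_set$values$support.size = 5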

Initial parameter values

  • num.threads: This parameter is initialized to 1 (default is 0) to avoid conflicts with the parallelization handled by mlr3.

  • family: Set depending on the task type: if family is NULL, it is set to "binomial" for binary classification tasks and to "multinomial" for multiclass classification tasks. Both initial values can be overridden explicitly, as shown in the sketch below.
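
A minimal sketch of overriding both initial values at construction time (the settings shown are assumptions for illustration):

# override the initialized values explicitly
learner = mlr3::lrn("classif.abess", num.threads = 2, family = "binomial")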

See also

Author

abess-team

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifAbess

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifAbess$new()

Method selected_features()

Extracts the names of the selected features from the model via abess::extract().

Usage

LearnerClassifAbess$selected_features()

Returns

The names of the selected features.
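
A minimal usage sketch, assuming the learner has been trained on a classification task beforehand:

# selected_features() requires a trained model
task = mlr3::tsk("sonar")
learner = mlr3::lrn("classif.abess")
learner$train(task)

# character vector with the names of the features kept by best-subset selection
learner$selected_features()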


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifAbess$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = mlr3::lrn("classif.abess")
print(learner)
#> <LearnerClassifAbess:classif.abess>: Fast Best Subset Selection for Classification
#> * Model: -
#> * Parameters: num.threads=1
#> * Packages: mlr3, abess
#> * Predict Types:  [response], prob
#> * Feature Types: integer, numeric
#> * Properties: multiclass, selected_features, twoclass, weights

# Define a Task
task = mlr3::tsk("sonar")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Call:
#> abess.default(x = x, y = y, family = "binomial", num.threads = 1L)
#> 
#>    support.size      dev      GIC
#> 1             0 96.31508 192.6302
#> 2             1 77.77817 162.0919
#> 3             2 71.52497 156.1211
#> 4             3 64.58873 148.7842
#> 5             4 60.32146 146.7852
#> 6             5 57.22205 147.1220
#> 7             6 54.25590 147.7253
#> 8             7 50.42646 146.6020
#> 9             8 47.21606 146.7168
#> 10            9 44.17528 147.1708
#> 11           10 39.75582 144.8674
#> 12           11 37.07486 146.0411
#> 13           12 35.28734 149.0017
#> 14           13 34.09136 153.1453
#> 15           14 32.93729 157.3727
#> 16           15 31.79613 161.6260
#> 17           16 30.60256 165.7744
#> 18           17 21.79038 154.6857
#> 19           18 18.43008 154.5006
#> 20           19 15.67258 155.5212
#> 21           20 13.77575 158.2631
#> 22           21 12.02102 161.2892


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2753623
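
Probability predictions are also supported. A minimal sketch extending the example above by setting predict_type before training (output omitted):

# request class probabilities instead of hard labels
learner = mlr3::lrn("classif.abess", predict_type = "prob")
learner$train(task, row_ids = ids$train)

# matrix of predicted class probabilities for the test rows
predictions = learner$predict(task, row_ids = ids$test)
head(predictions$prob)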