Adaptive best-subset selection for regression.
Calls abess::abess() from package abess.
Parameters
| Id | Type | Default | Levels | Range |
| --- | --- | --- | --- | --- |
| family | character | gaussian | gaussian, mgaussian, poisson, gamma | - |
| tune.path | character | sequence | sequence, gsection | - |
| tune.type | character | gic | gic, aic, bic, ebic, cv | - |
| normalize | integer | NULL | - | \((-\infty, \infty)\) |
| support.size | untyped | NULL | - | - |
| c.max | integer | 2 | - | \([1, \infty)\) |
| gs.range | untyped | NULL | - | - |
| lambda | numeric | 0 | - | \([0, \infty)\) |
| always.include | untyped | NULL | - | - |
| group.index | untyped | NULL | - | - |
| init.active.set | untyped | NULL | - | - |
| splicing.type | integer | 2 | - | \([1, 2]\) |
| max.splicing.iter | integer | 20 | - | \([1, \infty)\) |
| screening.num | integer | NULL | - | \([0, \infty)\) |
| important.search | integer | NULL | - | \([0, \infty)\) |
| warm.start | logical | TRUE | TRUE, FALSE | - |
| nfolds | integer | 5 | - | \((-\infty, \infty)\) |
| foldid | untyped | NULL | - | - |
| cov.update | logical | FALSE | TRUE, FALSE | - |
| newton | character | exact | exact, approx | - |
| newton.thresh | numeric | 1e-06 | - | \([0, \infty)\) |
| max.newton.iter | integer | NULL | - | \([1, \infty)\) |
| early.stop | logical | FALSE | TRUE, FALSE | - |
| ic.scale | numeric | 1 | - | \([0, \infty)\) |
| num.threads | integer | 0 | - | \([0, \infty)\) |
| seed | integer | 1 | - | \((-\infty, \infty)\) |
Initial parameter values
num.threads: This parameter is initialized to 1 (default is 0) to avoid conflicts with the mlr3 parallelization.
family: Depends on the task type: if the parameter family is NULL, it is set to "binomial" for binary classification tasks and to "multinomial" for multiclass classification problems.
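The following is a minimal usage sketch, not part of the generated documentation: it constructs the learner while overriding a few of the hyperparameters from the table above. The chosen values (cross-validation tuning with 10 folds and candidate support sizes 0 to 5) are illustrative assumptions, not recommended settings.

library(mlr3)
library(mlr3extralearners)

# construct the learner and override selected hyperparameters
learner = lrn("regr.abess",
  tune.type    = "cv",   # choose the support size by cross-validation
  nfolds       = 10,     # number of cross-validation folds
  support.size = 0:5     # candidate support sizes passed through to abess()
)

# inspect the parameter values that will be passed to abess::abess()
learner$param_set$values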
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrAbess
Methods
Inherited methods
Method selected_features()
Extracts the names of the selected features from the model via abess::extract().
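A minimal, self-contained sketch of calling this method (illustrative only; the learner must be trained first, otherwise an error is raised):

# train on a built-in task, then extract the names of the selected features
learner = lrn("regr.abess")
learner$train(tsk("mtcars"))
learner$selected_features()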
Examples
# Define the Learner
learner = lrn("regr.abess")
print(learner)
#>
#> ── <LearnerRegrAbess> (regr.abess): Fast Best Subset Selection for Regression ──
#> • Model: -
#> • Parameters: num.threads=1
#> • Packages: mlr3 and abess
#> • Predict Types: [response]
#> • Feature Types: integer and numeric
#> • Encapsulation: none (fallback: -)
#> • Properties: selected_features and weights
#> • Other settings: use_weights = 'use'
# Define a Task
task = tsk("mtcars")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> Call:
#> abess.default(x = task$data(cols = task$feature_names), y = as.matrix(task$data(cols = task$target_names)),
#> num.threads = 1L)
#>
#> support.size dev GIC
#> 1 0 16.664104 59.07840
#> 2 1 2.917828 25.05120
#> 3 2 1.547203 14.29257
#> 4 3 1.407604 14.87038
#> 5 4 1.325287 16.16848
#> 6 5 1.312149 18.52284
#> 7 6 1.300120 20.89300
#> 8 7 1.292767 23.33747
#> 9 8 1.291267 25.87665
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> regr.mse
#> 12.53691
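As a continuation of the example above (output omitted because it depends on the random train/test split), one could also inspect the features selected by abess and score the predictions with an explicit measure such as RMSE instead of the default regr.mse:

# features chosen by the best-subset selection
learner$selected_features()
# score with root mean squared error
predictions$score(msr("regr.rmse"))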