Classification using ada.
Calls ada::ada() from ada.
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”
Required Packages: mlr3, mlr3extralearners, ada, rpart
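These fields can also be queried from a constructed learner object. A minimal sketch, assuming mlr3 and mlr3extralearners are installed and attached:

library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.ada")
learner$predict_types   # "response", "prob"
learner$feature_types   # "logical", "integer", "numeric", "factor", "ordered"
learner$packages        # the required packages listed above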
Parameters
| Id | Type | Default | Levels | Range |
| --- | --- | --- | --- | --- |
| bag.frac | numeric | 0.5 | - | \([0, 1]\) |
| bag.shift | logical | FALSE | TRUE, FALSE | - |
| cp | numeric | 0.01 | - | \([0, 1]\) |
| delta | numeric | 1e-10 | - | \([0, \infty)\) |
| iter | integer | 50 | - | \([1, \infty)\) |
| loss | character | exponential | exponential, logistic | - |
| max.iter | integer | 20 | - | \([1, \infty)\) |
| maxcompete | integer | 4 | - | \([0, \infty)\) |
| maxdepth | integer | 30 | - | \([1, 30]\) |
| maxsurrogate | integer | 5 | - | \([0, \infty)\) |
| minbucket | integer | - | - | \([1, \infty)\) |
| minsplit | integer | 20 | - | \([1, \infty)\) |
| model.coef | logical | TRUE | TRUE, FALSE | - |
| n.iter | integer | 50 | - | \([1, \infty)\) |
| nu | numeric | 0.1 | - | \([0, \infty)\) |
| surrogatestyle | integer | 0 | - | \([0, 1]\) |
| type | character | discrete | discrete, real, gentle | - |
| usesurrogate | integer | 2 | - | \([0, 2]\) |
| verbose | logical | FALSE | TRUE, FALSE | - |
| xval | integer | 0 | - | \([0, \infty)\) |
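Hyperparameters can be passed at construction or changed later through the learner's parameter set. A minimal sketch; the chosen values are purely illustrative:

library(mlr3)
library(mlr3extralearners)

# Set hyperparameters when constructing the learner ...
learner = lrn("classif.ada", iter = 100, nu = 0.05, type = "real")

# ... or update them afterwards via the parameter set
learner$param_set$set_values(maxdepth = 10)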
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifAdaBoosting
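The inheritance chain is also visible in the R6 class attribute of a constructed learner. A minimal sketch:

library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.ada")
class(learner)  # includes "LearnerClassifAdaBoosting", "LearnerClassif", "Learner", "R6"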
Methods
Inherited methods
mlr3::Learner$base_learner()
mlr3::Learner$configure()
mlr3::Learner$encapsulate()
mlr3::Learner$format()
mlr3::Learner$help()
mlr3::Learner$predict()
mlr3::Learner$predict_newdata()
mlr3::Learner$print()
mlr3::Learner$reset()
mlr3::Learner$selected_features()
mlr3::Learner$train()
mlr3::LearnerClassif$predict_newdata_fast()
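As an illustration of one inherited method, the following minimal sketch uses mlr3::Learner$predict_newdata() to predict on data that is not part of a Task; the row selection is arbitrary and only for illustration:

library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
learner = lrn("classif.ada")$train(task)

# Predict on plain new data held in a data.frame / data.table
newdata = task$data(rows = 1:5)
learner$predict_newdata(newdata)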
Method oob_error()
The out-of-bag (OOB) error is extracted from the model slot $model$errs.
Returns
Named numeric().
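After training, the method can be called directly on the learner. A minimal sketch, assuming mlr3 and mlr3extralearners are attached:

library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
learner = lrn("classif.ada")
learner$train(task)

# Extract the out-of-bag error from the fitted ada model
learner$oob_error()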
Examples
# Load the packages providing the framework and the learner
library(mlr3)
library(mlr3extralearners)

# Define the Learner
learner = lrn("classif.ada")
print(learner)
#>
#> ── <LearnerClassifAdaBoosting> (classif.ada): ada Boosting ─────────────────────
#> • Model: -
#> • Parameters: xval=0
#> • Packages: mlr3, mlr3extralearners, ada, and rpart
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: missings, oob_error, and twoclass
#> • Other settings: use_weights = 'error'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> Call:
#> ada(task$formula(), data = task$data(), control = list(xval = 0L))
#>
#> Loss: exponential Method: discrete Iteration: 50
#>
#> Final Confusion Matrix for Data:
#> Final Prediction
#> True value M R
#> M 76 1
#> R 3 59
#>
#> Train Error: 0.029
#>
#> Out-Of-Bag Error: 0.043 iteration= 50
#>
#> Additional Estimates of number of iterations:
#>
#> train.err1 train.kap1
#> 39 39
#>
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.1449275
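Because the learner supports the "prob" predict type, probability predictions and probability-based measures can be used as well. A minimal sketch continuing the example above; the choice of measure is illustrative:

# A fresh learner configured for probability predictions
prob_learner = lrn("classif.ada", predict_type = "prob")
prob_learner$train(task, row_ids = ids$train)

# Predict class probabilities on the test rows
prob_predictions = prob_learner$predict(task, row_ids = ids$test)

# Score with a probability-based measure such as the AUC
prob_predictions$score(msr("classif.auc"))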