Partial least squares regression.
Calls pls::plsr() from package pls.
Parameters
| Id | Type | Default | Levels | Range |
| --- | --- | --- | --- | --- |
| ncomp | integer | - | - | \([1, \infty)\) |
| method | character | kernelpls | kernelpls, widekernelpls, simpls, oscorespls | - |
| scale | logical | TRUE | TRUE, FALSE | - |
| center | logical | TRUE | TRUE, FALSE | - |
| validation | character | none | none, CV, LOO | - |
| model | logical | TRUE | TRUE, FALSE | - |
| x | logical | FALSE | TRUE, FALSE | - |
| y | logical | FALSE | TRUE, FALSE | - |
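A minimal sketch of configuring some of these hyperparameters at construction time; the values chosen below (ncomp = 3, method = "simpls", validation = "CV") are illustrative assumptions, not recommended defaults.

# Sketch: set hyperparameters when constructing the learner
library(mlr3)
library(mlr3extralearners)

learner = lrn(
  "regr.plsr",
  ncomp = 3,            # number of latent components to extract
  method = "simpls",    # estimation algorithm passed on to pls::plsr()
  scale = TRUE,         # standardize predictors before fitting
  validation = "CV"     # internal cross-validation performed by pls
)
learner$param_set$values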
References
Mevik, Bjørn-Helge, Wehrens, Ron (2007). “The pls Package: Principal Component and Partial Least Squares Regression in R.” Journal of Statistical Software, 18(2), 1–24. doi:10.18637/jss.v018.i02.
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
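As a hedged sketch of the last point, ncomp can be tuned with mlr3tuning; the search range, resampling scheme, and grid resolution below are assumptions for illustration only.

library(mlr3)
library(mlr3extralearners)
library(mlr3tuning)

# Tune the number of components over an assumed range of 1 to 5
learner = lrn("regr.plsr", ncomp = to_tune(1, 5))
instance = tune(
  tuner = tnr("grid_search", resolution = 5),
  task = tsk("mtcars"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("regr.mse")
)
instance$result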
Super classes
mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrPlsr
Methods
Inherited methods
mlr3::Learner$base_learner(), mlr3::Learner$configure(), mlr3::Learner$encapsulate(), mlr3::Learner$format(), mlr3::Learner$help(), mlr3::Learner$predict(), mlr3::Learner$predict_newdata(), mlr3::Learner$print(), mlr3::Learner$reset(), mlr3::Learner$selected_features(), mlr3::Learner$train(), mlr3::LearnerRegr$predict_newdata_fast()
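For instance, the inherited $predict_newdata() method accepts a plain data.frame without constructing a Task; the rows used below are an arbitrary illustration.

learner = lrn("regr.plsr")
learner$train(tsk("mtcars"))
# Predict on new observations supplied as a data.frame (target column not required)
learner$predict_newdata(mtcars[1:3, setdiff(names(mtcars), "mpg")])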
Examples
# Define the Learner
learner = lrn("regr.plsr")
print(learner)
#>
#> ── <LearnerRegrPlsr> (regr.plsr): Partial Least Squares Regression ─────────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and pls
#> • Predict Types: [response]
#> • Feature Types: integer, numeric, and factor
#> • Encapsulation: none (fallback: -)
#> • Properties:
#> • Other settings: use_weights = 'error'
# Define a Task
task = tsk("mtcars")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> Partial least squares regression, fitted with the kernel algorithm.
#> Call:
#> plsr(formula = formula, data = task$data())
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> regr.mse
#> 16.65331
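A hedged follow-up sketch: the same learner evaluated with 3-fold cross-validation instead of a single train/test split (the fold count and measure are assumptions).

rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msr("regr.mse"))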