Regression Fast Nearest Neighbor Search Learner
Source: R/learner_FNN_regr_fnn.R
mlr_learners_regr.fnn.Rd
Fast Nearest Neighbour Regression.
Calls FNN::knn.reg() from FNN.
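As a rough sketch of what the learner does internally, the call below invokes FNN::knn.reg() directly on a hypothetical numeric subset of mtcars (the learner assembles the feature matrix and target from the task for you; the column choice and k are illustrative only):
library(FNN)
train_x = mtcars[1:20, c("wt", "hp")]   # hypothetical training features
test_x  = mtcars[21:32, c("wt", "hp")]  # hypothetical test features
fit = knn.reg(train = train_x, test = test_x, y = mtcars$mpg[1:20], k = 3)
fit$pred  # predicted mpg for the test rows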
Meta Information
Task type: “regr”
Predict Types: “response”
Feature Types: “integer”, “numeric”
Required Packages: mlr3, mlr3extralearners, FNN
Parameters
| Id | Type | Default | Levels | Range |
| k | integer | 1 | - | \([1, \infty)\) |
| algorithm | character | kd_tree | kd_tree, cover_tree, brute | - |
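Hyperparameters can be set at construction time or changed later via the learner's param_set; the values below are purely illustrative:
# Construct with a larger neighbourhood and brute-force search (illustrative values)
learner = lrn("regr.fnn", k = 5, algorithm = "brute")
# Change a hyperparameter on an existing learner
learner$param_set$values$k = 3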
References
Boltz, Sylvain, Debreuve, Eric, Barlaud, Michel (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking.” In Eighth International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS'07), 16–16. IEEE.
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps, e.g. to encode factor features for this learner (see the sketch after this list).
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
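Because regr.fnn only supports integer and numeric features, factor columns need to be encoded before training. A minimal sketch with mlr3pipelines (assuming the package is loaded; one-hot encoding is just one possible choice):
library(mlr3pipelines)
# Encode factor features, then pass the all-numeric result to the FNN learner
graph = po("encode") %>>% lrn("regr.fnn")
glearner = as_learner(graph)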
Super classes
mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrFNN
Methods
Inherited methods
mlr3::Learner$base_learner()
mlr3::Learner$configure()
mlr3::Learner$encapsulate()
mlr3::Learner$format()
mlr3::Learner$help()
mlr3::Learner$predict()
mlr3::Learner$predict_newdata()
mlr3::Learner$print()
mlr3::Learner$reset()
mlr3::Learner$selected_features()
mlr3::Learner$train()
mlr3::LearnerRegr$predict_newdata_fast()
Examples
# Define the Learner
learner = lrn("regr.fnn")
print(learner)
#>
#> ── <LearnerRegrFNN> (regr.fnn): Fast Nearest Neighbour ─────────────────────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3, mlr3extralearners, and FNN
#> • Predict Types: [response]
#> • Feature Types: integer and numeric
#> • Encapsulation: none (fallback: -)
#> • Properties:
#> • Other settings: use_weights = 'error'
# Define a Task
task = tsk("mtcars")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> $train
#> am carb cyl disp drat gear hp qsec vs wt
#> <num> <num> <num> <num> <num> <num> <num> <num> <num> <num>
#> 1: 1 4 6 160.0 3.90 4 110 16.46 0 2.620
#> 2: 1 4 6 160.0 3.90 4 110 17.02 0 2.875
#> 3: 1 1 4 108.0 3.85 4 93 18.61 1 2.320
#> 4: 0 1 6 258.0 3.08 3 110 19.44 1 3.215
#> 5: 0 1 6 225.0 2.76 3 105 20.22 1 3.460
#> 6: 0 2 4 146.7 3.69 4 62 20.00 1 3.190
#> 7: 0 2 4 140.8 3.92 4 95 22.90 1 3.150
#> 8: 0 4 6 167.6 3.92 4 123 18.30 1 3.440
#> 9: 0 4 6 167.6 3.92 4 123 18.90 1 3.440
#> 10: 0 4 8 472.0 2.93 3 205 17.98 0 5.250
#> 11: 1 1 4 71.1 4.22 4 65 19.90 1 1.835
#> 12: 0 1 4 120.1 3.70 3 97 20.01 1 2.465
#> 13: 0 2 8 318.0 2.76 3 150 16.87 0 3.520
#> 14: 0 2 8 304.0 3.15 3 150 17.30 0 3.435
#> 15: 0 4 8 350.0 3.73 3 245 15.41 0 3.840
#> 16: 0 2 8 400.0 3.08 3 175 17.05 0 3.845
#> 17: 1 1 4 79.0 4.08 4 66 18.90 1 1.935
#> 18: 1 2 4 120.3 4.43 5 91 16.70 0 2.140
#> 19: 1 4 8 351.0 4.22 5 264 14.50 0 3.170
#> 20: 1 6 6 145.0 3.62 5 175 15.50 0 2.770
#> 21: 1 8 8 301.0 3.54 5 335 14.60 0 3.570
#> am carb cyl disp drat gear hp qsec vs wt
#>
#> $y
#> [1] 21.0 21.0 22.8 21.4 18.1 24.4 22.8 19.2 17.8 10.4 33.9 21.5 15.5 15.2 13.3
#> [16] 19.2 27.3 26.0 15.8 19.7 15.0
#>
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> regr.mse
#> 9.663939
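# Score with an alternative measure, e.g. RMSE (illustrative; the value depends
# on the random train/test split produced by partition())
predictions$score(msr("regr.rmse"))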