Density Mixed Data Kernel Learner
mlr_learners_dens.mixed.Rd
Density estimator for discrete and continuous variables.
Calls np::npudens() from np.
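For orientation, a minimal sketch of a direct np call of the kind this learner wraps; the exact arguments passed internally are an assumption here, see np::npudensbw() and np::npudens() for details.
library(np)
dat = datasets::faithful
# Bandwidth selection: likelihood cross-validation, Gaussian kernel for continuous variables
bw = npudensbw(dat = dat, ckertype = "gaussian", bwmethod = "cv.ml")
# Joint density estimate; tdat = training points, edat = evaluation points
fit = npudens(bws = bw, tdat = dat, edat = dat)
head(fit$dens)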
Meta Information
Task type: “dens”
Predict Types: “pdf”
Feature Types: “integer”, “numeric”
Required Packages: mlr3, mlr3proba, mlr3extralearners, np
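This meta information can also be read off the constructed learner object; a short illustration (assuming mlr3proba and mlr3extralearners are loaded, as in the Examples below):
learner = mlr3::lrn("dens.mixed")
learner$packages
learner$feature_types
learner$predict_types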
Parameters
Id | Type | Default | Levels | Range |
bws | untyped | - | - | - |
ckertype | character | gaussian | gaussian, epanechnikov, uniform | - |
bwscaling | logical | FALSE | TRUE, FALSE | - |
bwmethod | character | cv.ml | cv.ml, cv.ls, normal-reference | - |
bwtype | character | fixed | fixed, generalized_nn, adaptive_nn | - |
bandwidth.compute | logical | FALSE | TRUE, FALSE | - |
ckerorder | integer | 2 | - | \([2, 8]\) |
remin | logical | TRUE | TRUE, FALSE | - |
itmax | integer | 10000 | - | \([1, \infty)\) |
nmulti | integer | - | - | \([1, \infty)\) |
ftol | numeric | 1.490116e-07 | - | \((-\infty, \infty)\) |
tol | numeric | 0.0001490116 | - | \((-\infty, \infty)\) |
small | numeric | 1.490116e-05 | - | \((-\infty, \infty)\) |
lbc.dir | numeric | 0.5 | - | \((-\infty, \infty)\) |
dfc.dir | numeric | 0.5 | - | \((-\infty, \infty)\) |
cfac.dir | untyped | 2.5 * (3 - sqrt(5)) | - | - |
initc.dir | numeric | 1 | - | \((-\infty, \infty)\) |
lbd.dir | numeric | 0.1 | - | \((-\infty, \infty)\) |
hbd.dir | numeric | 1 | - | \((-\infty, \infty)\) |
dfac.dir | untyped | 0.25 * (3 - sqrt(5)) | - | - |
initd.dir | numeric | 1 | - | \((-\infty, \infty)\) |
lbc.init | numeric | 0.1 | - | \((-\infty, \infty)\) |
hbc.init | numeric | 2 | - | \((-\infty, \infty)\) |
cfac.init | numeric | 0.5 | - | \((-\infty, \infty)\) |
lbd.init | numeric | 0.1 | - | \((-\infty, \infty)\) |
hbd.init | numeric | 0.9 | - | \((-\infty, \infty)\) |
dfac.init | numeric | 0.37 | - | \((-\infty, \infty)\) |
ukertype | character | - | aitchisonaitken, liracine | - |
okertype | character | - | wangvanryzin, liracine | - |
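These hyperparameters can be set when constructing the learner; a sketch with arbitrary illustrative values (not recommendations):
learner = mlr3::lrn("dens.mixed",
  ckertype = "epanechnikov",
  bwmethod = "cv.ls",
  ckerorder = 4L
)
learner$param_set$values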
References
Li, Qi, Racine, Jeff (2003). “Nonparametric estimation of distributions with categorical and continuous data.” Journal of Multivariate Analysis, 86(2), 266–292.
See also
as.data.table(mlr_learners)
for a table of available Learners in the running session (depending on the loaded packages); a short lookup sketch follows this list.
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
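As a quick illustration of the first pointer above (assuming mlr3proba and mlr3extralearners are loaded so that density learners are registered):
library(mlr3)
learners = as.data.table(mlr_learners)
# Keep only density estimation learners and show their keys and required packages
learners[task_type == "dens", .(key, packages)]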
Super classes
mlr3::Learner -> mlr3proba::LearnerDens -> LearnerDensMixed
Examples
# Define the Learner
learner = mlr3::lrn("dens.mixed")
print(learner)
#> <LearnerDensMixed:dens.mixed>: Kernel Density Estimator
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3proba, mlr3extralearners, np
#> * Predict Types: [pdf]
#> * Feature Types: integer, numeric
#> * Properties: -
# Define a Task
task = mlr3::tsk("faithful")
# Create train and test set
ids = mlr3::partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> MixedKDE_gaussian()
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
#> Multistart 1 of 1 |
# Score the predictions
predictions$score()
#> dens.logloss
#> 1.080301
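One possible extension of the example above (not part of the original output) is to estimate the log-loss by cross-validation instead of a single train/test split:
# 3-fold cross-validation of the same learner and task, aggregated with dens.logloss
resampling = mlr3::rsmp("cv", folds = 3)
rr = mlr3::resample(task, learner, resampling)
rr$aggregate(mlr3::msr("dens.logloss"))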