Categorical Regression Splines. Calls crs::crs() from package crs.

Dictionary

This Learner can be instantiated via lrn():

lrn("regr.crs")

Meta Information

  • Task type: “regr”

  • Predict Types: “response”, “se”

  • Feature Types: “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, crs
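Because the learner supports the "se" predict type, standard errors can be requested alongside the mean response. A minimal sketch, assuming mlr3 and mlr3extralearners are installed:

```r
library(mlr3)
library(mlr3extralearners)

# request standard errors in addition to the response
learner = lrn("regr.crs", predict_type = "se", cv = "none")
task = tsk("mtcars")
learner$train(task)
pred = learner$predict(task)
head(pred$se)  # per-observation standard errors
```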

Parameters

| Id | Type | Default | Levels | Range |
|---|---|---|---|---|
| degree | integer | 3 | - | \([0, \infty)\) |
| segments | integer | 1 | - | \([1, \infty)\) |
| include | integer | - | - | \((-\infty, \infty)\) |
| lambda | untyped | - | - | - |
| lambda.discrete | logical | FALSE | TRUE, FALSE | - |
| lambda.discrete.num | integer | 100 | - | \([0, \infty)\) |
| cv | character | nomad | nomad, exhaustive, none | - |
| cv.threshold | integer | 1000 | - | \([0, \infty)\) |
| cv.func | character | cv.ls | cv.ls, cv.gcv, cv.aic | - |
| kernel | logical | TRUE | TRUE, FALSE | - |
| degree.max | integer | 10 | - | \([0, \infty)\) |
| segments.max | integer | 10 | - | \([1, \infty)\) |
| degree.min | integer | 0 | - | \([0, \infty)\) |
| segments.min | integer | 1 | - | \([1, \infty)\) |
| cv.df.min | integer | 1 | - | \((-\infty, \infty)\) |
| complexity | character | degree-knots | degree-knots, degree, knots | - |
| knots | character | quantiles | quantiles, uniform, auto | - |
| basis | character | auto | auto, additive, tensor, glp | - |
| prune | logical | FALSE | TRUE, FALSE | - |
| restarts | integer | 0 | - | \([0, \infty)\) |
| nmulti | integer | 5 | - | \([0, \infty)\) |
| singular.ok | logical | FALSE | TRUE, FALSE | - |
| deriv | integer | 0 | - | \([0, \infty)\) |
| data.return | logical | FALSE | TRUE, FALSE | - |
| model.return | logical | FALSE | TRUE, FALSE | - |
| random.seed | integer | - | - | \((-\infty, \infty)\) |
| tau | numeric | - | - | \([0, 1]\) |
| initial.mesh.size.real | untyped | - | - | - |
| initial.mesh.size.integer | untyped | - | - | - |
| max.bb.eval | untyped | - | - | - |
| min.mesh.size.real | untyped | - | - | - |
| min.mesh.size.integer | untyped | - | - | - |
| min.frame.size.real | untyped | - | - | - |
| min.frame.size.integer | untyped | - | - | - |
| display.nomad.progress | logical | TRUE | TRUE, FALSE | - |
| display.warnings | logical | TRUE | TRUE, FALSE | - |
| opts | untyped | - | - | - |
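Hyperparameters from the table above can be set at construction via lrn() or afterwards through the learner's param_set. A short sketch, assuming mlr3 and mlr3extralearners are installed:

```r
library(mlr3)
library(mlr3extralearners)

# fix the spline degree and knot segments, disabling cross-validated selection
learner = lrn("regr.crs", cv = "none", degree = 3, segments = 2)

# parameters can also be changed after construction
learner$param_set$values$basis = "additive"
learner$param_set$values
```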

Author

annanzrv

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrCrs

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerRegrCrs$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrCrs$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("regr.crs", cv = "none")
print(learner)
#> 
#> ── <LearnerRegrCrs> (regr.crs): Regression Splines ─────────────────────────────
#> • Model: -
#> • Parameters: cv=none
#> • Packages: mlr3 and crs
#> • Predict Types: [response] and se
#> • Feature Types: integer, numeric, factor, and ordered
#> • Encapsulation: none (fallback: -)
#> • Properties: weights
#> • Other settings: use_weights = 'use', predict_raw = 'FALSE'

# Define a Task
task = tsk("mtcars")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
#> Warning:  cv="none" selected but no degree provided, using degree=rep(3,num.x): you might consider other degree settings
#> Warning:  cv="none" selected but no segments provided, using segments=rep(1,num.x): you might consider other segment settings
#> Warning:  cv="none" selected, basis="auto" changed to basis="additive": you might consider basis="tensor" etc.
#> Warning: NaNs produced

print(learner$model)
#> Call:
#> crs.formula(formula = formula, cv = "none", data = data, weights = private$.get_weights(task))

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
#> Warning: some 'x' values beyond boundary knots may cause ill-conditioned bases
#> Warning: some 'x' values beyond boundary knots may cause ill-conditioned bases
#> Warning: some 'x' values beyond boundary knots may cause ill-conditioned bases
#> Warning: some 'x' values beyond boundary knots may cause ill-conditioned bases
#> Warning: prediction from rank-deficient fit; attr(*, "non-estim") has doubtful cases
#> Warning: NaNs produced

# Score the predictions
predictions$score()
#> regr.mse 
#> 80565.21