Stochastic Gradient Descent for learning various linear models. Calls RWeka::make_Weka_classifier() from RWeka.

Initial parameter values

  • F:

    • Has only 3 of the 5 original loss functions: 2 = squared loss (regression), 3 = epsilon-insensitive loss (regression), and 4 = Huber loss (regression), with 2 (squared loss) as the new default

    • Reason for change: this learner should only contain loss functions appropriate for regression tasks
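The loss function is selected through the `F` parameter using the numeric codes above, passed as a character value. A minimal sketch (assumes mlr3, mlr3extralearners, RWeka, and a working Java installation are available):

```r
library(mlr3)

# Construct the learner with Huber loss (code "4") instead of the
# default squared loss (code "2"); F takes a character value
learner = lrn("regr.sgd", F = "4")

# Inspect the configured parameter
learner$param_set$values$F
```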

Custom mlr3 parameters

  • output_debug_info:

    • original id: output-debug-info

  • do_not_check_capabilities:

    • original id: do-not-check-capabilities

  • num_decimal_places:

    • original id: num-decimal-places

  • batch_size:

    • original id: batch-size

  • Reason for change: the ids of these control arguments were changed because their original ids contain an irregular hyphenated pattern
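The renamed control arguments are set using their mlr3 ids (underscores), not Weka's original hyphenated ids. A short sketch (assumes the required packages are installed):

```r
library(mlr3)

# Use the mlr3 ids: batch_size, not Weka's original "batch-size"
learner = lrn("regr.sgd", batch_size = 50, num_decimal_places = 4)

# The values are stored under the mlr3 ids
learner$param_set$values
```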

Dictionary

This Learner can be instantiated via lrn():

lrn("regr.sgd")

Meta Information

  • Task type: “regr”

  • Predict Types: “response”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, RWeka

Parameters

Id                          Type       Default   Levels        Range
subset                      untyped    -         -             -
na.action                   untyped    -         -             -
F                           character  2         2, 3, 4       -
L                           numeric    0.01      -             (-∞, ∞)
R                           numeric    1e-04     -             (-∞, ∞)
E                           integer    500       -             (-∞, ∞)
C                           numeric    0.001     -             (-∞, ∞)
N                           logical    -         TRUE, FALSE   -
M                           logical    -         TRUE, FALSE   -
S                           integer    1         -             (-∞, ∞)
output_debug_info           logical    FALSE     TRUE, FALSE   -
do_not_check_capabilities   logical    FALSE     TRUE, FALSE   -
num_decimal_places          integer    2         -             [1, ∞)
batch_size                  integer    100       -             [1, ∞)
options                     untyped    NULL      -             -

See also

Author

damirpolat

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrSGD

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerRegrSGD$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrSGD$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = mlr3::lrn("regr.sgd")
print(learner)
#> <LearnerRegrSGD:regr.sgd>: Stochastic Gradient Descent
#> * Model: -
#> * Parameters: F=2
#> * Packages: mlr3, RWeka
#> * Predict Types:  [response]
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: -

# Define a Task
task = mlr3::tsk("mtcars")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> Loss function: Squared loss (linear regression)
#> 
#> mpg = 
#> 
#>          4.8808 (normalized) am
#>  +      -5.2087 (normalized) carb
#>  +       2.2521 (normalized) cyl
#>  +      -0.3433 (normalized) disp
#>  +      -0.9993 (normalized) drat
#>  +       1.7145 (normalized) gear
#>  +      -4.3602 (normalized) hp
#>  +       5.4375 (normalized) qsec
#>  +       2.8273 (normalized) vs
#>  +      -5.6889 (normalized) wt
#>  +      18.7241


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#> 8.809233
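Measures other than the default MSE can be passed to `$score()`. A sketch continuing the example above (assumes `predictions` from the previous step):

```r
# Score the same predictions with mean absolute error instead of MSE
predictions$score(mlr3::msr("regr.mae"))
```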