Generalized linear model. Calls stats::glm() from base package 'stats'. For logistic regression please use mlr_learners_classif.log_reg.

Dictionary

This Learner can be instantiated via lrn():

lrn("regr.glm")

Meta Information

  • Task type: “regr”

  • Predict Types: “response”, “se” (standard-error prediction is sketched after this list)

  • Feature Types: “logical”, “integer”, “numeric”, “character”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3extralearners, 'stats'
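
The “se” predict type returns the standard errors of the fitted values alongside the point predictions. A minimal sketch, reusing the mtcars task from the examples below (the choice of task is purely illustrative):

# request standard errors in addition to the point predictions
learner = mlr3::lrn("regr.glm", predict_type = "se")
task = mlr3::tsk("mtcars")
learner$train(task)
head(learner$predict(task)$se)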

Parameters

| Id          | Type      | Default  | Levels                                                                 | Range                  |
|-------------|-----------|----------|------------------------------------------------------------------------|------------------------|
| singular.ok | logical   | TRUE     | TRUE, FALSE                                                            | -                      |
| x           | logical   | FALSE    | TRUE, FALSE                                                            | -                      |
| y           | logical   | TRUE     | TRUE, FALSE                                                            | -                      |
| model       | logical   | TRUE     | TRUE, FALSE                                                            | -                      |
| etastart    | untyped   | -        | -                                                                      | -                      |
| mustart     | untyped   | -        | -                                                                      | -                      |
| start       | untyped   | NULL     | -                                                                      | -                      |
| offset      | untyped   | -        | -                                                                      | -                      |
| family      | character | gaussian | gaussian, poisson, quasipoisson, Gamma, inverse.gaussian               | -                      |
| na.action   | character | -        | na.omit, na.pass, na.fail, na.exclude                                  | -                      |
| link        | character | -        | logit, probit, cauchit, cloglog, identity, log, sqrt, 1/mu^2, inverse  | -                      |
| epsilon     | numeric   | 1e-08    | -                                                                      | \((-\infty, \infty)\)  |
| maxit       | numeric   | 25       | -                                                                      | \((-\infty, \infty)\)  |
| trace       | logical   | FALSE    | TRUE, FALSE                                                            | -                      |
| dispersion  | untyped   | NULL     | -                                                                      | -                      |
| type        | character | link     | response, link, terms                                                  | -                      |
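
The family and link values are passed on to the corresponding family object of stats::glm(). A minimal sketch of a non-default configuration, here a Poisson model with a log link (the parameter values are illustrative):

# configure a Poisson GLM with log link instead of the gaussian default
learner = mlr3::lrn("regr.glm", family = "poisson", link = "log")
learner$param_set$values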

Initial parameter values

  • type

    • Actual default: "link"

    • Adjusted default: "response"

    • Reason for change: the response scale is the more natural scale for predictions.
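
If predictions on the link scale are required, the adjusted default can be reverted through the parameter set. A minimal sketch:

learner = mlr3::lrn("regr.glm")
learner$param_set$values$type = "link"  # back to the stats::predict.glm() default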

References

Hosmer Jr, D. W., Lemeshow, S., Sturdivant, R. X. (2013). Applied Logistic Regression, volume 398. John Wiley & Sons.

Author

salauer

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrGlm

Methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerRegrGlm$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrGlm$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = mlr3::lrn("regr.glm")
print(learner)
#> <LearnerRegrGlm:regr.glm>: Generalized Linear Regression
#> * Model: -
#> * Parameters: family=gaussian, type=response
#> * Packages: mlr3, mlr3extralearners, stats
#> * Predict Types:  [response], se
#> * Feature Types: logical, integer, numeric, character, factor, ordered
#> * Properties: weights

# Define a Task
task = mlr3::tsk("mtcars")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> 
#> Call:  stats::glm(formula = formula, family = structure(list(family = "gaussian", 
#>     link = "identity", linkfun = function (mu) 
#>     mu, linkinv = function (eta) 
#>     eta, variance = function (mu) 
#>     rep.int(1, length(mu)), dev.resids = function (y, mu, wt) 
#>     wt * ((y - mu)^2), aic = function (y, n, mu, wt, dev) 
#>     {
#>         nobs <- length(y)
#>         nobs * (log(dev/nobs * 2 * pi) + 1) + 2 - sum(log(wt))
#>     }, mu.eta = function (eta) 
#>     rep.int(1, length(eta)), initialize = expression({
#>         n <- rep.int(1, nobs)
#>         if (is.null(etastart) && is.null(start) && is.null(mustart) && 
#>             ((family$link == "inverse" && any(y == 0)) || (family$link == 
#>                 "log" && any(y <= 0)))) 
#>             stop("cannot find valid starting values: please specify some")
#>         mustart <- y
#>     }), validmu = function (mu) 
#>     TRUE, valideta = function (eta) 
#>     TRUE, dispersion = NA_real_), class = "family"), data = data)
#> 
#> Coefficients:
#> (Intercept)           am         carb          cyl         disp         drat  
#>     3.86536      2.85194      0.80641      0.25225      0.02095      1.78992  
#>        gear           hp         qsec           vs           wt  
#>    -1.35641     -0.03043      1.26418      1.56805     -4.32898  
#> 
#> Degrees of Freedom: 20 Total (i.e. Null);  10 Residual
#> Null Deviance:	    615.5 
#> Residual Deviance: 81.59 	AIC: 112.1


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#> 11.35362
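
score() without arguments uses the default regression measure, regr.mse. Any other measure from mlr3's measure dictionary can be passed explicitly; a minimal sketch with the root mean squared error (the measure choice is illustrative):

predictions$score(mlr3::msr("regr.rmse"))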