Generalized linear model for classification.
Calls h2o::h2o.glm() from package h2o, with the family argument always set to "binomial".
H2O Connection
If no running H2O connection is found, the learner will automatically start a local H2O server
on 127.0.0.1 via h2o::h2o.init().
If you want to connect to a remote H2O cluster, call h2o::h2o.init() with the appropriate
arguments before training or predicting.
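For example, to attach to an existing remote cluster before training (the host and port below are placeholders, not real endpoints):

```r
library(h2o)

# Connect to an already-running remote H2O cluster instead of
# starting a local server; startH2O = FALSE prevents h2o from
# trying to launch a new JVM on this machine.
h2o.init(ip = "cluster.example.com", port = 54321, startH2O = FALSE)
```

Once a connection is established, subsequent `$train()` and `$predict()` calls of the learner reuse it.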
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “integer”, “numeric”, “factor”
Required Packages: mlr3, mlr3extralearners, h2o
Parameters
| Id | Type | Default | Levels | Range |
|----|------|---------|--------|-------|
| alpha | numeric | 0.5 | - | \([0, 1]\) |
| balance_classes | logical | FALSE | TRUE, FALSE | - |
| beta_constraints | untyped | NULL | - | - |
| beta_epsilon | numeric | 1e-04 | - | \([0, \infty)\) |
| build_null_model | logical | FALSE | TRUE, FALSE | - |
| calc_like | logical | FALSE | TRUE, FALSE | - |
| checkpoint | untyped | NULL | - | - |
| class_sampling_factors | untyped | NULL | - | - |
| cold_start | logical | FALSE | TRUE, FALSE | - |
| compute_p_values | logical | FALSE | TRUE, FALSE | - |
| early_stopping | logical | TRUE | TRUE, FALSE | - |
| export_checkpoints_dir | untyped | NULL | - | - |
| gainslift_bins | integer | -1 | - | \([-1, \infty)\) |
| generate_scoring_history | logical | FALSE | TRUE, FALSE | - |
| generate_variable_inflation_factors | logical | FALSE | TRUE, FALSE | - |
| gradient_epsilon | numeric | -1 | - | \([0, \infty)\) |
| HGLM | logical | FALSE | TRUE, FALSE | - |
| ignore_const_cols | logical | TRUE | TRUE, FALSE | - |
| interactions | untyped | NULL | - | - |
| interaction_pairs | untyped | NULL | - | - |
| intercept | logical | TRUE | TRUE, FALSE | - |
| lambda | numeric | 1e-05 | - | \([0, \infty)\) |
| lambda_min_ratio | numeric | -1 | - | \([0, 1]\) |
| lambda_search | logical | FALSE | TRUE, FALSE | - |
| link | character | logit | family_default, logit | - |
| max_active_predictors | integer | -1 | - | \([1, \infty)\) |
| max_after_balance_size | numeric | 5 | - | \([0, \infty)\) |
| max_iterations | integer | -1 | - | \([0, \infty)\) |
| max_runtime_secs | numeric | 0 | - | \([0, \infty)\) |
| missing_values_handling | character | MeanImputation | MeanImputation, Skip, PlugValues | - |
| nlambdas | integer | -1 | - | \([1, \infty)\) |
| non_negative | logical | FALSE | TRUE, FALSE | - |
| objective_epsilon | numeric | -1 | - | \([0, \infty)\) |
| obj_reg | numeric | -1 | - | \([0, \infty)\) |
| plug_values | untyped | NULL | - | - |
| prior | numeric | -1 | - | \([0, \infty)\) |
| random_columns | untyped | NULL | - | - |
| remove_collinear_columns | logical | FALSE | TRUE, FALSE | - |
| score_each_iteration | logical | FALSE | TRUE, FALSE | - |
| score_iteration_interval | integer | -1 | - | \((-\infty, \infty)\) |
| seed | integer | -1 | - | \((-\infty, \infty)\) |
| solver | character | AUTO | AUTO, IRLSM, L_BFGS, COORDINATE_DESCENT, COORDINATE_DESCENT_NAIVE | - |
| standardize | logical | TRUE | TRUE, FALSE | - |
| startval | untyped | NULL | - | - |
| stopping_metric | character | AUTO | AUTO, logloss, AUC, AUCPR, lift_top_group, misclassification, mean_per_class_error | - |
| stopping_rounds | integer | 0 | - | \([0, \infty)\) |
| stopping_tolerance | numeric | 0.001 | - | \([0, \infty)\) |
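Parameters from the table above can be set at construction. A small sketch configuring the elastic-net penalty (the specific values here are illustrative, not recommended defaults):

```r
library(mlr3)
library(mlr3extralearners)

# alpha = 1 gives a pure LASSO penalty; lambda_search lets H2O
# fit a path of nlambdas regularization strengths and pick one.
learner = lrn("classif.h2o.glm",
  alpha = 1,
  lambda_search = TRUE,
  nlambdas = 30,
  seed = 42
)

# Inspect the configured hyperparameters
learner$param_set$values
```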
References
Fryda T, LeDell E, Gill N, Aiello S, Fu A, Candel A, Click C, Kraljevic T, Nykodym T, Aboyoun P, Kurka M, Malohlava M, Poirier S, Wong W (2025). h2o: R Interface for the 'H2O' Scalable Machine Learning Platform. R package version 3.46.0.9, https://github.com/h2oai/h2o-3.
See also
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
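As a sketch of how this learner combines with mlr3tuning, the elastic-net mixing parameter alpha can be tuned over a grid (resampling and grid resolution here are illustrative choices):

```r
library(mlr3)
library(mlr3extralearners)
library(mlr3tuning)

# Mark alpha as tunable over [0, 1] at construction
learner = lrn("classif.h2o.glm",
  alpha = to_tune(0, 1)
)

# Grid search over alpha with 3-fold cross-validation,
# minimizing classification error
instance = tune(
  tuner = tnr("grid_search", resolution = 5),
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce")
)

# Best alpha found and its cross-validated error
instance$result
```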
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifH2OGLM
Methods
Inherited methods
mlr3::Learner$base_learner(), mlr3::Learner$configure(), mlr3::Learner$encapsulate(), mlr3::Learner$format(), mlr3::Learner$help(), mlr3::Learner$predict(), mlr3::Learner$predict_newdata(), mlr3::Learner$print(), mlr3::Learner$reset(), mlr3::Learner$selected_features(), mlr3::Learner$train(), mlr3::LearnerClassif$predict_newdata_fast()
Examples
# Define the Learner
learner = lrn("classif.h2o.glm")
print(learner)
#>
#> ── <LearnerClassifH2OGLM> (classif.h2o.glm): H2O GLM ───────────────────────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3, mlr3extralearners, and h2o
#> • Predict Types: [response] and prob
#> • Feature Types: integer, numeric, and factor
#> • Encapsulation: none (fallback: -)
#> • Properties: missings, twoclass, and weights
#> • Other settings: use_weights = 'use', predict_raw = 'FALSE'
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
print(learner$model)
#> Model Details:
#> ==============
#>
#> H2OBinomialModel: glm
#> Model ID: GLM_model_R_1776854555489_56
#> GLM Model: summary
#> family link regularization
#> 1 binomial logit Elastic Net (alpha = 0.5, lambda = 0.04547 )
#> number_of_predictors_total number_of_active_predictors number_of_iterations
#> 1 60 26 6
#> training_frame
#> 1 data_sid_97c2_11
#>
#> Coefficients: glm coefficients
#> names coefficients standardized_coefficients
#> 1 Intercept 3.153003 -0.522736
#> 2 V1 -4.121202 -0.103683
#> 3 V10 0.000000 0.000000
#> 4 V11 -2.751040 -0.347959
#> 5 V12 -3.604959 -0.484310
#>
#> ---
#> names coefficients standardized_coefficients
#> 56 V59 -11.881581 -0.073796
#> 57 V6 0.000000 0.000000
#> 58 V60 0.000000 0.000000
#> 59 V7 0.000000 0.000000
#> 60 V8 0.000000 0.000000
#> 61 V9 -1.437604 -0.173940
#>
#> H2OBinomialMetrics: glm
#> ** Reported on training data. **
#>
#> MSE: 0.1229228
#> RMSE: 0.3506034
#> LogLoss: 0.3999967
#> Mean Per-Class Error: 0.1577267
#> AUC: 0.9208174
#> AUCPR: 0.912105
#> Gini: 0.8416347
#> R^2: 0.4944677
#> Residual Deviance: 111.1991
#> AIC: 165.1991
#>
#> Confusion Matrix (vertical: actual; across: predicted) for F1-optimal threshold:
#> M R Error Rate
#> M 75 6 0.074074 =6/81
#> R 14 44 0.241379 =14/58
#> Totals 89 50 0.143885 =20/139
#>
#> Maximum Metrics: Maximum metrics at their respective thresholds
#> metric threshold value idx
#> 1 max f1 0.551031 0.814815 49
#> 2 max f2 0.278398 0.887850 88
#> 3 max f0point5 0.590469 0.876068 43
#> 4 max accuracy 0.590469 0.856115 43
#> 5 max precision 0.948568 1.000000 0
#> 6 max recall 0.056209 1.000000 128
#> 7 max specificity 0.948568 1.000000 0
#> 8 max absolute_mcc 0.590469 0.710153 43
#> 9 max min_per_class_accuracy 0.429437 0.827160 61
#> 10 max mean_per_class_accuracy 0.551031 0.842273 49
#> 11 max tns 0.948568 81.000000 0
#> 12 max fns 0.948568 57.000000 0
#> 13 max fps 0.005597 81.000000 138
#> 14 max tps 0.056209 58.000000 128
#> 15 max tnr 0.948568 1.000000 0
#> 16 max fnr 0.948568 0.982759 0
#> 17 max fpr 0.005597 1.000000 138
#> 18 max tpr 0.056209 1.000000 128
#>
#> Gains/Lift Table: Extract with `h2o.gainsLift(<model>, <data>)` or `h2o.gainsLift(<model>, valid=<T/F>, xval=<T/F>)`
#>
#>
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.2463768
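Since the learner also supports the "prob" predict type, the same workflow can produce class probabilities and be scored with a probability-based measure such as AUC (output omitted, as it depends on the random train/test split):

```r
# Switch to probability predictions before training
learner$predict_type = "prob"
learner$train(task, row_ids = ids$train)

predictions = learner$predict(task, row_ids = ids$test)

# Score with area under the ROC curve instead of classification error
predictions$score(msr("classif.auc"))
```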