mlr_learners_classif.catboost.Rd
Calls catboost::catboost.train from package catboost.
This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("classif.catboost")
lrn("classif.catboost")
Packages: catboost
Predict Types: response, prob
Feature Types: logical, integer, numeric, factor, ordered
Properties: importance, missings, multiclass, twoclass, weights
logging_level:
Actual default: "Verbose"
Adjusted default: "Silent"
Reason for change: consistent with other mlr3 learners

thread_count:
Actual default: -1
Adjusted default: 1
Reason for change: consistent with other mlr3 learners

allow_writing_files:
Actual default: TRUE
Adjusted default: FALSE
Reason for change: consistent with other mlr3 learners

save_snapshot:
Actual default: TRUE
Adjusted default: FALSE
Reason for change: consistent with other mlr3 learners
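The adjusted values can be overridden at construction time. A minimal sketch that restores the upstream catboost defaults listed above (assuming mlr3extralearners is installed and loaded so the learner is registered in the dictionary):

library(mlr3)
library(mlr3extralearners)

# pass parameter values to lrn() to override the adjusted defaults
learner = lrn("classif.catboost",
  logging_level = "Verbose",
  thread_count = -1,
  allow_writing_files = TRUE,
  save_snapshot = TRUE
)
learner$param_set$values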
CatBoost: unbiased boosting with categorical features. Liudmila Prokhorenkova, Gleb Gusev, Aleksandr Vorobev, Anna Veronika Dorogush and Andrey Gulin. 2017. https://arxiv.org/abs/1706.09516.
CatBoost: gradient boosting with categorical features support. Anna Veronika Dorogush, Vasily Ershov and Andrey Gulin. 2018. https://arxiv.org/abs/1810.11363.
sumny
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifCatboost
new()
Create a LearnerClassifCatboost object.
LearnerClassifCatboost$new()
importance()
The importance scores are calculated using catboost::catboost.get_feature_importance() with type = "FeatureImportance" and are returned for all features.
LearnerClassifCatboost$importance()
Returns: Named numeric().
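For illustration, a sketch of retrieving the importance scores after training; the task tsk("sonar") is only an example choice and not part of this learner's interface (catboost must be installed):

library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.catboost")
# "sonar" is an example binary classification task with numeric features
learner$train(tsk("sonar"))

# named numeric vector of feature importance scores, sorted in decreasing order
head(learner$importance())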
clone()
The objects of this class are cloneable with this method.
LearnerClassifCatboost$clone(deep = FALSE)
deep
Whether to make a deep clone.
# stop example failing with warning if package not installed
learner = suppressWarnings(mlr3::lrn("classif.catboost"))
print(learner)
#> <LearnerClassifCatboost:classif.catboost>
#> * Model: -
#> * Parameters: loss_function_twoclass=Logloss,
#>   loss_function_multiclass=MultiClass, logging_level=Silent,
#>   thread_count=1, allow_writing_files=FALSE, save_snapshot=FALSE
#> * Packages: catboost
#> * Predict Type: response
#> * Feature types: logical, integer, numeric, factor, ordered
#> * Properties: importance, missings, multiclass, twoclass, weights

# available parameters:
learner$param_set$ids()
#>  [1] "loss_function_twoclass"         "loss_function_multiclass"
#>  [3] "iterations"                     "learning_rate"
#>  [5] "random_seed"                    "l2_leaf_reg"
#>  [7] "bootstrap_type"                 "bagging_temperature"
#>  [9] "subsample"                      "sampling_frequency"
#> [11] "sampling_unit"                  "mvs_reg"
#> [13] "random_strength"                "depth"
#> [15] "grow_policy"                    "min_data_in_leaf"
#> [17] "max_leaves"                     "has_time"
#> [19] "rsm"                            "nan_mode"
#> [21] "fold_permutation_block"         "leaf_estimation_method"
#> [23] "leaf_estimation_iterations"     "leaf_estimation_backtracking"
#> [25] "fold_len_multiplier"            "approx_on_full_history"
#> [27] "class_weights"                  "auto_class_weights"
#> [29] "boosting_type"                  "boost_from_average"
#> [31] "langevin"                       "diffusion_temperature"
#> [33] "score_function"                 "monotone_constraints"
#> [35] "feature_weights"                "first_feature_use_penalties"
#> [37] "penalties_coefficient"          "per_object_feature_penalties"
#> [39] "model_shrink_rate"              "model_shrink_mode"
#> [41] "target_border"                  "border_count"
#> [43] "feature_border_type"            "per_float_feature_quantization"
#> [45] "classes_count"                  "thread_count"
#> [47] "task_type"                      "devices"
#> [49] "logging_level"                  "metric_period"
#> [51] "train_dir"                      "model_size_reg"
#> [53] "allow_writing_files"            "save_snapshot"
#> [55] "snapshot_file"                  "snapshot_interval"
#> [57] "simple_ctr"                     "combinations_ctr"
#> [59] "ctr_target_border_count"        "counter_calc_method"
#> [61] "max_ctr_complexity"             "ctr_leaf_count_limit"
#> [63] "store_all_simple_ctr"           "final_ctr_computation_mode"
#> [65] "verbose"                        "ntree_start"
#> [67] "ntree_end"
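A sketch of a complete train/predict round trip, assuming the catboost package is installed; the task tsk("sonar") and the holdout split are illustrative choices, not part of this documentation:

library(mlr3)
library(mlr3extralearners)

# request probability predictions (see Predict Types above)
learner = lrn("classif.catboost", predict_type = "prob")

# "sonar" is an example binary task; partition() creates a train/test split
task = tsk("sonar")
split = partition(task)
learner$train(task, row_ids = split$train)
prediction = learner$predict(task, row_ids = split$test)
prediction$score(msr("classif.acc"))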