Survival partition tree in which a significance test is used to determine the univariate splits. Calls partykit::ctree() from package partykit.

Prediction types

This learner returns two prediction types:

  1. distr: a two-dimensional survival matrix, with observations in the rows and time points in the columns. Calculated using the internal partykit::predict.party() function.

  2. crank: the expected mortality using mlr3proba::.surv_return().
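
As a minimal sketch of how these two prediction slots can be inspected (assuming the built-in mlr3proba task "rats"; $crank and $distr are the standard fields of an mlr3proba PredictionSurv object):

library(mlr3)
library(mlr3proba)
library(mlr3extralearners)

task = tsk("rats")
learner = lrn("surv.ctree")
learner$train(task)
p = learner$predict(task)

head(p$crank)  # one ranking score per observation; higher values indicate higher expected risk
p$distr        # distribution object wrapping the survival matrix described above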

Dictionary

This Learner can be instantiated via lrn():

lrn("surv.ctree")
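
Hyperparameters from the Parameters table below can also be set directly at construction; a brief sketch (the values used here are arbitrary and purely illustrative):

library(mlr3)
library(mlr3extralearners)

# equivalent to fetching the learner from the dictionary: mlr_learners$get("surv.ctree")
learner = lrn("surv.ctree")

# pass hyperparameters at construction
learner = lrn("surv.ctree", alpha = 0.01, minbucket = 20)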

Meta Information

  * Task type: "surv"
  * Predict Types: "crank", "distr"
  * Feature Types: "integer", "numeric", "factor", "ordered"
  * Required Packages: mlr3, mlr3proba, mlr3extralearners, partykit, coin, sandwich

Parameters

Id | Type | Default | Levels | Range
teststat | character | quadratic | quadratic, maximum | -
splitstat | character | quadratic | quadratic, maximum | -
splittest | logical | FALSE | TRUE, FALSE | -
testtype | character | Bonferroni | Bonferroni, MonteCarlo, Univariate, Teststatistic | -
nmax | untyped | - | - | -
alpha | numeric | 0.05 | - | [0, 1]
mincriterion | numeric | 0.95 | - | [0, 1]
logmincriterion | numeric | - | - | (-∞, ∞)
minsplit | integer | 20 | - | [1, ∞)
minbucket | integer | 7 | - | [1, ∞)
minprob | numeric | 0.01 | - | [0, ∞)
stump | logical | FALSE | TRUE, FALSE | -
lookahead | logical | FALSE | TRUE, FALSE | -
MIA | logical | FALSE | TRUE, FALSE | -
nresample | integer | 9999 | - | [1, ∞)
tol | numeric | - | - | [0, ∞)
maxsurrogate | integer | 0 | - | [0, ∞)
numsurrogate | logical | FALSE | TRUE, FALSE | -
mtry | integer | Inf | - | [0, ∞)
maxdepth | integer | Inf | - | [0, ∞)
maxvar | integer | - | - | [1, ∞)
multiway | logical | FALSE | TRUE, FALSE | -
splittry | integer | 2 | - | [0, ∞)
intersplit | logical | FALSE | TRUE, FALSE | -
majority | logical | FALSE | TRUE, FALSE | -
caseweights | logical | FALSE | TRUE, FALSE | -
applyfun | untyped | - | - | -
cores | integer | NULL | - | (-∞, ∞)
saveinfo | logical | TRUE | TRUE, FALSE | -
update | logical | FALSE | TRUE, FALSE | -
splitflavour | character | ctree | ctree, exhaustive | -
offset | untyped | - | - | -
cluster | untyped | - | - | -
scores | untyped | - | - | -
doFit | logical | TRUE | TRUE, FALSE | -
maxpts | integer | 25000 | - | (-∞, ∞)
abseps | numeric | 0.001 | - | [0, ∞)
releps | numeric | 0 | - | [0, ∞)
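
Hyperparameters can also be changed after construction through the learner's param_set, using the standard mlr3 accessor; a small sketch (values are illustrative only):

library(mlr3)
library(mlr3extralearners)

learner = lrn("surv.ctree")

# replace the full set of hyperparameter values ...
learner$param_set$values = list(teststat = "maximum", maxdepth = 3)

# ... or modify a single entry in place
learner$param_set$values$alpha = 0.01
learner$param_set$values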

References

Hothorn T, Zeileis A (2015). “partykit: A Modular Toolkit for Recursive Partytioning in R.” Journal of Machine Learning Research, 16(118), 3905-3909. http://jmlr.org/papers/v16/hothorn15a.html.

Hothorn T, Hornik K, Zeileis A (2006). “Unbiased Recursive Partitioning: A Conditional Inference Framework.” Journal of Computational and Graphical Statistics, 15(3), 651–674. doi:10.1198/106186006x133933.

See also

Author

adibender

Super classes

mlr3::Learner -> mlr3proba::LearnerSurv -> LearnerSurvCTree

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerSurvCTree$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerSurvCTree$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
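
For example (a small sketch; deep = TRUE also clones nested R6 objects such as the param_set, so the copy can be configured independently of the original):

learner = mlr3::lrn("surv.ctree")
copy = learner$clone(deep = TRUE)

# changing the copy does not affect the original learner
copy$param_set$values$maxdepth = 3
learner$param_set$values$maxdepth  # still NULL, the original is unaffected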

Examples

# Define the Learner
learner = mlr3::lrn("surv.ctree")
print(learner)
#> <LearnerSurvCTree:surv.ctree>: Conditional Inference Tree
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3proba, mlr3extralearners, partykit, coin,
#>   sandwich
#> * Predict Types:  [crank], distr
#> * Feature Types: integer, numeric, factor, ordered
#> * Properties: weights

# Define a Task
task = mlr3::tsk("grace")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)
#> 
#> Model formula:
#> Surv(time, status, type = "right") ~ age + los + revasc + revascdays + 
#>     stchange + sysbp
#> 
#> Fitted party:
#> [1] root
#> |   [2] age <= 72
#> |   |   [3] revascdays <= 171
#> |   |   |   [4] revasc <= 0
#> |   |   |   |   [5] los <= 2: 2.000 (n = 24)
#> |   |   |   |   [6] los > 2
#> |   |   |   |   |   [7] age <= 62: 99.000 (n = 29)
#> |   |   |   |   |   [8] age > 62: 28.000 (n = 36)
#> |   |   |   [9] revasc > 0: Inf (n = 248)
#> |   |   [10] revascdays > 171: Inf (n = 92)
#> |   [11] age > 72
#> |   |   [12] revascdays <= 173
#> |   |   |   [13] revasc <= 0
#> |   |   |   |   [14] stchange <= 0: 67.000 (n = 32)
#> |   |   |   |   [15] stchange > 0: 20.000 (n = 64)
#> |   |   |   [16] revasc > 0: Inf (n = 104)
#> |   |   [17] revascdays > 173: Inf (n = 41)
#> 
#> Number of inner nodes:    8
#> Number of terminal nodes: 9


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> surv.cindex 
#>   0.8121094
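
The default score above is the concordance index (surv.cindex). Other survival measures provided by mlr3proba can be passed to $score() as well; a hedged sketch using the integrated Brier (Graf) score, which evaluates the distr prediction (assumes the measure key "surv.graf" registered by mlr3proba):

# Score the distr prediction with the integrated Brier / Graf score
predictions$score(mlr3::msr("surv.graf"))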