Shrinkage Discriminant Analysis for classification. Calls sda::sda() from package sda.

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.sda")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob” (switching to probability predictions is sketched after this list)

  • Feature Types: “integer”, “numeric”

  • Required Packages: mlr3, sda
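
The default predict type is “response”. As a minimal sketch (assuming the packages from the sketch above are attached), probability predictions can be requested at construction or switched on later:

# request class probabilities instead of hard class labels
learner = lrn("classif.sda", predict_type = "prob")

# or change the predict type on an existing learner
learner$predict_type = "prob"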

Parameters

Id            Type     Default  Levels       Range
lambda        numeric  -        -            [0, 1]
lambda.var    numeric  -        -            [0, 1]
lambda.freqs  numeric  -        -            [0, 1]
diagonal      logical  FALSE    TRUE, FALSE  -
verbose       logical  FALSE    TRUE, FALSE  -
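
Hyperparameters can be set when constructing the learner or later via its param_set. A minimal sketch with arbitrary illustrative values (following the sda package documentation, the three lambda parameters are shrinkage intensities and diagonal = TRUE switches to diagonal discriminant analysis):

# fix the correlation shrinkage intensity and use a diagonal model
learner = lrn("classif.sda", lambda = 0.5, diagonal = TRUE)

# parameters can also be changed after construction
learner$param_set$values$lambda.var = 0.1
learner$param_set$values$verbose = FALSE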

References

Ahdesmaeki, Miika, Strimmer, Korbinian (2010). “Feature selection in omics prediction problems using cat scores and false nondiscovery rate control.” The Annals of Applied Statistics, 4(1). ISSN 1932-6157, doi:10.1214/09-AOAS277, http://dx.doi.org/10.1214/09-AOAS277.

See also

Author

annanzrv

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifSda

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifSda$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifSda$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("classif.sda")
print(learner)
#> 
#> ── <LearnerClassifSda> (classif.sda): Shrinkage Discriminant Analysis ──────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and sda
#> • Predict Types: [response] and prob
#> • Feature Types: integer and numeric
#> • Encapsulation: none (fallback: -)
#> • Properties: multiclass and twoclass
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
#> Number of variables: 60 
#> Number of observations: 139 
#> Number of classes: 2 
#> 
#> Estimating optimal shrinkage intensity lambda.freq (frequencies): 1 
#> Estimating variances (pooled across classes)
#> Estimating optimal shrinkage intensity lambda.var (variance vector): 0.0229 
#> 
#> 
#> Computing inverse correlation matrix (pooled across classes)
#> Estimating optimal shrinkage intensity lambda (correlation matrix): 0.1214 

print(learner$model)
#> $regularization
#>       lambda   lambda.var lambda.freqs 
#>   0.12142978   0.02291068   1.00000000 
#> 
#> $freqs
#>   M   R 
#> 0.5 0.5 
#> 
#> $alpha
#>         M         R 
#> -3.922646  1.034842 
#> 
#> $beta
#>          V1        V10       V11       V12         V13         V14        V15
#> M  5.457839  0.7169982  4.115879  3.876623  0.09675227  0.03811005 -0.3729772
#> R -5.457839 -0.7169982 -4.115879 -3.876623 -0.09675227 -0.03811005  0.3729772
#>         V16       V17          V18       V19        V2       V20        V21
#> M -1.284837 -1.082188  0.009056281  0.827816  2.920947  1.204803  0.3593092
#> R  1.284837  1.082188 -0.009056281 -0.827816 -2.920947 -1.204803 -0.3593092
#>          V22        V23        V24        V25          V26        V27
#> M -0.1433805  0.3555718  0.6944382 -0.6193893  0.006749667  0.8439633
#> R  0.1433805 -0.3555718 -0.6944382  0.6193893 -0.006749667 -0.8439633
#>          V28        V29        V3       V30       V31        V32       V33
#> M -0.5757623 -0.1772563 -4.985091  1.443441 -1.298759 -0.2135795 -1.528227
#> R  0.5757623  0.1772563  4.985091 -1.443441  1.298759  0.2135795  1.528227
#>          V34        V35       V36       V37       V38       V39        V4
#> M -0.5383805  0.6436279 -1.867636 -1.700641  1.666297  1.484386  5.267269
#> R  0.5383805 -0.6436279  1.867636  1.700641 -1.666297 -1.484386 -5.267269
#>        V40        V41       V42       V43       V44        V45       V46
#> M -1.64679  0.9298356 -1.121012  1.148269  1.294413  0.5066828  2.568246
#> R  1.64679 -0.9298356  1.121012 -1.148269 -1.294413 -0.5066828 -2.568246
#>         V47       V48       V49        V5       V50       V51        V52
#> M  2.317575  5.076653  8.029836  1.398809 -9.362241 -9.767148 -0.5411384
#> R -2.317575 -5.076653 -8.029836 -1.398809  9.362241  9.767148  0.5411384
#>         V53       V54       V55       V56       V57       V58       V59
#> M -4.991626  2.825677 -18.71165 -0.528352 -5.488967  1.381163  15.11236
#> R  4.991626 -2.825677  18.71165  0.528352  5.488967 -1.381163 -15.11236
#>          V6        V60        V7        V8         V9
#> M  1.370712 -0.3755266 -2.618261 -4.729482 -0.2439663
#> R -1.370712  0.3755266  2.618261  4.729482  0.2439663
#> attr(,"class")
#> [1] "shrinkage"
#> 
#> attr(,"class")
#> [1] "sda"


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
#> Prediction uses 60 features.

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2898551
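
# Beyond a single train/test split, mlr3's resampling interface can be used
# to estimate generalization error; a minimal sketch with 3-fold
# cross-validation (the fold count is an arbitrary choice for illustration)
rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.ce"))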