
Shrinkage Discriminant Analysis for classification. Calls sda::sda() from package sda.

Dictionary

This Learner can be instantiated via lrn():

lrn("classif.sda")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “integer”, “numeric”

  • Required Packages: mlr3, sda

Parameters

Id            Type     Default  Levels        Range
lambda        numeric  -        -             \([0, 1]\)
lambda.var    numeric  -        -             \([0, 1]\)
lambda.freqs  numeric  -        -             \([0, 1]\)
diagonal      logical  FALSE    TRUE, FALSE   -
verbose       logical  FALSE    TRUE, FALSE   -
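
The three lambda parameters are estimated from the data when left unset. As a sketch (assuming the standard mlr3 convention of passing hyperparameters to lrn(); the values below are purely illustrative), the shrinkage intensities can instead be fixed at construction time:

```r
# Sketch: fix the shrinkage intensities instead of estimating them.
# Parameter names are those listed in the table above; values are
# illustrative, not recommendations.
library(mlr3)
library(mlr3extralearners)

learner = lrn("classif.sda",
  lambda       = 0.5,   # shrinkage of the correlation matrix
  lambda.var   = 0.1,   # shrinkage of the variance vector
  lambda.freqs = 0.2,   # shrinkage of the class frequencies
  diagonal     = FALSE  # FALSE: LDA-like; TRUE: diagonal covariance (DDA)
)
learner$param_set$values
```

With diagonal = TRUE the pooled covariance is restricted to its diagonal, which avoids inverting a correlation matrix and can help when features greatly outnumber observations.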

References

Ahdesmaeki, Miika, Strimmer, Korbinian (2010). “Feature selection in omics prediction problems using cat scores and false nondiscovery rate control.” The Annals of Applied Statistics, 4(1). ISSN 1932-6157, doi:10.1214/09-AOAS277.

Author

annanzrv

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifSda

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifSda$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifSda$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner
learner = lrn("classif.sda")
print(learner)
#> 
#> ── <LearnerClassifSda> (classif.sda): Shrinkage Discriminant Analysis ──────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3 and sda
#> • Predict Types: [response] and prob
#> • Feature Types: integer and numeric
#> • Encapsulation: none (fallback: -)
#> • Properties: multiclass and twoclass
#> • Other settings: use_weights = 'error', predict_raw = 'FALSE'

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
#> Number of variables: 60 
#> Number of observations: 139 
#> Number of classes: 2 
#> 
#> Estimating optimal shrinkage intensity lambda.freq (frequencies): 0.3102 
#> Estimating variances (pooled across classes)
#> Estimating optimal shrinkage intensity lambda.var (variance vector): 0.0245 
#> 
#> 
#> Computing inverse correlation matrix (pooled across classes)
#> Estimating optimal shrinkage intensity lambda (correlation matrix): 0.1099 

print(learner$model)
#> $regularization
#>       lambda   lambda.var lambda.freqs 
#>    0.1099006    0.0244882    0.3102304 
#> 
#> $freqs
#>         M         R 
#> 0.5521049 0.4478951 
#> 
#> $alpha
#>         M         R 
#> -3.773401  1.781078 
#> 
#> $beta
#>           V1        V10       V11       V12        V13       V14        V15
#> M   9.154393 -0.3691058  1.866170  2.847940  0.6527887 -1.060950 -0.5060497
#> R -11.284305  0.4549840 -2.300363 -3.510558 -0.8046702  1.307797  0.6237901
#>          V16         V17        V18        V19        V2       V20       V21
#> M -0.3300628 -0.02311985  0.5240475 -0.8063780  2.146475  0.605229  1.003024
#> R  0.4068571  0.02849904 -0.6459753  0.9939945 -2.645886 -0.746045 -1.236393
#>          V22        V23        V24       V25        V26        V27        V28
#> M -0.1742255  0.6967228  0.8952808 -0.907651 -0.8189085  0.8596603  0.5368396
#> R  0.2147617 -0.8588263 -1.1035819  1.118830  1.0094404 -1.0596737 -0.6617438
#>         V29        V3        V30       V31        V32       V33        V34
#> M -0.940819 -11.11262  0.8124123 -2.378107  0.3801640  1.240275 -0.1257663
#> R  1.159715  13.69814 -1.0014328  2.931411 -0.4686151 -1.528845  0.1550278
#>          V35       V36       V37        V38        V39        V4       V40
#> M -0.3421681 -1.395254 -1.306677  0.8068621  0.7380677  11.03539 -1.875012
#> R  0.4217789  1.719882  1.610696 -0.9945912 -0.9097907 -13.60294  2.311263
#>         V41        V42        V43       V44         V45       V46       V47
#> M  2.395362 -0.7527788  0.7951606  2.309916  0.04893310  1.657572  1.288919
#> R -2.952680  0.9279245 -0.9801672 -2.847354 -0.06031815 -2.043232 -1.588806
#>         V48       V49        V5       V50       V51        V52       V53
#> M  2.158396  4.728186 -1.159483 -12.42667  3.399580 -0.4422931 -2.022444
#> R -2.660581 -5.828272  1.429255  15.31793 -4.190546  0.5451996  2.492998
#>         V54      V55       V56         V57       V58       V59        V6
#> M  3.181033 -8.42565 -6.098489  0.04598250  1.037005  1.045275  3.311865
#> R -3.921150 10.38601  7.517398 -0.05668105 -1.278281 -1.288474 -4.082422
#>         V60        V7        V8        V9
#> M -3.507230 -1.744214 -1.252518  2.355369
#> R  4.323242  2.150032  1.543936 -2.903383
#> attr(,"class")
#> [1] "shrinkage"
#> 
#> attr(,"class")
#> [1] "sda"


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
#> Prediction uses 60 features.

# Score the predictions
predictions$score()
#> classif.ce 
#>  0.2028986
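
Since the learner also supports the “prob” predict type, the example above can be extended to class probabilities. A minimal sketch (assuming the standard mlr3 predict_type construction argument and msr() measure API; the split is re-created here so the snippet stands alone):

```r
# Sketch: probability predictions and an AUC score for classif.sda.
library(mlr3)
library(mlr3extralearners)

task = tsk("sonar")
ids = partition(task)

# Request posterior probabilities instead of hard labels
learner = lrn("classif.sda", predict_type = "prob")
learner$train(task, row_ids = ids$train)

pred = learner$predict(task, row_ids = ids$test)
head(pred$prob)                  # per-class posterior probabilities
pred$score(msr("classif.auc"))   # threshold-free evaluation
```

With probabilities available, threshold-dependent metrics such as classif.ce can be complemented by ranking-based ones like AUC.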