Classification Metrics

Counts

| predicted \ actual | condition_positive | condition_negative |
| --- | --- | --- |
| predicted_positive | true_positives | false_positives (type_1_errors) |
| predicted_negative | false_negatives (type_2_errors) | true_negatives |

correctly_classified (true_positives + true_negatives) vs incorrectly_classified (false_positives + false_negatives)
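A minimal sketch of tallying these counts from paired actual/predicted binary labels; the function name `confusion_counts` is hypothetical, but the returned names follow the table above.

```python
def confusion_counts(actual, predicted):
    """Tally the four confusion-matrix counts from binary labels."""
    true_positives = false_positives = false_negatives = true_negatives = 0
    for a, p in zip(actual, predicted):
        if p and a:
            true_positives += 1   # predicted_positive, condition_positive
        elif p and not a:
            false_positives += 1  # type_1_error
        elif not p and a:
            false_negatives += 1  # type_2_error
        else:
            true_negatives += 1
    return true_positives, false_positives, false_negatives, true_negatives

tp, fp, fn, tn = confusion_counts([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
# correctly_classified = tp + tn; incorrectly_classified = fp + fn
```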

Fractions

| Name | Synonyms | Description |
| --- | --- | --- |
| accuracy | - | Fraction of correctly classified observations. |
| f_score | - | Weighted harmonic mean of recall and precision; generalizes f1_score. |
| f1_score | - | Harmonic mean of recall and precision. |
| prevalence | - | Fraction of truly positive outcomes in the data. |
| positive_predictive_value | precision_score | Fraction of positive predicted outcomes that are true positives. |
| negative_predictive_value | - | Fraction of negative predicted outcomes that are true negatives. |
| true_positive_rate | recall, sensitivity | Fraction of truly positive outcomes that were predicted as positive. |
| false_positive_rate | - | Fraction of truly negative outcomes that are (wrongly) predicted as positive (i.e. false positives). |
| true_negative_rate | specificity | Fraction of truly negative outcomes that were predicted as negative. |
| false_negative_rate | - | Fraction of truly positive outcomes that are (wrongly) predicted as negative (i.e. false negatives). |
| false_discovery_rate | - | Fraction of positive predicted outcomes that are false positives. |
| false_omission_rate | - | Fraction of negative predicted outcomes that are false negatives. |
| positive_likelihood_ratio | - | Ratio of the true positive rate to the false positive rate. |
| negative_likelihood_ratio | - | Ratio of the false negative rate to the true negative rate. |
| diagnostic_odds_ratio | - | Ratio of the positive likelihood ratio to the negative likelihood ratio. |
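The fractions above can be sketched directly from the four counts; this is a hypothetical helper (name `metrics` and the count arguments `tp`, `fp`, `fn`, `tn` are assumptions), and zero denominators are not guarded.

```python
def metrics(tp, fp, fn, tn):
    """Compute the fraction metrics from the confusion-matrix counts."""
    total = tp + fp + fn + tn
    tpr = tp / (tp + fn)   # true_positive_rate (recall, sensitivity)
    fpr = fp / (fp + tn)   # false_positive_rate
    tnr = tn / (fp + tn)   # true_negative_rate (specificity)
    fnr = fn / (tp + fn)   # false_negative_rate
    ppv = tp / (tp + fp)   # positive_predictive_value (precision)
    return {
        "accuracy": (tp + tn) / total,
        "prevalence": (tp + fn) / total,
        "positive_predictive_value": ppv,
        "negative_predictive_value": tn / (tn + fn),
        "true_positive_rate": tpr,
        "false_positive_rate": fpr,
        "true_negative_rate": tnr,
        "false_negative_rate": fnr,
        "false_discovery_rate": fp / (tp + fp),
        "false_omission_rate": fn / (tn + fn),
        "f1_score": 2 * ppv * tpr / (ppv + tpr),
        "positive_likelihood_ratio": tpr / fpr,
        "negative_likelihood_ratio": fnr / tnr,
        "diagnostic_odds_ratio": (tpr / fpr) / (fnr / tnr),
    }

m = metrics(tp=2, fp=1, fn=1, tn=1)
# m["accuracy"] is 3/5; m["f1_score"] is the harmonic mean of 2/3 and 2/3
```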