Classifier Evaluation Metrics: Precision, Recall, and F-measures

  • Precision: exactness – what % of tuples that the classifier labeled as positive are actually positive

\[precision=\frac{TP}{TP+FP}\]

  • Recall: completeness – what % of positive tuples did the classifier label as positive?

\[recall=\frac{TP}{TP+FN}\]

  • Perfect score is 1.0
  • Inverse relationship between precision & recall
  • F-measure (F1 or F-score): harmonic mean of precision and recall

\[F=\frac{2\times precision \times recall}{precision+recall}\]

  • Fβ: weighted measure of precision and recall
    • assigns β times as much weight to recall as to precision

\[F_{\beta }=\frac{(1+\beta ^{2})\times precision \times recall}{\beta ^{2}\times precision+recall }\]
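The metrics above can be sketched in a few lines of code. This is a minimal illustration using hypothetical confusion-matrix counts (the TP, FP, FN values are assumptions, not from the slide):

```python
# Precision, recall, and F-beta computed from confusion-matrix counts.
# Hypothetical counts for illustration: 90 true positives, 10 false
# positives, 30 false negatives.

def precision(tp, fp):
    # Of the tuples labeled positive, what fraction are actually positive?
    return tp / (tp + fp)

def recall(tp, fn):
    # Of the actually positive tuples, what fraction were labeled positive?
    return tp / (tp + fn)

def f_beta(p, r, beta=1.0):
    # beta = 1 gives the harmonic mean (F1); beta > 1 weights recall
    # more heavily than precision.
    return (1 + beta**2) * p * r / (beta**2 * p + r)

tp, fp, fn = 90, 10, 30
p = precision(tp, fp)        # 90 / 100 = 0.9
r = recall(tp, fn)           # 90 / 120 = 0.75
f1 = f_beta(p, r)            # 2 * 0.9 * 0.75 / 1.65 ≈ 0.818
f2 = f_beta(p, r, beta=2.0)  # emphasizes recall, so f2 < f1 here
```

Note that the perfect score of 1.0 is reached only when both FP and FN are zero, and the inverse relationship means raising one metric (e.g. by loosening the decision threshold) typically lowers the other.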
