What is a reasonable way to compare the improvement in accuracy?

2

Suppose you have a classification task and accuracy is what you care about. An old system $s_1$ has an accuracy of $a(s_1) = 0.9$ and a new system $s_2$ has an accuracy of $a(s_2) = 0.99$. This is an absolute improvement of $a(s_2) - a(s_1) = 0.09$ (i.e. 9 percentage points) and a relative improvement of $\frac{a(s_2)-a(s_1)}{a(s_1)} = \frac{0.09}{0.9} = 0.1$.

However, when you now try to build a system with $a(s_3) = 0.999$, this turns out to be much more difficult, even though it is only an absolute improvement of $0.009$ and a relative improvement of $\frac{0.009}{0.99} = 0.00\overline{90}$. So neither the absolute nor the relative difference in accuracy seems to capture this well.
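For concreteness, here is a minimal sketch of the two comparisons above in Python (the function names are mine, chosen just for illustration):

```python
def absolute_improvement(a_old: float, a_new: float) -> float:
    """Plain difference in accuracy, e.g. 0.99 - 0.9 = 0.09."""
    return a_new - a_old


def relative_improvement(a_old: float, a_new: float) -> float:
    """Improvement relative to the old accuracy, e.g. 0.09 / 0.9 = 0.1."""
    return (a_new - a_old) / a_old


# Values are approximate due to floating-point rounding.
print(absolute_improvement(0.9, 0.99))    # ~0.09
print(relative_improvement(0.9, 0.99))    # ~0.1
print(absolute_improvement(0.99, 0.999))  # ~0.009
print(relative_improvement(0.99, 0.999))  # ~0.00909...
```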

Is there a common other way to quantify how much better the system is?

Martin Thoma

Posted 2016-05-12T19:21:37.730

Reputation: 15 590

The tag is probably a really bad choice, but I have no idea what a better one would be. – Martin Thoma – 2016-05-12T19:22:05.713

Answers

1

In high-accuracy regimes (> 0.9), I look at the error rate -- the complement of the accuracy -- or its reciprocal, the mean time between errors. This captures the intuition that an accuracy of 0.99 is ten times as "good" as an accuracy of 0.9.
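A minimal sketch of this idea, assuming errors are counted per prediction (so "time" here is really the number of samples between errors); the helper names are my own:

```python
def error_rate(accuracy: float) -> float:
    """Complement of the accuracy: the fraction of misclassified examples."""
    return 1.0 - accuracy


def mean_samples_between_errors(accuracy: float) -> float:
    """Reciprocal of the error rate: expected number of predictions per error."""
    return 1.0 / error_rate(accuracy)


# Accuracy 0.9  -> error rate 0.1   -> ~10 predictions per error.
# Accuracy 0.99 -> error rate 0.01  -> ~100 predictions per error,
# i.e. ten times "better" by this measure.
for acc in (0.9, 0.99, 0.999):
    print(acc, error_rate(acc), mean_samples_between_errors(acc))
```

By this measure, going from 0.99 to 0.999 cuts the error rate by another factor of ten, which matches the intuition that it is a much bigger step than the absolute or relative accuracy difference suggests.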

Emre

Posted 2016-05-12T19:21:37.730

Reputation: 9 953