## What is a reasonable way to compare the improvement in accuracy?


Suppose you have a classification task and accuracy is what you care about. Now an old system $s_1$ has an accuracy of $a(s_1) = 0.9$ and a new system $s_2$ has an accuracy of $a(s_2) = 0.99$. This is an absolute improvement of $a(s_2) - a(s_1) = 0.09$ (i.e. 9 percentage points) and a relative improvement of $\frac{a(s_2)-a(s_1)}{a(s_1)} = \frac{0.09}{0.9} = 0.1$.

However, when you now try to get a system with $a(s_3) = 0.999$, this seems to be much more difficult, even though it is an absolute improvement of only $0.009$ and a relative improvement of only $\frac{0.009}{0.99} = 0.00\overline{90}$. So neither the absolute nor the relative difference in accuracy seems to capture this well.
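To make the two upgrades concrete, here is a small sketch (the function name `improvements` is just for illustration) that computes the absolute and relative accuracy improvements from the formulas above for both pairs of systems:

```python
def improvements(a_old, a_new):
    """Return the (absolute, relative) improvement in accuracy,
    i.e. a_new - a_old and (a_new - a_old) / a_old."""
    absolute = a_new - a_old
    relative = absolute / a_old
    return absolute, relative

# The two upgrades from the question: s1 -> s2 and s2 -> s3.
for a_old, a_new in [(0.9, 0.99), (0.99, 0.999)]:
    absolute, relative = improvements(a_old, a_new)
    print(f"{a_old} -> {a_new}: absolute {absolute:.3f}, relative {relative:.4f}")
```

Both measures shrink for the second upgrade, even though it is the harder one in practice, which is exactly the mismatch the question is about.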

Is there another common way to quantify how much better the new system is?

The tag is probably badly chosen, but I have no idea what a better tag would be. – Martin Thoma – 2016-05-12T19:22:05.713