I'm a little bit new to machine learning.

I am using a neural network to classify images. There are two possible classes. I am using `Sigmoid` activation at the last layer, so the scores of images are between 0 and 1.

I expected the scores to sometimes be close to 0.5 when the neural net is not sure about the class of the image, but all scores are either 1.0000000e+00 (due to rounding, I guess) or very close to zero (for example, 2.68440009e-15). In general, is that a good or a bad thing? I have the feeling it's not. If it is not, why, and how can it be avoided?

In my use case I wanted to optimize for recall by manually requiring the score for classifying an image as belonging to class 1 to be greater than 0.6 or 0.7 instead of 0.5, but this has no impact because of what I described above.
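To make the thresholding idea concrete, here is a minimal sketch of moving the decision threshold above 0.5 (the score values are made up for illustration). It only changes predictions if the sigmoid outputs actually spread across (0, 1); saturated scores that are all near 0 or 1 make the threshold irrelevant, which is exactly the problem described above.

```python
import numpy as np

# Hypothetical sigmoid outputs from the network (illustrative values only).
scores = np.array([0.02, 0.55, 0.65, 0.97])

# Raising the threshold above 0.5 trades some positive predictions away;
# lowering it would instead favor recall on class 1.
threshold = 0.6
predictions = (scores >= threshold).astype(int)
print(predictions)  # [0 0 1 1]
```

With the default threshold of 0.5, the second score (0.55) would have been classified as class 1; raising the threshold to 0.6 flips it to class 0.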

More generally, how can I minimize the number of false negatives when, during training, the neural net only cares about a loss that is not tailored to this goal? I am OK with decreasing accuracy a little bit to increase recall.
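One common way to bias training toward recall is to weight the positive class more heavily in the loss itself. Below is a sketch of a class-weighted binary cross entropy in NumPy; the function name and the `pos_weight=2.0` value are illustrative assumptions, not a recommendation (frameworks offer equivalents, e.g. class weights in the fit call or a weighted-cross-entropy loss).

```python
import numpy as np

def weighted_bce(y_true, y_pred, pos_weight=2.0, eps=1e-7):
    """Binary cross entropy with a larger penalty on positive-class errors.

    pos_weight > 1 makes false negatives cost more than false positives,
    nudging the model toward higher recall at some cost in precision.
    (pos_weight=2.0 is an illustrative value, to be tuned on validation data.)
    """
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    loss = -(pos_weight * y_true * np.log(y_pred)
             + (1 - y_true) * np.log(1 - y_pred))
    return loss.mean()

y_true = np.array([1.0, 0.0])
y_pred = np.array([0.3, 0.3])
# The miss on the positive example (score 0.3 for a true 1) is penalized
# twice as heavily as the same-sized error on the negative example.
print(weighted_bce(y_true, y_pred))
```

With `pos_weight=1.0` this reduces to standard binary cross entropy.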

I am just using sigmoid activation, but I am indeed using binary cross entropy. I used a confusion matrix, and with only two classes it's easy to know which classes are mixed up ;) Would you recommend changing my loss so that the distance to the right answer is taken into account, and I can manually set a threshold to improve either precision or recall? – Louis – 2018-03-09T18:35:37.253

I missed the fact that you only have 2 classes. In that case, the information about how "wrong" the prediction is (the weight for the wrong class) is mirrored by how "right" it is (the weight for the correct class). Binary cross entropy loss comes down to -log(p), with p being the predicted probability for the correct class. The smaller p is, the larger the loss. The distance to the right class is already taken into account. Both classes are treated equally here, but it sounds like for you they are not: you would prefer more predictions for one class to avoid false negatives, even if it hurts accuracy, right? – Gegenwind – 2018-03-10T07:28:47.890
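The symmetry described in this comment can be checked numerically with a quick sketch of standard binary cross entropy (this is just the textbook formula, written out for illustration):

```python
import numpy as np

def bce(y_true, p):
    # Standard binary cross entropy for a single prediction:
    # -log(p) when y_true == 1, -log(1 - p) when y_true == 0.
    p = np.clip(p, 1e-7, 1 - 1e-7)  # avoid log(0)
    return -(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

# The loss depends only on the probability assigned to the correct class:
print(bce(1, 0.9))  # small loss: confident and right
print(bce(1, 0.1))  # large loss: confident and wrong
print(bce(0, 0.1))  # identical to bce(1, 0.9), by symmetry
```

So being "10% away from class 1" and "10% away from class 0" incur exactly the same loss, which is why the distance to the right class is already encoded.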

Thank you very much for your answers :) Yes, exactly, you summed it up perfectly. – Louis – 2018-03-10T10:36:34.567