I'm training a neural network on an *'easy'* dataset with ~15k examples.
The network overfits pretty fast.

The thing I cannot understand is that after the 5th epoch the validation loss starts to worsen, while precision and recall continue to improve for about 10 more epochs. (The loss is binary cross-entropy.)
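For what it's worth, this divergence is possible in principle: thresholded metrics only care about which side of 0.5 a prediction falls on, while cross-entropy also penalizes confidence. A toy sketch with made-up numbers (not the actual model's outputs) shows loss rising while accuracy improves, because a single wrong prediction becomes very confident:

```python
import numpy as np

def bce(y, p, eps=1e-7):
    """Binary cross-entropy averaged over examples."""
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

y = np.array([1, 1, 1, 1, 0, 0, 0, 0])

# "Epoch A": two borderline mistakes, all predictions mildly confident.
p_a = np.array([0.6, 0.6, 0.45, 0.45, 0.4, 0.4, 0.4, 0.4])
# "Epoch B": both mistakes fixed, but one negative is now predicted
# 0.99 -- that single -log(0.01) term dominates the averaged loss.
p_b = np.array([0.8, 0.8, 0.55, 0.55, 0.2, 0.2, 0.2, 0.99])

acc_a = np.mean((p_a >= 0.5) == y)   # 0.75
acc_b = np.mean((p_b >= 0.5) == y)   # 0.875 -- accuracy improved

print(f"A: loss={bce(y, p_a):.3f} acc={acc_a:.3f}")
print(f"B: loss={bce(y, p_b):.3f} acc={acc_b:.3f}")  # loss went UP
```

So a handful of increasingly overconfident errors can drag the loss up even as the decision boundary keeps getting better.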

Diving deeper:

I've checked whether there are a lot of predictions around probability ~0.5, but there are not:

Also, here is a plot of the percentage of correct predictions as a function of prediction probability. There is some pattern here, but the number of elements per bin is too small to draw conclusions.
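In case it helps reproduce the diagnostics, both checks can be done in a few lines. This is a sketch using randomly generated stand-ins (`probs`, `labels` are hypothetical, not my real validation arrays):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-ins for the real validation outputs:
# predicted probabilities and true 0/1 labels.
probs = rng.beta(0.5, 0.5, size=2000)
labels = (rng.random(2000) < probs).astype(int)

# Check 1: share of predictions in the uncertain band around 0.5.
near_half = np.mean((probs > 0.4) & (probs < 0.6))
print(f"share of predictions in (0.4, 0.6): {near_half:.3f}")

# Check 2: percent of correct predictions per prediction-probability bin.
bins = np.linspace(0.0, 1.0, 11)
idx = np.minimum(np.digitize(probs, bins) - 1, 9)
for b in range(10):
    mask = idx == b
    if mask.any():
        acc = np.mean((probs[mask] >= 0.5) == labels[mask])
        print(f"[{bins[b]:.1f}, {bins[b + 1]:.1f}): "
              f"n={mask.sum():4d} acc={acc:.2f}")
```

The per-bin `n` counts make it easy to see which bins are too sparse to trust.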

So, my question is: why can this happen, and what can I do about it?

I disagree, since my values have also been padded with zeroes, so the real loss is bigger. Also, if I run this simulation multiple times, the trend continues to hold. – Vadym B. – 2018-06-05T14:58:10.677