## LSTM regression bias increases as targets approach 0


I've built an LSTM model for time series forecasting. The results are not bad, with a mean normalized error of 7%. However, the normalized bias shows a clear pattern: the closer the true value is to 0, the higher the bias, as depicted in the figure below:

NOTE 1: True and Pred values are scaled in the figure for clarity

NOTE 2: The bias is not constant; the un-normalized (absolute) bias shows the same pattern
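To make the metric concrete, this is roughly how the normalized bias above can be computed (a minimal sketch with made-up numbers, since the actual data is not shown; note how dividing by small targets amplifies even a constant absolute bias):

```python
import numpy as np

# Hypothetical true/predicted values; the real series is not shown in the question.
y_true = np.array([10.0, 5.0, 1.0, 0.1])
y_pred = np.array([10.4, 5.4, 1.4, 0.5])

bias = y_pred - y_true               # un-normalized (absolute) bias: constant 0.4 here
norm_bias = bias / np.abs(y_true)    # normalized bias: grows as y_true approaches 0

print(norm_bias)
```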

The model:

* 1 LSTM layer with 256 hidden units and no peephole connections
* Dropout in the LSTM with keep_prob = 0.8
* One Dense layer after the LSTM with 128 units and ReLU activation
* One Dense layer after the first dense layer with 1 unit (the predictor)
* Adam optimizer, with learning rate = 0.001
* Loss: Mean Squared Error
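For reference, the architecture above can be sketched in Keras roughly as follows (this is my reconstruction, not the asker's actual code; the input shape of 30 timesteps with 1 feature is a hypothetical placeholder, and `keep_prob = 0.8` translates to `dropout = 0.2`):

```python
import tensorflow as tf

# Sketch of the described model; input shape is a made-up example.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(30, 1)),         # hypothetical: 30 timesteps, 1 feature
    tf.keras.layers.LSTM(256, dropout=0.2),       # keep_prob = 0.8  ->  dropout = 0.2
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1),                     # linear output for regression
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    loss="mse",
)
```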


Any hints?

The True values are not visible. Are they nonnegative? – Sean Owen – 2020-05-09T01:31:39.973