I have daily time series data spanning about 6 years (~1.8k data points). I am trying to forecast the next 30 values (t+1 … t+30). The training inputs (X) are sequences of the previous 30 days' values, and the training target (Y) is the 31st day's value for each such sequence.

I forecast with the following procedure: Y for t+1 is predicted first; then the X row is shifted left by one day, the forecasted Y is appended to its end, and this new row is used to predict t+2, and so on. However, after (usually) about t+3 the forecasted values become constant for the rest of the t+n horizon.
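The recursive loop described above can be sketched in plain NumPy; here `predict` stands in for any trained one-step model (e.g. `model.predict` in Keras), and the function name is just illustrative:

```python
import numpy as np

def recursive_forecast(predict, window, n_steps):
    """Roll a one-step forecaster forward n_steps times.

    predict : callable mapping a (1, window_len) array to a scalar forecast
    window  : 1-D array holding the last window_len observed values
    """
    window = np.asarray(window, dtype=float).copy()
    preds = []
    for _ in range(n_steps):
        y_hat = float(predict(window[None, :]))        # forecast t+1
        preds.append(y_hat)
        # shift the window left by one day and append the forecast
        window = np.concatenate([window[1:], [y_hat]])
    return np.array(preds)
```

Note that from t+2 onward the model is fed its own outputs, so any bias in the one-step prediction compounds, which is one common reason recursive forecasts flatten out.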

Is this the correct way to forecast time series with LSTMs?

How can this behaviour be explained?

Will this be the case even for a time series with great seasonality?

Is this behaviour expected even for a very large time series data?

Should I instead train the network with a Y matrix containing the 31st- to 60th-day values for the same X?
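That alternative (direct multi-step training) would mean building multi-output targets instead of a single next-day value. A minimal sketch of the windowing, with hypothetical function and parameter names:

```python
import numpy as np

def make_multistep_windows(series, window=30, horizon=30):
    """Build (X, Y) where each X row is `window` past values and the
    matching Y row is the next `horizon` values (days 31..60 here)."""
    series = np.asarray(series, dtype=float)
    n = len(series) - window - horizon + 1
    X = np.stack([series[i:i + window] for i in range(n)])
    Y = np.stack([series[i + window:i + window + horizon] for i in range(n)])
    return X, Y
```

The output layer would then need `horizon` units (e.g. `Dense(30)`) rather than one, and the whole 30-day block is predicted in a single forward pass, avoiding the feedback of predictions into inputs.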

My train data looks something like this:

```
array([[-0.35811423, -0.22393472, -0.39437897, ..., -0.36718042, -0.37080689, -0.35267452],
       [-0.22393472, -0.39437897, -0.13327289, ..., -0.37080689, -0.35267452, -0.2030825 ],
       [-0.39437897, -0.13327289, -0.1532185 , ..., -0.35267452, -0.2030825 , -0.25294651],
       ...])
```
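For reference, a sliding-window construction that produces rows shaped like the data above (one-step-ahead targets; the helper name is illustrative):

```python
import numpy as np

def make_windows(series, window=30):
    """Each X row is `window` consecutive values; y is the value that
    immediately follows each window (the 31st day)."""
    series = np.asarray(series, dtype=float)
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y
```

Note that consecutive rows overlap in all but one position, which is why each row of the matrix above repeats its neighbour shifted by one.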

Architecture: an input LSTM layer (20 units), one hidden LSTM layer (20 units), and one output Dense layer, with batch size 1. The LSTMs are stateful, and I reset the model states after each epoch.
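A minimal Keras sketch of that stack (this is an assumed reconstruction, not the asker's actual code; the lookback length of 30 comes from the question):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

window = 30  # lookback length assumed from the question

model = Sequential([
    # input LSTM layer; stateful layers require a fixed batch size (here 1)
    LSTM(20, batch_input_shape=(1, window, 1), stateful=True,
         return_sequences=True),
    LSTM(20, stateful=True),   # hidden LSTM layer
    Dense(1),                  # one-step output
])
model.compile(optimizer="adam", loss="mse")

# as described in the question, states are cleared between epochs:
# model.reset_states()
```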

What library are you using? Usually, yes, you can do exactly what you mentioned in terms of a step-by-step forecast, as opposed to a single 30-day forecast. – Landmaster – 2017-08-08T04:05:11.383

Keras. But I am getting constants with the step-by-step forecast. – Narahari B M – 2017-08-08T09:01:58.840

Do you mind sharing your code and data on GitHub or somewhere else? – Louis T – 2017-11-08T01:28:21.387