Do I need to engineer lagged features when creating an LSTM for time series forecasting?


Long short-term memory networks are fairly complicated and I haven't completely wrapped my head around them.
It seems to me that the big gain of LSTMs for time series forecasting is that you don't need to engineer lagged features: the network determines on its own which lagged information is actually significant and remembers it for the next timestep(s).
Should one still create lagged features as inputs for the timesteps when training an LSTM? For example, the output of the last timestep, means and medians of the preceding timesteps, means and medians for a specific class, distances, differences, etc.?

Denny


Answers


TL;DR

No, you don't have to include lagged variables when using an LSTM.

Long Answer

In an LSTM architecture, each unit has not only an input gate and an output gate but also a forget gate, and together these gates let the cell state retain or discard information over many timesteps. So you don't have to include lagged variables in your feature set: you can expect the LSTM to discover the relevant time-dependent relationships on its own. This holds not only for LSTMs but also for other recurrent architectures.
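
As a minimal sketch of that idea (assuming a Keras/TensorFlow setup; the synthetic series, window length, and layer sizes here are arbitrary choices for illustration), you can feed raw sliding windows of the series to the LSTM and let it learn the temporal dependencies itself, rather than hand-engineering lag features:

```python
# Minimal sketch: one-step-ahead forecasting from raw windows, no engineered lags.
# Assumes TensorFlow/Keras is installed; the data below is synthetic.
import numpy as np
import tensorflow as tf

# Toy univariate series: a noisy sine wave standing in for real data.
series = np.sin(np.arange(1000) * 0.1) + np.random.normal(0.0, 0.1, 1000)

# Build (samples, timesteps, features) windows from the raw series:
# just the last 30 raw values, no means, medians, or differences.
window = 30
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape: (samples, 30, 1)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),   # the gates learn which past steps matter
    tf.keras.layers.Dense(1),   # one-step-ahead forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```

If an engineered feature such as a rolling mean really does carry a strong, known signal, you can still append it as an extra input channel, but the point is that the recurrent layer can learn such relationships from the raw sequence on its own.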

georg-un
