General Architecture for a time series LSTM

Hi All,

I have finished part one of the course and I am just trying a few things out before I move on to part two.

One thing I am trying to do is an LSTM for predicting the 10th float in a sequence, based on the previous 9.

The data looks like this:

[11.0, 11.0, 11.0, 12.0, 11.5, 12.0, 13.0, 13.5, 14.0, 15.5]

I would like to be able to predict 15.5 based on the previous 9 numbers.

There are 18,000 rows similar to this.

So I have a couple of questions:

  1. Is this a suitable problem for an LSTM?
  2. Are there any tutorials out there for Keras that anyone has seen that deal with a similar sort of problem?

I look forward to any feedback.

Kind regards,

Luke

Hi Luke,

I think you will find that the dataset may be too small for an LSTM to learn anything meaningful, but an LSTM implementation can be as simple as just adding an LSTM() layer.

Example from the Keras repo: https://github.com/fchollet/keras/blob/master/examples/imdb_lstm.py

        from keras.models import Sequential
        from keras.layers import Dense, LSTM

        model = Sequential()
        # First layer needs the input shape: 9 timesteps, 1 feature each
        model.add(LSTM(9, input_shape=(9, 1)))
        model.add(Dense(1))
        model.compile(loss='mean_squared_error', optimizer='adam')
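One thing to watch: the LSTM layer expects 3-D input of shape (samples, timesteps, features), so each 10-number row needs splitting into a (9, 1) input window and a scalar target before calling fit(). Here is a minimal sketch of that reshaping with NumPy, using two made-up rows in place of your 18,000:

```python
import numpy as np

# Hypothetical stand-in for the 18,000 rows described above.
rows = np.array([
    [11.0, 11.0, 11.0, 12.0, 11.5, 12.0, 13.0, 13.5, 14.0, 15.5],
    [10.0, 10.5, 10.5, 11.0, 11.0, 11.5, 12.0, 12.0, 12.5, 13.0],
])

X = rows[:, :9].reshape(-1, 9, 1)  # (samples, timesteps=9, features=1)
y = rows[:, 9]                     # the 10th value of each row is the target

print(X.shape, y.shape)  # (2, 9, 1) (2,)
```

With the data in that shape, model.fit(X, y) should work against the layer stack above.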

Good luck!

Jerry


Thanks Jerry, at least I have a start to go from.