LSTM with different timestep when predicting


Say I have time-series data of shape (80, 27), i.e. 80 timesteps and 27 features. I want to train the following network so that it predicts each timestep separately rather than all 80 timesteps together, because at prediction time my input has shape (1, 27) at each timestep t, until I reach t = 80. So I need a way to predict 80 individual (1, 27) timesteps without losing backpropagation through time during training.

Any suggestions?

from tensorflow import keras
from tensorflow.keras import layers

def Model():
    inputs = layers.Input(shape=(80, 27))

    # Convolutional front end followed by a stack of bidirectional LSTMs
    x = layers.Conv1D(64, kernel_size=5, activation="relu", padding="same")(inputs)
    x = layers.Bidirectional(layers.LSTM(256, return_sequences=True))(x)
    x = layers.Bidirectional(layers.LSTM(256, return_sequences=True))(x)
    x = layers.Bidirectional(layers.LSTM(256, return_sequences=True))(x)
    x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)
    x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)
    x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)

    # Per-timestep dense head, producing 2 values per timestep
    x = layers.Dense(512, activation="selu")(x)
    x = layers.Dense(256, activation="selu")(x)
    x = layers.Dense(2)(x)

    return keras.Model(inputs=inputs, outputs=x)

CodePudding user response:

So there are two different questions to answer:

  • If you want a variable number of timesteps, simply set that dimension of the input shape to None; that is, inputs = layers.Input(shape=(None, 27)). The RNN will then be "unrolled" dynamically at training and testing time. (There might be some performance degradation, but that is the price to pay.) See the first sketch after this list.
  • Both during training and prediction you still need to pass all of the timesteps of the time series, because you have a Bidirectional RNN in this case. For a one-directional RNN, I seem to remember there is an option that lets you make invocations stateful by explicitly passing the previous context as input and having the updated state returned as an output; see the second sketch after this list.
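
A minimal sketch of the first point, assuming TensorFlow's Keras API as in the question (the model is trimmed to a single Bidirectional layer for brevity, and variable_length_model is my own name):

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

def variable_length_model():
    # The time dimension is None, so the model accepts sequences of any length.
    inputs = layers.Input(shape=(None, 27))
    x = layers.Conv1D(64, kernel_size=5, activation="relu", padding="same")(inputs)
    x = layers.Bidirectional(layers.LSTM(256, return_sequences=True))(x)
    x = layers.Dense(2)(x)
    return keras.Model(inputs=inputs, outputs=x)

model = variable_length_model()
model(tf.zeros((1, 80, 27)))  # a full 80-timestep sequence
model(tf.zeros((1, 10, 27)))  # a shorter sequence, same weights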
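And a sketch of the second point: a one-directional LSTM that exposes its states through the standard return_state and initial_state arguments, so that prediction can consume one (1, 27) frame at a time while carrying the context forward. The names step_model, head, and units, and the zero placeholder frames, are mine, not from the question:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

units = 256
frame_in = layers.Input(shape=(1, 27))  # one timestep of 27 features
h_in = layers.Input(shape=(units,))     # previous hidden state
c_in = layers.Input(shape=(units,))     # previous cell state

lstm = layers.LSTM(units, return_sequences=True, return_state=True)
seq, h_out, c_out = lstm(frame_in, initial_state=[h_in, c_in])
head = layers.Dense(2)
out = head(seq)

step_model = keras.Model([frame_in, h_in, c_in], [out, h_out, c_out])

# Prediction loop: feed the frames one at a time, threading the state through.
h = tf.zeros((1, units))
c = tf.zeros((1, units))
for t in range(80):
    frame = tf.zeros((1, 1, 27))  # placeholder for the real frame at step t
    y, h, c = step_model([frame, h, c])

For training, the same lstm and head layers can be applied to the full (80, 27) sequence in a second model that shares their weights, so backpropagation through time is not lost; the step-by-step wrapper above is only needed at prediction time.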