I'm trying to create a network to predict a time series of arbitrary length (i.e. time_steps = None). I'm testing different topologies, but I want a 7-neuron input layer (the input time series has 7 dimensions) and a one-neuron output layer (the value to forecast is one-dimensional); between them I'm testing a variable number of LSTM layers, each with a variable number of neurons. I want to use cuDNN (just to be faster), so there are some restrictions on the parameters I can use. Sometimes I get this strange error:
ValueError: Input 0 of layer lstm_1 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 5)
The minimal code to reproduce the problem is given below:
import tensorflow as tf

rnn = tf.keras.models.Sequential()
rnn.add(tf.keras.layers.Input(shape=(1, 7)))
rnn.add(tf.keras.layers.LSTM(5, activation="tanh", return_sequences=False, unroll=False,
                             recurrent_activation='sigmoid', use_bias=True, time_major=True,
                             recurrent_dropout=0, stateful=False, input_shape=(None, 7)))
# The ValueError is raised when this second LSTM is added:
rnn.add(tf.keras.layers.LSTM(5, activation="tanh", return_sequences=False, unroll=False,
                             recurrent_activation='sigmoid', use_bias=True, time_major=True,
                             recurrent_dropout=0, stateful=False, input_shape=(None, 5)))
rnn.add(tf.keras.layers.Dense(1, activation="linear"))
Why does this problem happen? The exact same message is shown if I change the input_shape parameter to (1, None, 5).
CodePudding user response:
So, there are a couple of things going on here. The LSTM layer expects three-dimensional input. In your code you are defining the input as a two-dimensional array of shape (None, 7), where None stands for the batch size. It would suffice to change the input_shape argument to (None, 1, 7), but note that this may mean you need to reshape your input data a little bit.

Also note that the Input layer is designed to be used with the Functional API, not with the Sequential API as you have used in your code.
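As an illustration of that reshaping step, here is a minimal sketch (the array x and its sizes are hypothetical) of adding the missing time axis so the data matches the (None, 1, 7) layout described above:

import numpy as np

# Hypothetical data: 1000 samples with 7 features each, but no time axis yet.
x = np.random.rand(1000, 7)      # shape (1000, 7) -- 2-D

# Insert a time axis of length 1 so every sample becomes a 1-step sequence.
x = x.reshape((-1, 1, 7))        # shape (1000, 1, 7) -- (batch, time_steps, features)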
CodePudding user response:
Change return_sequences to True in the first LSTM layer.
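For illustration, a minimal sketch of the model with that fix applied (layer sizes and the cuDNN-friendly arguments are copied from the question; time_major is left at its batch-major default, which matches data of shape (batch, time, features)):

import tensorflow as tf

rnn = tf.keras.models.Sequential()
# return_sequences=True makes this layer output the whole sequence,
# shape (batch, time_steps, 5), i.e. the 3-D input the next LSTM expects.
rnn.add(tf.keras.layers.LSTM(5, activation="tanh", return_sequences=True,
                             recurrent_activation="sigmoid", use_bias=True,
                             unroll=False, recurrent_dropout=0, stateful=False,
                             input_shape=(None, 7)))
# return_sequences=False: only the last time step is emitted, shape (batch, 5).
rnn.add(tf.keras.layers.LSTM(5, activation="tanh", return_sequences=False,
                             recurrent_activation="sigmoid", use_bias=True,
                             unroll=False, recurrent_dropout=0, stateful=False))
rnn.add(tf.keras.layers.Dense(1, activation="linear"))
rnn.summary()

With this change the second LSTM receives ndim=3 input and the ValueError goes away; the tanh/sigmoid activations, use_bias=True, unroll=False and recurrent_dropout=0 keep the layers eligible for the fast cuDNN kernel.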