Incompatible shapes Mean Squared Error Keras


I want to train an RNN with Keras. The shape of X is (4413, 71, 19), while the shape of y is (4413, 2).

Code

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, BatchNormalization, Dense

model = Sequential()
model.add(LSTM(128, return_sequences=True, input_shape=(None,19)))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(LSTM(128, return_sequences=True, input_shape=(None,19)))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(LSTM(128, return_sequences=True, input_shape=(None,19)))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(Dense(32, activation='relu'))
model.add(Dropout(.2))

model.add(Dense(2, activation='softmax'))

model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mean_squared_error'])

When I fit the model I get the following error; it seems that the loss function can't handle data shaped like this:

Incompatible shapes: [64,2] vs. [64,71,2]
     [[{{node mean_squared_error/SquaredDifference}}]] [Op:__inference_train_function_157671]
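The mismatch is also visible from the model's output shape; a quick check (a sketch, assuming TensorFlow 2.x Keras and the X and y described above):

print(model.output_shape)   # (None, None, 2): a 2-vector per timestep
print(y.shape)              # (4413, 2): a single 2-vector per sample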

CodePudding user response:

Try setting the return_sequences parameter of the last LSTM layer to False:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, BatchNormalization, Dense

model = Sequential()
model.add(LSTM(128, return_sequences=True, input_shape=(None,19)))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(LSTM(128, return_sequences=True))
model.add(Dropout(.2))
model.add(BatchNormalization())

# return_sequences=False: only the last timestep's output is passed on,
# so the Dense layers produce one prediction per sample instead of one per timestep
model.add(LSTM(128, return_sequences=False))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(Dense(32, activation='relu'))
model.add(Dropout(.2))

# linear activation for real-valued targets trained with mean squared error
model.add(Dense(2, activation='linear'))

model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mean_squared_error'])
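With this change the network emits a single 2-vector per sample, matching the shape of y. As a quick sanity check, you can fit on random dummy data of the shapes from the question (a sketch, assuming TensorFlow 2.x; the arrays here are placeholders, not your real data):

import numpy as np

X = np.random.random((4413, 71, 19)).astype("float32")
y = np.random.random((4413, 2)).astype("float32")

print(model.output_shape)   # (None, 2): one prediction per sample
model.fit(X, y, batch_size=64, epochs=1)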

I have also changed the activation function of the output layer to linear, since a softmax output (which constrains the two values to a probability distribution summing to 1) does not make much sense for real-valued targets trained with mean squared error.
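If, on the other hand, the two columns of y are actually one-hot class labels (an assumption about your data, not something stated in the question), then softmax would be the right choice, but paired with cross-entropy rather than MSE. A sketch of the two variants for the output layer and compile step:

# Regression: real-valued targets, as in the answer above
model.add(Dense(2, activation='linear'))
model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mean_squared_error'])

# Classification: rows of y are one-hot over two classes (assumption)
# model.add(Dense(2, activation='softmax'))
# model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])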
