I have been trying to add a tfa metric to my model.compile() call so it is tracked throughout training. However, when I add the R2 metric, I get the following error. I thought y_shape=(1,) would fix this, but it did not.
ValueError: Dimension 0 in both shapes must be equal, but are 1 and 5. Shapes are [1] and [5]. for '{{node AssignAddVariableOp_8}} = AssignAddVariableOp[dtype=DT_FLOAT](AssignAddVariableOp_8/resource, Sum_6)' with input shapes: [], [5].
My code is shown below:
import tensorflow as tf
import tensorflow_addons as tfa
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Input, Normalization
from tensorflow.keras.metrics import MeanAbsoluteError, MeanSquaredError
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.regularizers import l2

model = Sequential()
model.add(Input(shape=(4,)))
model.add(Normalization())
model.add(Dense(5, activation="relu", kernel_regularizer=l2(l2=1e-2)))
print(model.summary())

opt = Adam(learning_rate=1e-2)
model.compile(loss="mean_squared_error",
              optimizer=opt,
              metrics=[MeanSquaredError(name="mse"),
                       MeanAbsoluteError(name="mae"),
                       tfa.metrics.RSquare(name="R2", y_shape=(1,))])

history = model.fit(x=training_x,
                    y=training_y,
                    epochs=10,
                    batch_size=64,
                    validation_data=(validation_x, validation_y))
Any help is greatly appreciated! Note: I also tried changing y_shape to (5,), but then I get the error that the dimensions are not equal but are 5 and 1...
CodePudding user response:
Your targets hold a single value per sample, but your model's last layer is Dense(5), so its predictions have shape (None, 5). The RSquare metric created its running sums with shape (1,) (from y_shape=(1,)), while the sums computed from the 5-dimensional predictions have shape (5,), which is the [1] vs. [5] mismatch in the error. You need to add a single-unit output layer to your model:
model.add(Dense(1))
Then your model looks like this:
model = Sequential()
model.add(Input(shape=(4,)))
model.add(Normalization())
model.add(Dense(5, activation="relu", kernel_regularizer=l2(l2=1e-2)))
model.add(Dense(1))
print(model.summary())
Output:
Model: "sequential_10"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
normalization_10 (Normaliza (None, 4) 9
tion)
dense_12 (Dense) (None, 5) 25
dense_13 (Dense) (None, 1) 6
=================================================================
Total params: 40
Trainable params: 31
Non-trainable params: 9
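With the single-unit output layer, the model's predictions have shape (None, 1), which matches the targets and the metric's y_shape=(1,), so your original compile call works unchanged. Below is a minimal end-to-end sketch; the random numpy arrays are placeholders only to illustrate the shapes, so substitute your own training_x and training_y:

import numpy as np

# Placeholder data: 4 input features per sample, one target value per sample.
training_x = np.random.rand(256, 4).astype("float32")
training_y = np.random.rand(256, 1).astype("float32")

opt = Adam(learning_rate=1e-2)
model.compile(loss="mean_squared_error",
              optimizer=opt,
              metrics=[MeanSquaredError(name="mse"),
                       MeanAbsoluteError(name="mae"),
                       tfa.metrics.RSquare(name="R2", y_shape=(1,))])

# Predictions are (batch, 1) and targets are (batch, 1), so RSquare's
# shape-(1,) accumulators no longer conflict with the per-batch sums.
history = model.fit(training_x, training_y, epochs=10, batch_size=64)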