I am trying to implement a custom loss function for a Keras LSTM that computes a masked MAE (mask_MAE):
from tensorflow.keras import backend as K

def mask_MAE(y_true, y_pred, mask):  # mask entries are 0 or 1
    mae = K.abs(y_pred - y_true) * mask
    return K.sum(mae) / K.sum(mask)
CodePudding user response:
I found an answer to my own question. I am working with an LSTM and 80 is num_steps. The trick is to append the mask indicator (u_out) to y_true, so the loss function receives both the targets and the mask even though Keras only passes it y_true and y_pred:
import tensorflow as tf

def GBVPP_loss(y_true, y_pred, cols=80):
    # y_true carries both targets and mask: the first `cols` columns are the
    # targets, the remaining `cols` columns are the u_out indicator.
    u_out = y_true[:, cols:]
    y = y_true[:, :cols]
    w = 1 - u_out  # weight: 1 where the step counts, 0 where it is masked out
    mae = w * tf.abs(y - y_pred)
    return tf.reduce_sum(mae, axis=-1) / tf.reduce_sum(w, axis=-1)
...
history = model.fit(X_train, np.append(y_train, u_out_train, axis=1),
                    validation_data=(X_valid, np.append(y_valid, u_out_valid, axis=1)),
                    epochs=EPOCH, batch_size=BATCH_SIZE,
                    verbose=0,
                    callbacks=[lr])
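For completeness, a minimal sketch of how the model could be compiled with this loss; the "adam" optimizer here is my assumption and is not part of the original answer:

# Sketch only: the optimizer choice is an assumption, any Keras optimizer works.
# GBVPP_loss can be passed directly because its extra `cols` argument has a default.
model.compile(optimizer="adam", loss=GBVPP_loss)

Note that the model's output is expected to have cols columns, while the target array passed to fit has 2*cols columns (targets concatenated with u_out).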
CodePudding user response:
A custom Keras loss function is called with only two arguments: y_true and y_pred. So you cannot pass that extra mask parameter the way you have done in your code.
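If the mask is known before compiling (for example, a fixed per-timestep mask), one common workaround is to close over it so the function Keras actually calls still takes only y_true and y_pred. This is a sketch under that assumption; make_mask_mae and my_mask are hypothetical names, not from the original question:

import tensorflow as tf
from tensorflow.keras import backend as K

def make_mask_mae(mask):
    # `mask` is a fixed 0/1 array known before compiling the model.
    mask = tf.constant(mask, dtype="float32")

    def mask_MAE(y_true, y_pred):
        # Same masked MAE as in the question, but the mask comes from the closure.
        mae = K.abs(y_pred - y_true) * mask
        return K.sum(mae) / K.sum(mask)

    return mask_MAE

# Usage (hypothetical): model.compile(optimizer="adam", loss=make_mask_mae(my_mask))

Otherwise, appending the mask to y_true, as in the answer above, is the usual way to make a sample-dependent mask available inside the loss.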