Keras Custom loss Penalize more when actual and prediction are on opposite sides of Zero

Time:11-02

I'm training a model to predict the percentage change in prices. Both MSE and RMSE give me up to 99% accuracy, but when I check how often the actual and predicted values point in the same direction ((actual > 0 and pred > 0) or (actual < 0 and pred < 0)), I get about 49%.
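The direction check described above can be sketched like this (a minimal NumPy example with made-up values, not the asker's data):

```python
import numpy as np

actual = np.array([0.1, -0.2, 0.3, -0.4])
pred = np.array([0.05, 0.1, 0.15, -0.3])

# A sample counts as "same direction" when actual and prediction
# fall on the same side of zero.
same_direction = (actual > 0) == (pred > 0)
directional_accuracy = same_direction.mean()
print(directional_accuracy)  # 0.75: the second sample points the wrong way
```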

How do I define a custom loss that penalizes opposite directions very heavily? I'd also like to add a slight penalty for when the prediction exceeds the actual in a given direction.

So

  • actual = 0.1 and pred = -0.05 should be penalized a lot more than actual = 0.1 and pred = 0.05;
  • actual = 0.1 and pred = 0.15 should get slightly more penalty than actual = 0.1 and pred = 0.05.

CodePudding user response:

I will leave it up to you to define your exact logic, but here is how you can implement what you want with tf.cond:

import tensorflow as tf

y_true = [[0.1]]
y_pred = [[0.05]]
mse = tf.keras.losses.MeanSquaredError()

def custom_loss(y_true, y_pred):
  penalty = 20

  # Case 1: actual = 0.1 and pred = -0.05 is penalized a lot more than
  # actual = 0.1 and pred = 0.05.
  loss = tf.cond(tf.logical_and(tf.greater(y_true, 0.0), tf.less(y_pred, 0.0)),
                 lambda: mse(y_true, y_pred) * penalty,
                 lambda: mse(y_true, y_pred) * penalty / 4)

  # Case 2: actual = 0.1 and pred = 0.15 gets slightly more penalty than
  # actual = 0.1 and pred = 0.05.
  # Note: tf.cond needs a scalar predicate, so this works on one sample at a
  # time; for batched training you would vectorize the conditions instead.
  loss = tf.cond(tf.greater(y_pred, y_true),
                 lambda: loss * penalty / 2,
                 lambda: loss * penalty / 3)
  return loss

print(custom_loss(y_true, y_pred))
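Since tf.cond expects a scalar predicate, the version above only handles one sample at a time. For batched training, an elementwise alternative is to build a per-sample weight with tf.where. This is a sketch following the same penalty idea; the specific weight values and the use of magnitudes to detect overshoot are assumptions to be tuned:

```python
import tensorflow as tf

def custom_loss_batched(y_true, y_pred, penalty=20.0):
  y_true = tf.cast(y_true, tf.float32)
  y_pred = tf.cast(y_pred, tf.float32)
  sq_err = tf.square(y_true - y_pred)

  # Heavy weight when actual and prediction have opposite signs,
  # light weight otherwise.
  opposite = tf.not_equal(tf.sign(y_true), tf.sign(y_pred))
  weight = tf.where(opposite, penalty, penalty / 4.0)

  # Slight extra weight when the prediction overshoots the actual
  # (larger magnitude), per the asker's second requirement.
  overshoot = tf.greater(tf.abs(y_pred), tf.abs(y_true))
  weight *= tf.where(overshoot, 1.5, 1.0)

  return tf.reduce_mean(weight * sq_err)

# Sample 1 points the wrong way, sample 2 is merely short of the actual.
loss = custom_loss_batched([[0.1], [0.1]], [[-0.05], [0.05]])
print(float(loss))
```

Because every operation here is elementwise, this variant works for any batch size and stays differentiable, so it can be passed directly as `loss=custom_loss_batched` to `model.compile`.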