Will defining a loss in metrics affect the model training?


I want my model to report multiple loss values, but when I add them to the loss parameter they don't show up during training (only a single combined loss is shown, not each individual loss value):

dice_loss = sm.losses.DiceLoss() 
focal_loss = sm.losses.BinaryFocalLoss()
total_loss = dice_loss + (1 * focal_loss)
optim = tf.keras.optimizers.Adam(LR)
loss = [total_loss, sm.losses.BinaryFocalLoss(), sm.losses.DiceLoss(), sm.losses.JaccardLoss()]
metrics = [sm.metrics.IOUScore(threshold=0.5), sm.metrics.FScore()]

model.compile(optimizer = optim, loss=loss, metrics=metrics)

So, to show each loss, I added the loss functions to the metrics instead, like this:

dice_loss = sm.losses.DiceLoss() 
focal_loss = sm.losses.BinaryFocalLoss()
total_loss = dice_loss + (1 * focal_loss)
optim = tf.keras.optimizers.Adam(LR)
loss = [total_loss]
metrics = [sm.metrics.IOUScore(threshold=0.5), sm.metrics.FScore(), sm.losses.BinaryFocalLoss(), sm.losses.DiceLoss(), sm.losses.JaccardLoss()]

model.compile(optimizer = optim, loss=loss, metrics=metrics)

Is it okay to add loss functions to the metrics like this, or will it affect the training process? If it does, is there any way to show each loss value without affecting training?

CodePudding user response:

Yes, this is fine. It will not change the training process: metrics are only used for monitoring, not for computing the gradients that train the model.
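
For example, here is a minimal sketch (assuming the same segmentation_models / tf.keras setup as in the question, with LR and model already defined elsewhere) that optimizes only total_loss while logging each individual loss as a metric:

import segmentation_models as sm
import tensorflow as tf

dice_loss = sm.losses.DiceLoss()
focal_loss = sm.losses.BinaryFocalLoss()
total_loss = dice_loss + (1 * focal_loss)    # only this loss drives the gradients

model.compile(
    optimizer=tf.keras.optimizers.Adam(LR),
    loss=total_loss,                         # used for backpropagation
    metrics=[
        sm.metrics.IOUScore(threshold=0.5),
        sm.metrics.FScore(),
        # Loss objects listed here are only evaluated and logged each epoch;
        # Keras never differentiates metrics, so training is unchanged.
        sm.losses.BinaryFocalLoss(),
        sm.losses.DiceLoss(),
        sm.losses.JaccardLoss(),
    ],
)

The training log should then report each of these under its own name (e.g. dice_loss, binary_focal_loss, jaccard_loss) alongside the overall loss, while the optimizer still minimizes only total_loss.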
