torch.nn.BCELoss() and torch.nn.functional.binary_cross_entropy


What is the basic difference between these two loss functions? I have already tried using both the loss functions.

CodePudding user response:

The difference is that nn.BCELoss and F.binary_cross_entropy are two PyTorch interfaces to the same operation.

  • The former, torch.nn.BCELoss, is a class that inherits from nn.Module, which makes it handy to use in a two-step fashion, as you always would in OOP (Object-Oriented Programming): initialize, then use. Initialization handles parameter and attribute setup, as the name implies, which is quite useful with stateful operators such as parametrized layers and the like. This is the way to go when implementing classes of your own, for example:

    import torch.nn as nn

    class Trainer():
        def __init__(self, model):
            self.model = model
            self.loss = nn.BCELoss()

        def __call__(self, x, y):
            y_hat = self.model(x)
            loss = self.loss(y_hat, y)
            return loss
    
  • On the other hand, the latter, torch.nn.functional.binary_cross_entropy, is the functional interface. It is actually the underlying operator used by nn.BCELoss, as you can see at this line. You can use this interface directly, but it can become cumbersome with stateful operators. In this particular case, the binary cross-entropy loss has no parameters (in the most general case), so you could do:

    import torch.nn.functional as F

    class Trainer():
        def __init__(self, model):
            self.model = model

        def __call__(self, x, y):
            y_hat = self.model(x)
            loss = F.binary_cross_entropy(y_hat, y)
            return loss
    

CodePudding user response:

BCELoss is the binary cross-entropy loss.
torch.nn.functional.binary_cross_entropy is the function that actually computes the loss inside torch.nn.BCELoss().
