loss.backward() no grad in pytorch NN

The code below gives an error in loss.backward(). The error is:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

import torch
from torch.autograd import Variable

for epoch in range(N_EPOCHS):
    model.train()
    for i,(im1, im2, labels) in enumerate(train_dl):
        i1 = torch.flatten(im1,1)
        i2 = torch.flatten(im2,1)
        inp = torch.cat([i1,i2],1)
        
        b_x = Variable(inp) # batch x
        b_y = Variable(labels) # batch y
        y_ = model(b_x).squeeze()
        y_ = (y_>0.5).float()
        
        print(y_)
        loss = criterion(y_,b_y)
        print(loss.item())
        loss.backward()
        optimizer.step()

CodePudding user response:

y_ = (y_>0.5).float()

has a zero gradient almost everywhere: intuitively, tiny changes in the argument lead to absolutely no change in the value (if y_ changes by a tiny epsilon, the thresholded result does not change). Moreover, the comparison produces a new tensor with requires_grad=False and no grad_fn, which cuts the autograd graph and yields exactly the RuntimeError above.
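
A minimal standalone sketch (not the OP's model, just the failure mode) that reproduces the error and shows where the graph is cut:

import torch

x = torch.randn(4, requires_grad=True)
y = torch.sigmoid(x)     # differentiable: y.grad_fn is set
z = (y > 0.5).float()    # comparison output carries no grad history

print(y.grad_fn)         # e.g. <SigmoidBackward0 object at 0x...>
print(z.grad_fn)         # None -- the autograd graph is cut here

z.sum().backward()       # RuntimeError: element 0 of tensors does not
                         # require grad and does not have a grad_fn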

CodePudding user response:

With the additional info given by the OP in the comments, the correct approach here is just removing the line

y_ = (y_>0.5).float()
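
As a hedged sketch of the corrected loop (model, criterion, optimizer, train_dl and N_EPOCHS are assumed to exist as in the question, and the criterion is assumed to be something like nn.BCELoss on sigmoid outputs): pass the raw model output to the criterion and threshold only for metrics, outside the graph. Variable is no longer needed in current PyTorch, and an optimizer.zero_grad() call, missing in the original, is added:

for epoch in range(N_EPOCHS):
    model.train()
    for i, (im1, im2, labels) in enumerate(train_dl):
        inp = torch.cat([torch.flatten(im1, 1), torch.flatten(im2, 1)], 1)

        optimizer.zero_grad()        # clear gradients from the previous step
        y_ = model(inp).squeeze()    # keep the raw, differentiable output
        loss = criterion(y_, labels.float())
        loss.backward()              # gradients now flow through the whole model
        optimizer.step()

        # threshold only for accuracy/metrics, detached from the graph
        preds = (y_.detach() > 0.5).float()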