Pytorch - Optimizer is not updating its specified parameter


I'm trying to implement CLIP-based style transfer. The full code is here

For some unknown reason, the optimizer doesn't change the weights of the latent tensor. I can confirm that the values are identical before and after the iteration steps. I've also made sure that requires_grad is True and tried various loss functions and optimizers.

Any idea why it doesn't work?

CodePudding user response:

I see some problems with your code.

The optimizer takes in parameters, and parameters are supposed to be leaf nodes in the computation graph. In your case, you pass latent to the optimizer, but it must have complained, since latent is the result of earlier computations and therefore not a leaf.
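To illustrate, here is a minimal sketch (with a made-up 512-dim latent) of why a computed tensor gets rejected:

import torch

# A tensor produced by a computation is a non-leaf node,
# so torch.optim refuses it as a parameter.
base = torch.randn(1, 512, requires_grad=True)
latent = base * 2            # result of a computation -> non-leaf
print(latent.is_leaf)        # False

# This raises "ValueError: can't optimize a non-leaf Tensor":
# optimizer = torch.optim.Adam([latent], lr=0.1)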

So you detached latent, which does make it a leaf node. But detach() creates a new tensor that is cut off from the original computation graph, so gradients from the loss can no longer flow back through the operations that produced it.
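The usual pattern is to turn the computed latent into a fresh leaf once, up front, and then build every forward pass starting from that leaf. A sketch (init stands in for your computed latent):

import torch

# detach() cuts the tensor from the old graph; requires_grad_(True)
# makes the new copy trainable. Each subsequent forward pass then
# builds a fresh graph rooted at this leaf.
init = torch.randn(1, 512)   # stand-in for the computed latent
latent = init.detach().clone().requires_grad_(True)
print(latent.is_leaf, latent.requires_grad)   # True True

optimizer = torch.optim.Adam([latent], lr=0.05)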

Also, to optimize a parameter, the loss must be a function of that parameter. I can't tell whether latent is actually used in your loss computation, so that could be another issue.
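For completeness, a sketch of a loop where the loss really is a function of latent; fake_clip_loss is a hypothetical stand-in for your decode + CLIP scoring. Only when the loss is computed from latent will latent.grad be populated and the step change it:

import torch

latent = torch.randn(1, 512, requires_grad=True)
optimizer = torch.optim.Adam([latent], lr=0.05)

def fake_clip_loss(z):               # hypothetical stand-in
    return (z ** 2).mean()

for step in range(10):
    optimizer.zero_grad()
    loss = fake_clip_loss(latent)    # loss depends on latent
    loss.backward()
    assert latent.grad is not None   # gradient reached the leaf
    optimizer.step()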

CodePudding user response:

I think I've found the issue. On line 86, where I compute a one-hot vector from latent in order to decode it and pass it to CLIP, the graph breaks: vae_make_onehot returns a leaf tensor.
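That fits: a hard one-hot via argmax is non-differentiable, so anything built from it is disconnected from latent. A common workaround (a sketch, not your vae_make_onehot) is the straight-through trick, where the forward pass uses the hard one-hot but the backward pass uses the soft probabilities' gradient; torch.nn.functional.gumbel_softmax(logits, hard=True) implements a similar idea:

import torch
import torch.nn.functional as F

def onehot_straight_through(logits):
    probs = F.softmax(logits, dim=-1)
    hard = F.one_hot(probs.argmax(dim=-1),
                     num_classes=logits.shape[-1]).float()
    # value of `hard` in the forward pass,
    # gradient of `probs` in the backward pass
    return hard + probs - probs.detach()

logits = torch.randn(2, 8, requires_grad=True)
out = onehot_straight_through(logits)
out.sum().backward()
print(logits.grad is not None)       # True: the graph stays intact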
