Still working through this PyTorch video tutorial https://www.youtube.com/watch?v=weQ5pShEVic&list=PLbMqOoYQ3Mxw1Sl5iAAV4SJmvnAGAhFvK&index=2 but hit another error.
lossFunc = torch.nn.MSELoss()
for i in range(epoch):
    output = net(x)
    loss = lossFunc(output, y)
    loss.zero_grad()
    loss.backward()
    for f in net.parameters():
        f.data.sub_(learning_rate = f.grad.data)
    print(output, loss)
I created the network and the loss function and wanted to run the training loop with backpropagation, but I get this error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
/var/folders/v_/yq26pm194xj5ckqy8p_njwc00000gn/T/ipykernel_9995/2476130544.py in <module>
3 output = net(x)
4 loss = lossFunc(output, y)
----> 5 loss.zero_grad()
6 loss.backward()
7
AttributeError: 'Tensor' object has no attribute 'zero_grad'
What gives?
CodePudding user response:
You should call zero_grad() on your optimizer, not on the loss. Gradients in PyTorch accumulate across iterations by default, so you clear them once per iteration with optimizer.zero_grad() (or net.zero_grad()); the loss is just a Tensor and has no zero_grad method, which is exactly what the AttributeError is telling you.
optimizer = torch.optim.Adam(net.parameters(), lr=0.001)
lossFunc = torch.nn.MSELoss()
for i in range(epoch):
    optimizer.zero_grad()       # clear gradients from the previous iteration
    output = net(x)             # forward pass
    loss = lossFunc(output, y)  # compute the loss
    loss.backward()             # backpropagate to compute gradients
    optimizer.step()            # update the parameters
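If you would rather keep the manual parameter update from the tutorial instead of using an optimizer, here is a minimal sketch (assuming net, x, y, epoch and learning_rate are defined as in your question): zero the gradients on the module and do the update inside torch.no_grad(). Note also that your line f.data.sub_(learning_rate = f.grad.data) would fail even after the zero_grad fix, because sub_() takes no learning_rate keyword; you want to multiply by the learning rate instead.

import torch

lossFunc = torch.nn.MSELoss()
for i in range(epoch):
    net.zero_grad()                        # zero gradients on the module, not on the loss
    output = net(x)
    loss = lossFunc(output, y)
    loss.backward()
    with torch.no_grad():                  # plain SGD step, not tracked by autograd
        for f in net.parameters():
            f.sub_(learning_rate * f.grad) # multiply, don't pass a keyword argument
    print(loss.item())

Using an optimizer (as above) is the more idiomatic route; the manual loop is mainly useful for following along with the tutorial or understanding what optimizer.step() does under the hood.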