For loop not stopping for my neural network


I'm trying to make a CNN and am using a training function to train it. I'd like to specify the number of epochs it runs, but I'm finding that it just keeps running.

Is anyone able to help me with this?

import torch
import torch.nn.functional as F
from torch.utils.tensorboard import SummaryWriter

def train(model, epochs=10):
    optimiser = torch.optim.SGD(model.parameters(), lr=0.001)
    writer = SummaryWriter()

    batch_idx = 0
    loss_total = 0
    epoch = 0

    for epoch in range(epochs):
        print('range:', range(epochs))
        for batch in train_loader:
            features, labels = batch
            prediction = model(features)

            # cf = confusion_matrix(labels, prediction)

            loss = F.cross_entropy(prediction, labels)  # Loss model changes label size
            loss_total += loss.item()
            loss.backward()
            print('loss:', loss.item())
            optimiser.step()
            optimiser.zero_grad()
            writer.add_scalar('Loss', loss.item(), batch_idx)
            batch_idx += 1
            print('epoch', epoch)
            epoch += 1  # why does this not stop???
        print('Total loss:', loss_total/batch_idx)

If it helps you can also find this on my GitHub: https://github.com/amosmike/facebook-market-search/blob/master/CNN.py

Thank you for any help you can provide

CodePudding user response:

You shouldn't be incrementing your epoch variable at all; epoch is already set by range(epochs) on each pass of the outer loop. Secondly, you're incrementing it inside the batch loop, and you probably don't want to stop after N batches anyway.

CodePudding user response:

The problem is that you are incrementing epoch inside the batch loop:

epoch += 1  # why does this not stop???

You can take a look at a tutorial on writing a training loop from scratch.
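For reference, here is a minimal sketch of the corrected loop, assuming the same model, train_loader, optimiser and SummaryWriter set-up as in the question; the only substantive change is that epoch is left to the for statement instead of being incremented by hand, so the outer loop runs exactly epochs times:

import torch
import torch.nn.functional as F
from torch.utils.tensorboard import SummaryWriter

def train(model, epochs=10):
    optimiser = torch.optim.SGD(model.parameters(), lr=0.001)
    writer = SummaryWriter()
    batch_idx = 0

    for epoch in range(epochs):          # epoch comes from range(); never increment it by hand
        loss_total = 0
        for batch in train_loader:       # train_loader assumed to exist, as in the question
            features, labels = batch
            prediction = model(features)
            loss = F.cross_entropy(prediction, labels)
            loss_total += loss.item()
            loss.backward()
            optimiser.step()
            optimiser.zero_grad()
            writer.add_scalar('Loss', loss.item(), batch_idx)
            batch_idx += 1
        print('epoch', epoch, 'total loss:', loss_total)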
