Machine Learning Epochs [closed]


Why is an epoch count of 20 often used in machine learning? And why don't we use 100 epochs, or any higher number?

I think I could also use epochs = 100 if I have enough resources, or not? Does it have any negative effects?

model.fit(train_padded, train_labels, epochs=30, validation_data=(val_padded, val_labels), verbose=2)


Epoch 1/30
191/191 - 2s - loss: 0.0280 - accuracy: 0.9852 - val_loss: 1.7940 - val_accuracy: 0.7183
Epoch 2/30
191/191 - 2s - loss: 0.0301 - accuracy: 0.9837 - val_loss: 1.1709 - val_accuracy: 0.7275
Epoch 3/30
191/191 - 2s - loss: 0.0418 - accuracy: 0.9790 - val_loss: 1.0723 - val_accuracy: 0.7360
Epoch 4/30
191/191 - 2s - loss: 0.0344 - accuracy: 0.9837 - val_loss: 1.4722 - val_accuracy: 0.7321
Epoch 5/30
191/191 - 2s - loss: 0.0315 - accuracy: 0.9839 - val_loss: 1.8880 - val_accuracy: 0.7354
Epoch 6/30
191/191 - 2s - loss: 0.0320 - accuracy: 0.9862 - val_loss: 1.4486 - val_accuracy: 0.7347
Epoch 7/30
191/191 - 2s - loss: 0.0283 - accuracy: 0.9860 - val_loss: 1.6277 - val_accuracy: 0.7229
Epoch 8/30
191/191 - 2s - loss: 0.0274 - accuracy: 0.9857 - val_loss: 1.5423 - val_accuracy: 0.7446
Epoch 9/30
191/191 - 2s - loss: 0.0277 - accuracy: 0.9859 - val_loss: 1.9696 - val_accuracy: 0.7367
Epoch 10/30
191/191 - 2s - loss: 0.0270 - accuracy: 0.9862 - val_loss: 2.0356 - val_accuracy: 0.7328
Epoch 11/30
191/191 - 2s - loss: 0.0258 - accuracy: 0.9862 - val_loss: 2.2467 - val_accuracy: 0.7400
Epoch 12/30
191/191 - 2s - loss: 0.0336 - accuracy: 0.9844 - val_loss: 1.5076 - val_accuracy: 0.7374
Epoch 13/30
191/191 - 2s - loss: 0.0325 - accuracy: 0.9833 - val_loss: 1.7409 - val_accuracy: 0.7328
Epoch 14/30
191/191 - 2s - loss: 0.0309 - accuracy: 0.9854 - val_loss: 1.5070 - val_accuracy: 0.7328
Epoch 15/30
191/191 - 2s - loss: 0.0285 - accuracy: 0.9851 - val_loss: 1.9122 - val_accuracy: 0.7347
Epoch 16/30
191/191 - 2s - loss: 0.0274 - accuracy: 0.9860 - val_loss: 1.8435 - val_accuracy: 0.7452
Epoch 17/30
191/191 - 2s - loss: 0.0279 - accuracy: 0.9864 - val_loss: 1.5404 - val_accuracy: 0.7360
Epoch 18/30
191/191 - 2s - loss: 0.0256 - accuracy: 0.9867 - val_loss: 1.9849 - val_accuracy: 0.7328
Epoch 19/30
191/191 - 2s - loss: 0.0260 - accuracy: 0.9860 - val_loss: 1.9083 - val_accuracy: 0.7347
Epoch 20/30

CodePudding user response:

There is absolutely no limit to the number of epochs. Where did you read that? Please post a link or reference if you have one.

If not, then yes, of course, go ahead and try as many as you want.

The problem with a large number of epochs is that it increases your computational requirements and might lead to overfitting on the training dataset.
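
You can already see hints of this in your log: training accuracy sits around 0.98 while val_loss drifts upward, which is the classic overfitting signature. One quick way to check is to plot the curves from the History object that model.fit returns. A minimal sketch using matplotlib, reusing the variable names from your snippet:

import matplotlib.pyplot as plt

# Same call as in the question; model.fit returns a History object
# holding the per-epoch metrics shown in the log.
history = model.fit(train_padded, train_labels, epochs=30,
                    validation_data=(val_padded, val_labels), verbose=2)

# A falling training loss combined with a rising validation loss
# is the signature of overfitting.
plt.plot(history.history["loss"], label="train loss")
plt.plot(history.history["val_loss"], label="val loss")
plt.xlabel("epoch")
plt.legend()
plt.show()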

To accelerate model training, you usually want to terminate training not by reaching a maximum number of epochs but by meeting a performance criterion. For example, you stop training when the rate at which your error decreases falls below some threshold.
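
In Keras this is what the EarlyStopping callback does. A minimal sketch (the monitor, min_delta, and patience values here are illustrative, not recommendations):

from tensorflow.keras.callbacks import EarlyStopping

# Stop when val_loss has not improved by at least min_delta for `patience`
# consecutive epochs, and roll back to the best weights seen so far.
early_stop = EarlyStopping(monitor="val_loss", min_delta=0.001,
                           patience=3, restore_best_weights=True)

model.fit(train_padded, train_labels,
          epochs=100,  # upper bound only; training usually stops much earlier
          validation_data=(val_padded, val_labels),
          callbacks=[early_stop], verbose=2)

With this in place you can safely set a large epochs value such as 100; the callback ends the run once validation loss stops improving.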

Read more here if you are interested: Epoch in Neural Networks
