In Keras, how to control the number of mini-batches drawn during one epoch


I read Keras's official manual and a few examples such as this one. I understand that we can specify the size of a mini-batch using the batch_size parameter and specify the number of epochs using the epochs parameter.

But how can we decide how many mini-batches there are within one epoch? In scikit-learn, there are a few options to (indirectly) control this, such as max_iter, tol, etc. But I failed to find anything similar in Keras.

CodePudding user response:

The way mini-batches are calculated in Keras depends on the size of your training data.

In the example you posted, validation_split=0.2 means the data is split into two parts, training and validation.

image_dataset_from_directory is responsible for delivering your data in mini-batches.
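
A minimal sketch of how such a dataset is typically created in recent TensorFlow versions (the directory path, seed, and image_size below are assumptions for illustration, not values from the question's example):

```python
import tensorflow as tf

# Hypothetical dataset directory; replace with your own path.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "images/",
    validation_split=0.2,   # 80% training / 20% validation, as in the answer
    subset="training",
    seed=123,               # keeps the train/validation split consistent
    image_size=(180, 180),  # assumed target size
    batch_size=32,          # each dataset element is a batch of 32 images
)

val_ds = tf.keras.utils.image_dataset_from_directory(
    "images/",
    validation_split=0.2,
    subset="validation",
    seed=123,
    image_size=(180, 180),
    batch_size=32,
)

# The number of mini-batches per epoch is the number of (batched)
# elements in the training dataset.
print(train_ds.cardinality().numpy())
```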

Total number of data points: 23410
Training data: 18728 (80% of 23410)
Validation data: 4682 (20% of 23410)
Batch size: 32

Therefore, the number of steps, i.e. the number of mini-batches, that image_dataset_from_directory produces per epoch is training_data / batch_size.

steps_per_epoch is 18728 / 32 ≈ 585, i.e. there are 585 mini-batches in that epoch, which means a batch of 32 images is drawn from the training data 585 times (plus one smaller final batch of 8 images if the remainder is not dropped).
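
To answer the original question directly: if you want to control how many mini-batches are drawn per epoch instead of accepting the default dataset_size / batch_size, Model.fit accepts a steps_per_epoch argument. A sketch of the arithmetic and the call (the model and dataset names are placeholders from the sketch above):

```python
import math

num_training_samples = 18728   # 80% of 23410, as in the breakdown above
batch_size = 32

full_batches = num_training_samples // batch_size              # 585 full batches of 32 images
total_batches = math.ceil(num_training_samples / batch_size)   # 586 if the final partial batch of 8 is kept
print(full_batches, total_batches)

# To cap the number of mini-batches drawn per epoch explicitly, pass
# steps_per_epoch to fit(), e.g.:
#
#   model.fit(train_ds, epochs=10, steps_per_epoch=200)
#
# which trains on only 200 mini-batches per epoch instead of all of them.
```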
