I am working on a problem with little data, so I am augmenting the training set by rotating my images by up to 12 degrees in both directions:
https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.rotate.html
Since I only have my work PC (an i5 CPU) to train on, my batch_size is small. So small, in fact, that I only process one image, together with its rotations, per batch (of course using a tiny learning_rate).
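For reference, a rough sketch of how one such mini-batch is built (the helper name, the number of rotations, and the border mode are just illustrative, not my exact code):

```python
# Build one "batch" = one image plus randomly rotated copies of it.
import numpy as np
from scipy.ndimage import rotate

def make_rotated_batch(image, n_rotations=8, max_angle=12.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    batch = [image]
    for _ in range(n_rotations):
        angle = rng.uniform(-max_angle, max_angle)  # up to 12 degrees either way
        # reshape=False keeps the rotated image the same size as the input
        batch.append(rotate(image, angle, reshape=False, mode="nearest"))
    return np.stack(batch)
```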
What I need to know is whether the dropout mask gets updated per image or per batch, because if it is per batch, I would need to change tactics.
Thanks!!
CodePudding user response:
Dropout is updated per batch: the mask is redrawn at each training step, i.e. for every batch of the batch_size you define in model.fit() (if unspecified, the default is 32).
The Dropout layer randomly sets input units to 0 with a frequency of rate at each step during training time. A training step is one gradient update; in one step, batch_size examples are processed.
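You can see this with a minimal sketch (assuming TensorFlow/Keras; the shapes and rate are arbitrary): each training-mode call of a Dropout layer, i.e. each forward pass / training step, draws a fresh random mask.

```python
# A Dropout layer samples a new random mask on every training-mode call,
# i.e. once per forward pass / training step.
import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Dropout(rate=0.5)
x = np.ones((4, 8), dtype="float32")  # a toy "batch" of 4 examples

# Two consecutive training-mode calls (two steps) -> two different masks.
print(layer(x, training=True).numpy())
print(layer(x, training=True).numpy())
```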