Is there a way to reset the learning rate on each fold while employing the ReduceLROnPlateau callback?


As the title says, I'm looking for a way to reset the learning rate (lr) on each fold while the lr is being managed by Keras's ReduceLROnPlateau callback.

CodePudding user response:

Without a reproducible example I can only make a suggestion. If you take a look at the source code of ReduceLROnPlateau, you can get some inspiration and create a custom callback that resets the learning rate at the beginning of training:

import tensorflow as tf

class ResetLR(tf.keras.callbacks.Callback):
    def on_train_begin(self, logs=None):
        default_lr = 0.1  # the initial learning rate the model was compiled with
        previous_lr = self.model.optimizer.lr.read_value()
        if previous_lr != default_lr:
            print("Resetting learning rate from {} to {}".format(previous_lr, default_lr))
            self.model.optimizer.lr.assign(default_lr)

With this callback in place, you train inside a for loop:

custom_callback = ResetLR()
for fold in folds:
  model.fit(...., callbacks=[custom_callback])
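
For a fuller picture, here is a self-contained sketch of that loop combined with ReduceLROnPlateau. The toy dataset, model, and KFold split are illustrative assumptions, not taken from the question:

import numpy as np
import tensorflow as tf
from sklearn.model_selection import KFold

# Toy data standing in for the real dataset (illustrative assumption).
X = np.random.rand(100, 8).astype("float32")
y = np.random.randint(0, 2, size=(100,)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
# Compile with the same initial lr the callback resets to (0.1 here).
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
              loss="binary_crossentropy")

rlronp = tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss",
                                              factor=0.5, patience=2)
custom_callback = ResetLR()  # the callback defined above

for train_idx, val_idx in KFold(n_splits=5).split(X):
    model.fit(X[train_idx], y[train_idx],
              validation_data=(X[val_idx], y[val_idx]),
              epochs=10, verbose=0,
              callbacks=[custom_callback, rlronp])

Note that only the learning rate is reset between folds; the model weights carry over from one fold to the next unless you rebuild or re-initialize the model inside the loop.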

If this does not work (depending on your TensorFlow version), you can try assigning the default learning rate through tf.keras.backend, like so:

import tensorflow as tf

class ResetLR(tf.keras.callbacks.Callback):
    def on_train_begin(self, logs=None):
        default_lr = 0.1  # the initial learning rate the model was compiled with
        previous_lr = float(tf.keras.backend.get_value(self.model.optimizer.lr))
        if previous_lr != default_lr:
            print("Resetting learning rate from {} to {}".format(previous_lr, default_lr))
            tf.keras.backend.set_value(self.model.optimizer.lr, default_lr)

I would also suggest taking a look at this post for more references.

CodePudding user response:

Below is a custom callback that will do the job. At the start of training, the callback prompts the user to enter the value of the initial learning rate.

import tensorflow as tf
from tensorflow import keras

class INIT_LR(keras.callbacks.Callback):
    def __init__(self, model):  # initialization of the callback
        super(INIT_LR, self).__init__()
        self.model = model

    def on_train_begin(self, logs=None):  # this runs at the beginning of training
        print('Enter initial learning rate below')
        lr = input('')
        tf.keras.backend.set_value(self.model.optimizer.lr, float(lr))  # set the learning rate in the optimizer
        lr = float(tf.keras.backend.get_value(self.model.optimizer.lr))  # read it back to ensure it was set
        print('Optimizer learning rate set to ', lr)

In model.fit, set the parameter

callbacks = [INIT_LR(model), rlronp]

Note: model is the name of your compiled model, and rlronp is the name of your ReduceLROnPlateau callback. When you run model.fit, you will be prompted with:

Enter initial learning rate below # printed by the callback
.001  # user entered initial learning rate
Optimizer learning rate set to  0.0010000000474974513 # printed by the callback
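
For completeness, a minimal end-to-end sketch using the INIT_LR class defined above; the toy data, model, and ReduceLROnPlateau arguments are illustrative assumptions:

import numpy as np
import tensorflow as tf
from tensorflow import keras

# Toy data standing in for the real dataset (illustrative assumption).
X = np.random.rand(64, 4).astype("float32")
y = np.random.randint(0, 2, size=(64,)).astype("float32")

model = keras.Sequential([
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001),
              loss="binary_crossentropy")

rlronp = keras.callbacks.ReduceLROnPlateau(monitor="loss", factor=0.5, patience=2)

for fold in range(3):  # one prompt per fold: on_train_begin fires on every fit() call
    model.fit(X, y, epochs=5, verbose=0,
              callbacks=[INIT_LR(model), rlronp])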