Reset weights to last epoch if loss value has increased Keras

I am working on an ANN in Keras for an imbalanced binary classification dataset, and I have just set up a custom learning rate callback that checks, at the start of each epoch, how the loss compares to the previous epoch. If the loss is smaller I increase the learning rate; if not, I decrease the learning rate and also want to reset the weights to the values they had at the end of the previous epoch. How do I do this?

I have found something like

model.layers[0].get_weights() 

Will this give me the weights? How can I then save them in my callback and set them again when this condition is met?
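From a quick test (sketched below with a throwaway Dense layer, just to see the shapes), get_weights() seems to return a list of NumPy arrays, e.g. [kernel, bias] for a Dense layer, which can later be passed back to set_weights():

import tensorflow as tf

layer = tf.keras.layers.Dense(4)
layer.build((None, 3))           # create the variables so get_weights() has something to return

weights = layer.get_weights()    # [kernel with shape (3, 4), bias with shape (4,)]
print([w.shape for w in weights])

layer.set_weights(weights)       # writes the saved arrays back into the layer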

import tensorflow as tf
from tensorflow import keras

class CustomLearningRateScheduler(keras.callbacks.Callback):

    def __init__(self):
        super(CustomLearningRateScheduler, self).__init__()
        self.lastVal = 0
        self.learning_rate = 10
        self.last_iteration_weights = []

    def on_train_begin(self, logs={}):
        self.errors = []

    def on_epoch_start(self, epoch):
        self.weights = self.model.layers[0].get_weights()

    def on_epoch_end(self, epoch, logs={}):
        if not hasattr(self.model.optimizer, "lr"):
            raise ValueError('Optimizer must have a "lr" attribute.')
        # Get the current learning rate from model's optimizer.
        lr = float(tf.keras.backend.get_value(self.model.optimizer.learning_rate))
 
        val = logs.get('loss')

        if(float(val) > float(self.lastVal)):
            self.learning_rate = lr * 0.95
            tf.keras.backend.set_value(self.model.optimizer.lr, self.learning_rate)
            
        else:
            self.learning_rate = lr * 1.01
            tf.keras.backend.set_value(self.model.optimizer.lr, self.learning_rate)
        self.lastVal = val
        self.errors.append(self.lastVal)

        print("\nEpoch d: Learning rate is %f ." % (epoch, self.learning_rate))

This callback is used in:

model_p.fit(X, y, epochs=EPOCH_SIZE, batch_size=BATCH_SIZE, verbose=1, shuffle=True, callbacks=[CustomLearningRateScheduler()])

CodePudding user response:

I did this now:

class CustomLearningRateScheduler(keras.callbacks.Callback):

    def __init__(self):
        super(CustomLearningRateScheduler, self).__init__()
        self.lastVal = 0
        self.learning_rate = 10
        self.last_iteration_weights = []

    def on_train_begin(self, logs={}):
        self.errors = []
        self.resetWeights = False
        self.weights = [0]*len(self.model.layers)

    def on_epoch_start(self, epoch):
        if(self.resetWeights == True):
            self.model.layers.set_weights(self.weights)
            self.resetWeights = False

    def on_epoch_end(self, epoch, logs={}):
        if not hasattr(self.model.optimizer, "lr"):
            raise ValueError('Optimizer must have a "lr" attribute.')
        # Save each layer's current weights so they can be restored next epoch.
        x = 0
        for layer_i in range(len(self.model.layers)):
            self.weights[x] = self.model.layers[layer_i].get_weights()
            x += 1
        # Get the current learning rate from the model's optimizer.
        lr = float(tf.keras.backend.get_value(self.model.optimizer.learning_rate))
 
        val = logs.get('loss')

        if(float(val) > float(self.lastVal)):
            self.learning_rate = lr * 0.95
            self.resetWeights = True
            tf.keras.backend.set_value(self.model.optimizer.lr, self.learning_rate)
            
        else:
            self.resetWeights = False
            self.learning_rate = lr * 1.01
            tf.keras.backend.set_value(self.model.optimizer.lr, self.learning_rate)
        self.lastVal = val
        self.errors.append(self.lastVal)

        print("\nEpoch d: Learning rate is %f ." % (epoch, self.learning_rate))

Would this work? I want to reset all the weights!

CodePudding user response:

With a stochastic gradient approach, I don't think it's a good idea to keep the weight updates only when the loss decreases: in particular, you risk getting stuck in a local minimum early on. By the way, your initial learning rate is very high compared to standard values (around 1e-3).
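For comparison, this is the sort of starting point I would use (the tiny model below is just a placeholder):

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(5,))])
# Adam's default learning rate is 1e-3, a far more typical starting value than 10.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss='mse')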

CodePudding user response:

Not sure where exactly you want to (re)set the weights, but you should be able to use self.model.layers[0].set_weights(self.weights) to restore that layer's weights from the previous epoch based on some condition. The variable self.weights just has to already exist when training begins:

import tensorflow as tf

class CustomLearningRateScheduler(tf.keras.callbacks.Callback):

    def __init__(self):
        super(CustomLearningRateScheduler, self).__init__()
        self.lastVal = 0
        self.learning_rate = 10
        self.last_iteration_weights = []

    def on_train_begin(self, logs={}):
        self.errors = []
        self.weights = self.model.layers[0].get_weights()

    def on_epoch_begin(self, epoch, logs=None):
        # Snapshot this layer's weights at the start of each epoch.
        self.weights = self.model.layers[0].get_weights()

    def on_epoch_end(self, epoch, logs={}):
        if not hasattr(self.model.optimizer, "lr"):
            raise ValueError('Optimizer must have a "lr" attribute.')
        # Get the current learning rate from model's optimizer.
        lr = float(tf.keras.backend.get_value(self.model.optimizer.learning_rate))
 
        val = logs.get('loss')

        if(float(val) > float(self.lastVal)):
            self.learning_rate = lr * 0.95
            tf.keras.backend.set_value(self.model.optimizer.lr, self.learning_rate)
            self.model.layers[0].set_weights(self.weights)
        else:
            self.learning_rate = lr * 1.01
            tf.keras.backend.set_value(self.model.optimizer.lr, self.learning_rate)
        self.lastVal = val
        self.errors.append(self.lastVal)

        print("\nEpoch d: Learning rate is %f ." % (epoch, self.learning_rate))

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(10, activation='relu', input_shape=(5,)))
model.add(tf.keras.layers.Dense(1, activation='linear'))

model.compile(optimizer='adam', loss='mse')
X = tf.random.normal((500, 5))
y = tf.random.normal((500, 1))
model.fit(X, y, epochs=5, batch_size=32, verbose=1, shuffle=True, callbacks=[CustomLearningRateScheduler()])

Also, I think you can work directly with self.model.get_weights() and self.model.set_weights() if you want to reset all of the weights instead of an individual layer.
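For example, here is a rough sketch of that whole-model version (the class name is made up for illustration; it is the same snapshot/restore idea as above, applied to every weight at once, and says nothing about whether rolling back is a good training strategy):

import tensorflow as tf

class RollbackOnLossIncrease(tf.keras.callbacks.Callback):
    """Sketch: snapshot all weights each epoch and roll back if the loss went up."""

    def on_train_begin(self, logs=None):
        self.last_loss = float('inf')
        self.snapshot = self.model.get_weights()   # full list of weight arrays

    def on_epoch_end(self, epoch, logs=None):
        loss = logs.get('loss')
        if loss is not None and loss > self.last_loss:
            # Loss increased: restore every weight from the previous snapshot.
            self.model.set_weights(self.snapshot)
        else:
            # Loss improved (or stayed equal): keep these weights as the new snapshot.
            self.snapshot = self.model.get_weights()
            self.last_loss = loss

You would pass it to fit via callbacks=[RollbackOnLossIncrease()], just like the callback above.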
