If I add, e.g., kernel_regularizer=tf.keras.regularizers.L1(0.01)
to a layer, do I need to add something to my loss when I compile, or is it automatically added to my normal loss?
CodePudding user response:
Using tf.keras.regularizers.L1(0.01)
will automatically add the penalty to your loss function; you do not need to change anything when you compile. You can observe the difference in the reported loss with and without the penalty using this simple example:
import tensorflow as tf
tf.random.set_seed(1)
x_input = tf.keras.layers.Input((1,))
# The L1 penalty on this kernel is added to the compiled loss automatically
x = tf.keras.layers.Dense(3, kernel_regularizer=tf.keras.regularizers.L1(0.01))(x_input)
x_output = tf.keras.layers.Dense(1, activation='sigmoid')(x)
model = tf.keras.Model(x_input, x_output)
model.compile(optimizer='adam', loss=tf.keras.losses.BinaryCrossentropy())
# One random sample with a random binary label
x = tf.random.normal((1, 1))
y = tf.random.uniform((1, 1), maxval=2, dtype=tf.int32)
model.fit(x, y, epochs=1)
If you were to use a custom training loop, however, you would have to manually add the regularization penalties, which the layers collect in model.losses, to your loss.
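A minimal sketch of such a custom loop, assuming the same small model as above; the key line is summing model.losses into the loss before computing gradients, which model.fit otherwise does for you:

import tensorflow as tf
tf.random.set_seed(1)

x_input = tf.keras.layers.Input((1,))
x = tf.keras.layers.Dense(3, kernel_regularizer=tf.keras.regularizers.L1(0.01))(x_input)
x_output = tf.keras.layers.Dense(1, activation='sigmoid')(x)
model = tf.keras.Model(x_input, x_output)

loss_fn = tf.keras.losses.BinaryCrossentropy()
optimizer = tf.keras.optimizers.Adam()

x = tf.random.normal((4, 1))
y = tf.random.uniform((4, 1), maxval=2, dtype=tf.int32)

with tf.GradientTape() as tape:
    y_pred = model(x, training=True)
    data_loss = loss_fn(tf.cast(y, tf.float32), y_pred)
    # model.losses holds the per-layer regularization penalties;
    # in a custom loop they must be added to the loss by hand
    total_loss = data_loss + tf.add_n(model.losses)

grads = tape.gradient(total_loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))

Without the tf.add_n(model.losses) term, the L1 penalty would have no effect on training at all, because the gradients would be computed from the data loss alone.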