I'm using a Keras Lambda layer to perform some operations on a tensor of trainable weights (or at least it should be trainable); to do that I chose a tf.Variable as the parameter but, despite trainable=True, the summary shows 0 trainable parameters.
weights = tf.Variable(initial_value=tf.random.normal((300,)), trainable=True)
custom_layer = keras.layers.Lambda(custom_func)((input_layer, weights))
Regardless of trainable=True, the weights remain non-trainable. An alternative option would be to use a layer like:
weights = Dense(300, activation='linear', use_bias=False)
In this case I run into trouble inside custom_func: tf.math.multiply does not accept, at least in my experiments, the Dense layer's parameters in any form (I tried .get_weights() and .variables).
Any solution that yields a trainable weight tensor is very welcome; thank you in advance.
CodePudding user response:
Using variables with Lambda layers can lead to bugs: custom_layer does not directly track the weights, so the tensor will not appear among the trainable weights. This can be solved by subclassing the Layer class as follows:
class custom_layer(tf.keras.layers.Layer):
    def __init__(self):
        super(custom_layer, self).__init__()
        # Note: don't assign to self.weights -- Keras reserves that property.
        self.w = tf.Variable(...)  # define weights here
    def call(self, inputs):
        return custom_func(..)
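To make this concrete, here is a minimal runnable sketch, assuming custom_func is an element-wise multiply of the input by the (300,) weight vector as in the question; the layer and variable names are illustrative, not from the original post:

```python
import tensorflow as tf

class ScaleLayer(tf.keras.layers.Layer):
    """Hypothetical layer: multiplies the input by a trainable weight vector."""
    def __init__(self, units=300):
        super().__init__()
        # add_weight registers the variable with the layer,
        # so Keras tracks it and it shows up as trainable in summary().
        self.w = self.add_weight(
            shape=(units,),
            initializer="random_normal",
            trainable=True,
        )

    def call(self, inputs):
        # The element-wise product the question's custom_func was doing.
        return tf.math.multiply(inputs, self.w)

inputs = tf.keras.Input(shape=(300,))
outputs = ScaleLayer()(inputs)
model = tf.keras.Model(inputs, outputs)
# model.summary() now reports 300 trainable parameters.
```

With this structure the weights are part of model.trainable_weights, so the optimizer will update them during fit().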