Custom Trainable Layers in Keras


In Keras, we can use a Lambda layer to wrap an arbitrary function as a layer, like this:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Lambda

model = Sequential()

def f(x):
    return x**2

model.add(Lambda(f))

Now my question is: how do I make such a custom function trainable? Specifically, how can I make this function raise its input to a power w, where w is a trainable parameter? Like this:

def f(x):
    return x**w

CodePudding user response:

The problem can be solved by creating a new layer via subclassing tf.keras.layers.Layer:

import numpy as np
import tensorflow as tf
from tensorflow.keras.optimizers import Adam

class ScaleLayer(tf.keras.layers.Layer):
    def __init__(self):
        super(ScaleLayer, self).__init__()
        # A scalar tf.Variable; trainable=True registers it among the
        # layer's trainable weights, so the optimizer will update it.
        self.scale = tf.Variable(1., trainable=True)

    def call(self, inputs):
        # Raise every input element to the trainable power
        return inputs ** self.scale

x = np.array([1,2,3,4,5,6,7,8,9,10,11,12,13]).reshape(-1,1)
y = x**3.25

l = ScaleLayer()
a1 = tf.keras.layers.Input(shape=(1,))
a2 = l(a1)
model = tf.keras.models.Model(a1, a2)

model.compile(optimizer=Adam(learning_rate=0.01), loss='mse')
model.fit(x,y, epochs=500, verbose=0)

print(l.weights) # The learned scale converges to roughly 3.25
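A slightly more idiomatic variant of the same idea is a sketch like the following, which creates the trainable power with add_weight inside build() instead of a raw tf.Variable in __init__; add_weight gives the parameter a name and ties it to the layer's weight tracking and serialization. The class name PowerLayer and the attribute w are illustrative, not from the original answer:

```python
import numpy as np
import tensorflow as tf

class PowerLayer(tf.keras.layers.Layer):
    """Raises each input element to a single trainable power."""

    def build(self, input_shape):
        # add_weight registers a named scalar weight with the layer,
        # so it shows up in layer.trainable_weights and is saved with
        # the model. Initialized to 1.0 (i.e. the identity power).
        self.w = self.add_weight(
            name="w",
            shape=(),
            initializer=tf.keras.initializers.Constant(1.0),
            trainable=True,
        )

    def call(self, inputs):
        # Raise every input element to the trainable power
        return inputs ** self.w

# Same toy regression as above: learn the exponent 3.25
x = np.arange(1, 14, dtype="float32").reshape(-1, 1)
y = x ** 3.25

layer = PowerLayer()
inp = tf.keras.layers.Input(shape=(1,))
model = tf.keras.models.Model(inp, layer(inp))
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss="mse")
model.fit(x, y, epochs=500, verbose=0)

print(float(layer.w))  # should approach 3.25, as in the example above
```

The build() hook also lets the layer size its weights from the input shape, which matters once the trainable parameter is a vector or matrix rather than a scalar.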

