How do I add a layer to perform an elementwise product with constants using the Keras API?


I have a simple fully-connected feed-forward neural network built using the Keras API. The network has one input, a single hidden layer with two neurons, and an output of size three.

from keras.models import Sequential
from keras.layers import Dense

# construct network
model = Sequential()
model.add(Dense(2, input_dim=1, activation='relu'))
model.add(Dense(3, activation='linear'))

Let me denote the activations of the final layer (the output of the network) by a_i. What I would now like to do is take a linear combination of three constant matrices T_i, using the a_i as coefficients:

q = a_1*T_1 + a_2*T_2 + a_3*T_3

I want this quantity, q, to be the output of the network (i.e. the quantity used in the loss) instead. How can this be done in Keras? In other words, how do I manually add a layer at the end that performs the elementwise product and sum above, and makes the resulting quantity the output of the network?
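
For concreteness, this is roughly what I mean in NumPy for a single sample (the matrices and values below are just placeholders):

import numpy as np

# placeholder constant matrices; the real T_i would be fixed, known matrices
T_1 = np.array([[1., 0.], [0., 1.]])
T_2 = np.array([[0., 1.], [1., 0.]])
T_3 = np.array([[1., 1.], [1., 1.]])

# a_1, a_2, a_3 stand for the three network outputs for one sample
a_1, a_2, a_3 = 0.5, -1.0, 2.0
q = a_1 * T_1 + a_2 * T_2 + a_3 * T_3  # elementwise scaling and summing gives a 2x2 matrix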

CodePudding user response:

You could use a Lambda layer that takes the size-3 tensor as input, multiplies the three numbers together, and outputs a tensor of size 1.

This is an example of a function that can be wrapped in a Lambda layer in Keras:

import tensorflow as tf

def normalizer(x):
  a = x[:, :, :, :, 1]  # input channel
  b = x[:, :, :, :, 2]  # predicted channel
  asum = tf.keras.backend.sum(a)
  bsum = tf.keras.backend.sum(b)
  ratio = tf.math.divide(asum, bsum)
  ratio = tf.cast(ratio, dtype=tf.float32)
  return tf.multiply(b, ratio)  # rescale the prediction by the ratio of the sums

This layer normalizes the prediction based on the input; you can do something similar.

You could try to implement something like this:

def multiplier(x):
  a = x[:, 0]  # first output value
  b = x[:, 1]  # second output value
  c = x[:, 2]  # third output value
  ab = tf.multiply(a, b)
  return tf.multiply(ab, c)  # product of the three values

Then you just wrap it in a Lambda layer and add it to your model like a normal layer, for example model.add(Lambda(multiplier)). A sketch of this is shown below.
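
For the matrix combination asked about in the question, a minimal sketch could look like the following. It assumes the three constant matrices are stacked into a single tensor T of shape (3, m, n); the 4x4 shapes and random values are placeholders, and combine is just an illustrative name:

import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense, Lambda

# stack the constant matrices T_1, T_2, T_3 into one tensor of shape (3, m, n);
# the values here are placeholders
T = tf.constant(np.random.rand(3, 4, 4), dtype=tf.float32)

def combine(a):
  # a has shape (batch, 3); returns sum_i a_i * T_i with shape (batch, m, n)
  return tf.tensordot(a, T, axes=[[1], [0]])

model = Sequential()
model.add(Dense(2, input_dim=1, activation='relu'))
model.add(Dense(3, activation='linear'))
model.add(Lambda(combine))  # the output of the network is now q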
