Performing Differentiation wrt input within a keras model for use in loss


Is there any layer in Keras that calculates the derivative with respect to the input? For example, if x is the input and the first layer is f(x), then the next layer's output should be f'(x). There are multiple questions here about this topic, but all of them involve computing the derivative outside the model. In essence, I want to create a neural network whose loss function involves both the Jacobian and the Hessian with respect to the inputs.

I've tried the following:

import numpy as np
import keras
import keras.backend as K
from keras.layers import Dense

def create_model():
    x = keras.Input(shape=(10,))
    layer = Dense(1, activation="sigmoid")
    output = layer(x)

    # take the gradient of the layer output w.r.t. the symbolic input
    jac = K.gradients(output, x)

    model = keras.Model(inputs=x, outputs=jac)

    return model

model = create_model()
X = np.random.uniform(size=(3, 10))

This gives the error tf.gradients is not supported when eager execution is enabled. Use tf.GradientTape instead.

So I tried using tf.GradientTape instead:

import numpy as np
import tensorflow as tf
import keras
from keras.layers import Dense

def create_model2():
    # build the symbolic graph inside a GradientTape, then ask the tape
    # for the gradient of the output w.r.t. the input
    with tf.GradientTape() as tape:
        x = keras.Input(shape=(10,))
        layer = Dense(1, activation="sigmoid")
        output = layer(x)

    jac = tape.gradient(output, x)

    model = keras.Model(inputs=x, outputs=jac)

    return model

model = create_model2()
X = np.random.uniform(size=(3, 10))

but this fails with 'KerasTensor' object has no attribute '_id'.

Both methods work fine outside the model. My end goal is to use the Jacobian and Hessian in the loss function, so alternative approaches would also be appreciated.
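
For reference, this is the kind of computation that works outside the model: a minimal sketch with nested GradientTapes on a concrete tensor (the dense layer and the (3, 10) input here are placeholders standing in for my actual network):

import tensorflow as tf

layer = tf.keras.layers.Dense(1, activation="sigmoid")
x = tf.random.uniform((3, 10))  # a concrete tensor, not a symbolic keras.Input

with tf.GradientTape() as t2:
    t2.watch(x)
    with tf.GradientTape() as t1:
        t1.watch(x)
        y = layer(x)
    # per-sample Jacobian of the (3, 1) output w.r.t. the (3, 10) input
    jac = t1.batch_jacobian(y, x)   # shape (3, 1, 10)
# differentiating the Jacobian again gives the per-sample Hessian
hess = t2.batch_jacobian(jac, x)    # shape (3, 1, 10, 10)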

CodePudding user response:

Not sure what exactly you want to do, but maybe try a custom Keras layer with tf.gradients:

import tensorflow as tf
tf.random.set_seed(111)

class GradientLayer(tf.keras.layers.Layer):
    def __init__(self):
        super(GradientLayer, self).__init__()
        self.dense = tf.keras.layers.Dense(1, activation="sigmoid")

    @tf.function
    def call(self, inputs):
        # tf.gradients is legal here because @tf.function traces in graph mode
        outputs = self.dense(inputs)
        return tf.gradients(outputs, inputs)


def create_model2():
    gradient_layer = GradientLayer()
    inputs = tf.keras.layers.Input(shape=(10,))
    outputs = gradient_layer(inputs)
    model = tf.keras.Model(inputs=inputs, outputs=outputs)

    return model

model = create_model2()
X = tf.random.uniform((3, 10))
print(model(X))
tf.Tensor(
[[-0.07935508 -0.12471244 -0.0702782  -0.06729251  0.14465885 -0.0818079
  -0.08996294  0.07622238  0.11422144 -0.08126545]
 [-0.08666676 -0.13620329 -0.07675356 -0.07349276  0.15798753 -0.08934557
  -0.09825202  0.08324542  0.12474566 -0.08875315]
 [-0.08661086 -0.13611545 -0.07670406 -0.07344536  0.15788564 -0.08928795
  -0.09818865  0.08319173  0.12466521 -0.08869591]], shape=(3, 10), dtype=float32)
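
If you also need second derivatives for the loss, the same trick extends: tf.gradients can be nested inside a @tf.function, since it traces in graph mode. A sketch under the same assumptions as above (scalar output, input dimension hard-coded to 10 for illustration, not a drop-in for your network):

class JacHessLayer(tf.keras.layers.Layer):
    def __init__(self):
        super(JacHessLayer, self).__init__()
        self.dense = tf.keras.layers.Dense(1, activation="sigmoid")

    @tf.function
    def call(self, inputs):
        outputs = self.dense(inputs)
        # first derivative of the scalar output w.r.t. each input feature
        jac = tf.gradients(outputs, inputs)[0]                  # (batch, 10)
        # differentiate each Jacobian component again; summing over the
        # batch is harmless because sample b only depends on inputs[b, :]
        rows = [tf.gradients(jac[:, i], inputs)[0] for i in range(10)]
        hess = tf.stack(rows, axis=1)                           # (batch, 10, 10)
        return outputs, jac, hess

Returning all three tensors lets you build a tf.keras.Model with three outputs and put whatever penalty you like on the Jacobian and Hessian in a custom loss (or attach it with model.add_loss).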