I wrote a layer that does nothing:
import tensorflow as tf
import tensorflow_hub as hub

class Fractal2D(tf.keras.layers.Layer):
    def __init__(self, kernel_size_range):
        super(Fractal2D, self).__init__()
        self.kernel_size_range = kernel_size_range

    def build(self, input_shape):
        print(f'build executes eagerly: {tf.executing_eagerly()}')

    def call(self, inputs):
        print(f'call executes eagerly: {tf.executing_eagerly()}')
        return inputs
and made a model:
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(224, 224, 3), batch_size=32),
    Fractal2D(kernel_size_range=(3, 41)),
    hub.KerasLayer("https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4",
                   output_shape=[1280], trainable=False),
    tf.keras.layers.Dense(DIAGNOSIS_NUMBER, activation='softmax')
])
The output from the cell is:
build executes eagerly: True
call executes eagerly: False
When I train the model
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(training_set, validation_data=validation_set, epochs=20)
I get
Epoch 1/20
call executes eagerly: False
call executes eagerly: False
Questions:
- Why are the build and call methods executed when the model is instantiated?
- Why is the call method NOT executed eagerly, if eager execution is the default mode of execution?
CodePudding user response:
Because your Sequential model starts with an InputLayer, Keras builds the whole model at instantiation time: build runs eagerly, and call is traced once on a symbolic input to construct the model's graph, which is why both print statements appear before any training starts. The call method of a custom layer is automatically wrapped in a tf.function, which creates a dataflow graph on the first call and then executes that graph on all subsequent calls. Why is this relevant to your question? Because according to the docs on tf.executing_eagerly():
Eager execution is enabled by default and this API returns True in most of cases. However, this API might return False in the following use cases.
- Executing inside tf.function, unless under tf.init_scope or tf.config.run_functions_eagerly(True) is previously called.
- Executing inside a transformation function for tf.dataset.
- tf.compat.v1.disable_eager_execution() is called.
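The first case is easy to reproduce in isolation. A minimal sketch (traced_fn is just an illustrative name):

import tensorflow as tf

print(f'top level: {tf.executing_eagerly()}')  # True: eager is the default

@tf.function
def traced_fn():
    # Inside a tf.function the Python body is traced into a graph,
    # so tf.executing_eagerly() reports False here.
    print(f'inside tf.function: {tf.executing_eagerly()}')

traced_fn()  # prints: inside tf.function: False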
So let's see what happens when using tf.init_scope:
import tensorflow_hub as hub
import tensorflow as tf

class Fractal2D(tf.keras.layers.Layer):
    def __init__(self, kernel_size_range):
        super(Fractal2D, self).__init__()
        self.kernel_size_range = kernel_size_range

    def build(self, input_shape):
        print(f'build executes eagerly: {tf.executing_eagerly()}')

    def call(self, inputs):
        with tf.init_scope():
            print(f'call executes eagerly: {tf.executing_eagerly()}')
        return inputs

model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(224, 224, 3), batch_size=1),
    Fractal2D(kernel_size_range=(3, 41)),
    hub.KerasLayer("https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4",
                   output_shape=[1280], trainable=False),
    tf.keras.layers.Dense(1, activation='sigmoid')
])
training_set = tf.random.normal((1, 224, 224, 3))
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(training_set, tf.random.normal((1, 1)), epochs=2)
build executes eagerly: True
call executes eagerly: True
Epoch 1/2
call executes eagerly: True
call executes eagerly: True
1/1 [==============================] - 4s 4s/step - loss: 0.2856 - accuracy: 0.0000e+00
Epoch 2/2
1/1 [==============================] - 0s 36ms/step - loss: 0.1641 - accuracy: 0.0000e+00
<keras.callbacks.History at 0x7f8836515710>
Seems to be consistent with the docs.
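Alternatively, per the tf.config.run_functions_eagerly(True) exception mentioned in the first bullet of the docs above, you can force every tf.function to execute eagerly instead of touching the layer itself. A minimal sketch, assuming the original Fractal2D without the init_scope and the same model and training_set as above:

tf.config.run_functions_eagerly(True)   # all tf.functions now execute eagerly

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(training_set, tf.random.normal((1, 1)), epochs=2)
# call executes eagerly: True   (printed on every training step)

tf.config.run_functions_eagerly(False)  # restore the default when done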
CodePudding user response:
You can still run eagerly even with a custom layer. Your model runs in graph mode because model.fit() wraps the training step in a tf.function; to run in eager mode, either pass run_eagerly=True to model.compile(), or write your own training loop from scratch with tf.GradientTape, as sketched below. [1]: https://www.tensorflow.org/guide/keras/customizing_what_happens_in_fit
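A minimal sketch of such a loop, reusing the model from the question (loss_fn, optimizer, and the batch iteration over training_set are illustrative assumptions):

import tensorflow as tf

loss_fn = tf.keras.losses.CategoricalCrossentropy()
optimizer = tf.keras.optimizers.Adam()

for epoch in range(20):
    for x_batch, y_batch in training_set:
        with tf.GradientTape() as tape:
            # This call runs eagerly, so tf.executing_eagerly() inside
            # the layer's call method returns True.
            predictions = model(x_batch, training=True)
            loss = loss_fn(y_batch, predictions)
        gradients = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(gradients, model.trainable_variables))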