Error When Trying to Calculate FLOPS for Complex TF2 Keras Models

Time:12-05

I want to calculate the FLOPS of the ML models I use. I get an error when I try to calculate them for more complex models.

I get this Error for Efficientnet Models:

ValueError: Unknown layer: FixedDropout. Please ensure this object is passed to the `custom_objects` argument. See https://www.tensorflow.org/guide/keras/save_and_serialize#registering_the_custom_object for details.

The function to calculate the FLOPS:

1)

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2_as_graph

def get_flops(model, batch_size=None):
    if batch_size is None:
        batch_size = 1

    real_model = tf.function(model).get_concrete_function(
        tf.TensorSpec([batch_size] + model.inputs[0].shape[1:], model.inputs[0].dtype))
    frozen_func, graph_def = convert_variables_to_constants_v2_as_graph(real_model)

    run_meta = tf.compat.v1.RunMetadata()
    opts = tf.compat.v1.profiler.ProfileOptionBuilder.float_operation()
    flops = tf.compat.v1.profiler.profile(graph=frozen_func.graph,
                                          run_meta=run_meta, cmd='op', options=opts)
    return flops.total_float_ops

or

2)

def get_flops(model_h5_path):
    session = tf.compat.v1.Session()
    graph = tf.compat.v1.get_default_graph()

    with graph.as_default():
        with session.as_default():
            model = tf.keras.models.load_model(model_h5_path)

            run_meta = tf.compat.v1.RunMetadata()
            opts = tf.compat.v1.profiler.ProfileOptionBuilder.float_operation()

            # We use the Keras session graph in the call to the profiler.
            flops = tf.compat.v1.profiler.profile(graph=graph,
                                                  run_meta=run_meta, cmd='op', options=opts)

            return flops.total_float_ops

By contrast, I am able to calculate the FLOPS for models like ResNets; it only fails for the more complex models. How can I fix this?

CodePudding user response:

The error is raised while Keras deserializes the EfficientNet model: it encounters the custom FixedDropout layer, which is defined in the `efficientnet` package rather than in Keras itself, so the model cannot be reconstructed. The `custom_objects` argument that the error message mentions belongs to `tf.keras.models.load_model` (note that `convert_variables_to_constants_v2_as_graph` has no such parameter). Define or import the FixedDropout class and pass it to `load_model` via `custom_objects`; once the model is loaded with the custom layer registered, either of your `get_flops` functions will work.

Here is an example of how you could register the custom layer to fix the error:

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2_as_graph

# Minimal stand-in for the custom layer: dropout is a no-op at inference time,
# so this is sufficient for FLOPS profiling. If you have the `efficientnet`
# package installed, importing `efficientnet.tfkeras` should register the real
# class for you instead.
class FixedDropout(tf.keras.layers.Dropout):
    def call(self, inputs, training=None):
        return inputs

def get_flops(model, batch_size=None):
    if batch_size is None:
        batch_size = 1

    real_model = tf.function(model).get_concrete_function(
        tf.TensorSpec([batch_size] + model.inputs[0].shape[1:], model.inputs[0].dtype))
    frozen_func, graph_def = convert_variables_to_constants_v2_as_graph(real_model)

    run_meta = tf.compat.v1.RunMetadata()
    opts = tf.compat.v1.profiler.ProfileOptionBuilder.float_operation()
    flops = tf.compat.v1.profiler.profile(graph=frozen_func.graph,
                                          run_meta=run_meta, cmd='op', options=opts)
    return flops.total_float_ops

# Pass the custom layer class to load_model via custom_objects
# ('efficientnet.h5' is a placeholder path for your saved model)
model = tf.keras.models.load_model('efficientnet.h5',
                                   custom_objects={'FixedDropout': FixedDropout})
print(get_flops(model))
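For completeness, here is a self-contained sketch that reproduces the whole scenario end to end with a tiny toy model instead of EfficientNet: save a model containing the custom layer, reload it with `custom_objects`, and profile it. The model architecture and file path here are made up for illustration.

```python
import os
import tempfile

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2_as_graph,
)


class FixedDropout(tf.keras.layers.Dropout):
    # Stand-in custom layer: behaves as the identity, which matches
    # dropout at inference time and is enough for FLOPS profiling.
    def call(self, inputs, training=None):
        return inputs


def get_flops(model, batch_size=1):
    spec = tf.TensorSpec([batch_size] + model.inputs[0].shape[1:],
                         model.inputs[0].dtype)
    concrete = tf.function(model).get_concrete_function(spec)
    frozen_func, _ = convert_variables_to_constants_v2_as_graph(concrete)

    run_meta = tf.compat.v1.RunMetadata()
    opts = tf.compat.v1.profiler.ProfileOptionBuilder.float_operation()
    flops = tf.compat.v1.profiler.profile(graph=frozen_func.graph,
                                          run_meta=run_meta, cmd='op',
                                          options=opts)
    return flops.total_float_ops


# Build and save a tiny model that uses the custom layer.
inp = tf.keras.Input(shape=(8,))
out = tf.keras.layers.Dense(4)(FixedDropout(0.2)(inp))
model = tf.keras.Model(inp, out)

path = os.path.join(tempfile.mkdtemp(), "tiny.h5")
model.save(path)

# Loading without custom_objects would raise the "Unknown layer" ValueError;
# registering the class fixes it.
reloaded = tf.keras.models.load_model(
    path, custom_objects={"FixedDropout": FixedDropout})
print(get_flops(reloaded))
```

The same pattern applies to EfficientNet: only the `custom_objects` mapping passed to `load_model` matters, not anything in the profiling code itself.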