How to save dummy tensorflow model for serving?


I'd like to save a dummy TensorFlow model so that I can use it later with TensorFlow Serving. I've tried to prepare such a model using the following snippet:

import tensorflow as tf

input0 = tf.keras.Input(shape=[2], name="input_0", dtype="int32")
input1 = tf.keras.Input(shape=[2], name="input_1", dtype="int32")
output = tf.keras.layers.Add()([input0, input1])

model = tf.keras.Model(inputs=[input0, input1], outputs=output)

predict_function = tf.function(
    func=model.call,
    input_signature=[input0.type_spec, input1.type_spec],
)

signatures = {
    "predict": predict_function.get_concrete_function(
        [input0.get_shape(), input1.get_shape()],
    ),
}

model.save(
    filepath="some/dummy/path",
    signatures=signatures,
)

Running the code to save the model ends with the following error:

AssertionError: Could not compute output KerasTensor(type_spec=TensorSpec(shape=(None, 2), dtype=tf.int32, name=None), name='add/add:0', description="created by layer 'add'")

What should I do to be able to save a dummy model with signatures so that I can use it later with TensorFlow Serving?

CodePudding user response:

According to the documentation for model.call, you should always use __call__ instead:

call

This method should not be called directly. It is only meant to be overridden when subclassing tf.keras.Model. To call a model on an input, always use the __call__() method, i.e. model(inputs), which relies on the underlying call() method.
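A quick sanity check (a minimal sketch rebuilding the same dummy model; the tensor values are illustrative) shows that calling the model through __call__, i.e. model(inputs), works on concrete tensors:

```python
import tensorflow as tf

# The same dummy model as in the question
input0 = tf.keras.Input(shape=[2], name="input_0", dtype="int32")
input1 = tf.keras.Input(shape=[2], name="input_1", dtype="int32")
output = tf.keras.layers.Add()([input0, input1])
model = tf.keras.Model(inputs=[input0, input1], outputs=output)

# Calling via __call__ (i.e. model(...)) rather than model.call
result = model([
    tf.constant([[1, 2]], dtype=tf.int32),
    tf.constant([[3, 4]], dtype=tf.int32),
])
print(result)  # tf.Tensor([[4 6]], shape=(1, 2), dtype=int32)
```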

Then, I am not sure how several inputs in a list should be handled, so I would just use a lambda:

func = lambda x, y: model.__call__([x, y]),

When I changed the signatures so that they match, the model could be saved. I don't know about TensorFlow Serving, though.

import tensorflow as tf

input0 = tf.keras.Input(shape=[2], name="input_0", dtype="int32")
input1 = tf.keras.Input(shape=[2], name="input_1", dtype="int32")
output = tf.keras.layers.Add()([input0, input1])

model = tf.keras.Model(inputs=[input0, input1], outputs=output)

predict_function = tf.function(
    func=lambda x, y: model.__call__([x, y]),
    input_signature=[input0.type_spec, input1.type_spec],
)

signatures = {
    "predict": predict_function.get_concrete_function(
        input0.type_spec, input1.type_spec,
    ),
}

model.save(
    filepath="some/dummy/path",
    signatures=signatures,
)

After loading, the model seems to work correctly:

print(model([[5], [6]]))
tf.Tensor(11, shape=(), dtype=int32)