Tensorflow v2.10 mutate output of signature function to be a map of label to results


I'm trying to save my model so that when called from tf-serving the output is:

{
   "results": [
      { "label1": x.xxxxx, "label2": x.xxxxx },
      { "label1": x.xxxxx, "label2": x.xxxxx }
   ]
}

where label1 and label2 are my labels and each x.xxxxx is the probability of that label.

This is what I'm trying:

class TFModel(tf.Module):

    def __init__(self, model: tf.keras.Model) -> None:
        self.labels = ['label1', 'label2']
        self.model = model
            
    @tf.function(input_signature=[tf.TensorSpec(shape=(1, ), dtype=tf.string)])
    def prediction(self, pagetext: str):
        results = self.model(pagetext)  # model inference producing per-label probabilities
        return {'results': tf.constant([{k: v for dct in [{self.labels[c]: f"{x:.5f}"} for (c, x) in enumerate(results[i])] for k, v in dct.items()}
                                        for i in range(len(results.numpy()))])}


# and then save it:
tf_model_wrapper = TFModel(classifier_model)
tf.saved_model.save(tf_model_wrapper.model,
                    saved_model_path,
                    signatures={'serving_default': tf_model_wrapper.prediction})

Side Note: Apparently in TensorFlow v2.0, if signatures is omitted, the saved object should be scanned for the first @tf.function (according to this: https://www.tensorflow.org/api_docs/python/tf/saved_model/save), but in reality that doesn't seem to work. Instead, the model saves successfully with no errors, the @tf.function is never called, and the default output is returned instead.

The error I get from the above is:

ValueError: Got a non-Tensor value <tf.Operation 'PartitionedCall' type=PartitionedCall> for key 'output_0' in the output of the function __inference_prediction_125493 used to generate the SavedModel signature 'serving_default'. Outputs for functions used as signatures must be a single Tensor, a sequence of Tensors, or a dictionary from string to Tensor.

I wrapped the result in tf.constant above because of this error, thinking it might be a quick fix, but I think that's just me being naive and not understanding Tensors properly.

I tried a bunch of other things before learning that all outputs must be return values.

How can I change the output to be as I want it to be?

CodePudding user response:

You can see a Tensor as a multidimensional vector, i.e. a structure with a fixed size and number of dimensions whose elements all share the same type. Your return value is a map from a string to a list of dictionaries. A list of dictionaries cannot be converted to a tensor, because there is no guarantee that the number of dimensions and their sizes are constant, nor that every element shares the same type.
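To make that concrete, here is a minimal sketch (with made-up numbers) of what happens when TensorFlow tries to convert each kind of structure:

import tensorflow as tf

# A rectangular list of floats converts cleanly: every row has the same
# length and every element has the same dtype.
tf.constant([[0.91234, 0.08766],
             [0.25000, 0.75000]])   # shape (2, 2), dtype float32

# A list of dictionaries has no fixed shape or common element type, so the
# conversion fails with an error along the lines of:
# ValueError: Attempt to convert a value (...) with an unsupported type
# (<class 'dict'>) to a Tensor.
tf.constant([{'label1': 0.91234, 'label2': 0.08766}])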

You could instead return the raw output of your network, which should already be a tensor, and do your post-processing outside of tensorflow-serving.
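A sketch of that approach, reusing the wrapper from the question (classifier_model and the two-label output shape are assumptions carried over from the original code): a dictionary from strings to Tensors is a valid signature output, so you can expose one tensor per label and leave the string formatting to the caller.

class TFModel(tf.Module):

    def __init__(self, model: tf.keras.Model) -> None:
        self.labels = ['label1', 'label2']
        self.model = model

    @tf.function(input_signature=[tf.TensorSpec(shape=(1,), dtype=tf.string)])
    def prediction(self, pagetext):
        probs = self.model(pagetext)  # assumed shape (batch, 2): raw probabilities
        # One output tensor per label: every value is a Tensor keyed by a
        # string, which satisfies the signature requirements.
        return {label: probs[:, i] for i, label in enumerate(self.labels)}

tf-serving would then return one array per label (e.g. {"label1": [...], "label2": [...]}), which the client can zip back into the per-example objects shown at the top of the question.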


If you really want something like what's in your question, you can use a Tensor of strings instead, with code along these lines:

labels = tf.constant(['label1', 'label2'])
# if your batch size is dynamic, you can use tf.shape on your results variable to find it at runtime
batch_size = 32
# assuming your model returns something with the shape (N,2)
results = tf.random.uniform((batch_size,2))
res_as_str = tf.strings.as_string(results, precision=5)
return {
    "results": tf.stack(
        [tf.tile(labels[None, :], [batch_size, 1]), res_as_str], axis=-1
    )
} 

The output will be a dictionary mapping the key "results" to a Tensor of shape (batch, number of labels, 2), where the last dimension contains the label name and its corresponding value.
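Plugged back into the wrapper from the question, that could look roughly like this (classifier_model and saved_model_path are the placeholders from the question; the batch size is read from the model output at runtime, as suggested in the comment above):

class TFModel(tf.Module):

    def __init__(self, model: tf.keras.Model) -> None:
        self.labels = tf.constant(['label1', 'label2'])
        self.model = model

    @tf.function(input_signature=[tf.TensorSpec(shape=(1,), dtype=tf.string)])
    def prediction(self, pagetext):
        results = self.model(pagetext)               # assumed shape (N, 2)
        batch_size = tf.shape(results)[0]            # dynamic batch size
        res_as_str = tf.strings.as_string(results, precision=5)
        return {
            'results': tf.stack(
                [tf.tile(self.labels[None, :], [batch_size, 1]), res_as_str],
                axis=-1,
            )
        }

tf_model_wrapper = TFModel(classifier_model)
tf.saved_model.save(tf_model_wrapper.model,
                    saved_model_path,
                    signatures={'serving_default': tf_model_wrapper.prediction})

Each example in the serving response is then a list of [label, value] string pairs rather than a JSON object keyed by label, which is about as close as a pure-Tensor output can get to the format in the question.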
