Tensorflow: exporting a model for serving

A few days ago, I wrote about how to retrieve the signature of an exported model in Tensorflow. Today I want to continue with how to export a model for serving, specifically exporting a model and serving it with TFServing. TFServing is a high-performance Tensorflow serving service written in C++. I am working on building a serving infrastructure, so I have spent a lot of time exporting Tensorflow models and making them servable via TFServing.

The requirement for an exported model to be servable by TFServing is quite simple: you need to define named inputs and outputs signatures. The inputs signature defines the shape of the graph's input tensors, and the outputs signature defines the tensors that hold the prediction.

Exporting from a Tensorflow graph
This is straightforward. If you build the graph yourself, you already have the input and output tensors: you just need to create a Saver and an Exporter, then call them with the right arguments.

Please see here for a complete example.
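For reference, here is a minimal sketch of that flow using the session_bundle exporter from the 0.11-era API. The toy softmax graph, tensor names, export path, and version number are all made up for illustration:

```python
import tensorflow as tf
from tensorflow.contrib.session_bundle import exporter

# A toy graph: a single softmax layer (weights left untrained, for illustration).
x = tf.placeholder(tf.float32, shape=[None, 784], name="x")
w = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, w) + b, name="y")

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())

    # The Saver captures the variables; the Exporter writes the servable,
    # including the named inputs/outputs signatures TFServing looks for.
    saver = tf.train.Saver()
    model_exporter = exporter.Exporter(saver)
    model_exporter.init(
        sess.graph.as_graph_def(),
        named_graph_signatures={
            "inputs": exporter.generic_signature({"x": x}),
            "outputs": exporter.generic_signature({"y": y}),
        })
    # Writes the exported model under /tmp/my_model/<version>/.
    model_exporter.export("/tmp/my_model", tf.constant(1), sess)
```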

Exporting from a tf.contrib.learn Estimator
This is actually trickier. The estimator provides an export() API, but the documentation is not helpful, and by default it won't export a named signature, so you cannot use it directly. Instead, you will need to:

  • Define an input_fn that returns the shape of the input. You can reuse the input_fn you used for feeding data during training, if you already have one.
  • Define a signature_fn, as in the example below.
  • Make sure you pass input_feature_key and use_deprecated_input_fn=False when you call the export function.

Below is an example of exporting the classifier from this tutorial. Note: this is only for Tensorflow 0.11; for 0.12 and 1.0 the API may be different.
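Here is a sketch of what that export can look like. It assumes the DNNClassifier from the tf.contrib.learn iris tutorial (4 float features, 3 classes); the model directory, export path, and signature names are illustrative, and the exact structure of predictions depends on the estimator:

```python
import tensorflow as tf
from tensorflow.contrib import layers, learn
from tensorflow.contrib.session_bundle import exporter

# A DNNClassifier as in the tf.contrib.learn iris tutorial:
# 4 float features, 3 classes. Adjust to your own model.
feature_columns = [layers.real_valued_column("", dimension=4)]
classifier = learn.DNNClassifier(feature_columns=feature_columns,
                                 hidden_units=[10, 20, 10],
                                 n_classes=3,
                                 model_dir="/tmp/iris_model")

# ... train the classifier here ...

def serving_input_fn():
    # Like a training input_fn, this returns (features, labels), but the
    # labels can be None for export. The single feature is keyed by the
    # empty string "", which must match input_feature_key below.
    features = {"": tf.placeholder(tf.float32, shape=[None, 4], name="input")}
    return features, None

def signature_fn(examples, features, predictions):
    # examples is the input tensor selected by input_feature_key, and
    # predictions holds the output of the estimator's predict ops
    # (assumed to be a single tensor here). This builds the named
    # inputs/outputs signatures that TFServing requires.
    inputs = exporter.generic_signature({"inputs": examples})
    outputs = exporter.generic_signature({"outputs": predictions})
    return None, {"inputs": inputs, "outputs": outputs}

classifier.export(export_dir="/tmp/iris_export",
                  input_fn=serving_input_fn,
                  input_feature_key="",
                  use_deprecated_input_fn=False,
                  signature_fn=signature_fn)
```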

Some explanation: the input_fn defines the features of your estimator; it returns a dict of tensors that represents your data. Usually an input_fn returns a tuple of a features dict and a labels tensor, but for exporting you can skip the labels tensor. You can refer to here for detailed documentation. The above input_fn returns a feature tensor whose feature name is the empty string (""). That's why we also need to pass input_feature_key="" to the export function.

Once the model is exported, you can ship it to TF Serving and start serving it. I will continue this series in the next few days with how to run the serving service and send requests to it.
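As a quick preview, serving an exported model with the tensorflow_model_server binary (built from the TF Serving repo) typically looks like this; the model name and path here are illustrative:

```
tensorflow_model_server --port=9000 --model_name=my_model --model_base_path=/tmp/my_model
```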

One thought on "Tensorflow: exporting a model for serving"

  1. Juan Nino

    Many thanks for your post,
    It has clarified many things for me. It seems that there is not a lot of clear information about this online; it might be because TFServing and TF in general are relatively recent. I am looking forward to more posts on this topic 🙂
