How to pass a *serialized* tensor to a TensorFlow Model Serving server?

I implemented a simple TF model. The model receives a serialized tensor of a grayscale image (simply a 2-D ndarray) and restores it to a 2-D tensor. After that, some inference is applied to this 2-D tensor.

I deployed the model with TensorFlow Model Serving and tried to send a JSON string to the REST port as follows:

  {
    "instances": [
      {"b64": bin_str}
    ]
  }

I tried things like tf.io.serialize_tensor to convert the input image into a serialized tensor and pass it to the server, but every attempt failed.
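For reference, the request was assembled roughly as in the sketch below (the image shape, model name, and endpoint URL are placeholders, and requests is assumed as the HTTP client):

import base64
import json

import numpy as np
import requests
import tensorflow as tf

# Hypothetical 2-D grayscale image.
image = np.random.rand(28, 28).astype(np.float32)

# Serialize the tensor and base64-encode the bytes for the JSON payload.
serialized = tf.io.serialize_tensor(image).numpy()
bin_str = base64.b64encode(serialized).decode("utf-8")

payload = {"instances": [{"b64": bin_str}]}

# Placeholder endpoint; host, port, and model name depend on the deployment.
url = "http://localhost:8501/v1/models/my_model:predict"
response = requests.post(url, data=json.dumps(payload))
print(response.status_code, response.text)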

I would like to know how to send a serialized tensor to the serving server.

My saved model has the following signature:

signatures = {
    "serving_default": _get_serve_tf_examples_fn(
        model,
        transform_output).get_concrete_function(
            # explicitly specify the input signature of serve_tf_examples_fn
            tf.TensorSpec(
                shape=[None],
                dtype=tf.string,
                name="examples")),
}

and the definition of _get_serve_tf_examples_fn is:

from typing import Dict

import tensorflow as tf
import tensorflow_transform as tft


def _get_serve_tf_examples_fn(model: tf.keras.models.Model,
                              transform_output: tft.TFTransformOutput):
    # Get the Transform graph from the Transform component.
    model.tft_layer = transform_output.transform_features_layer()

    @tf.function
    def serve_tf_examples_fn(serialized: tf.Tensor) -> Dict[str, tf.Tensor]:
        """Deserializes a batch of serialized examples and runs inference.

        Args:
            serialized: a batch of serialized image tensors (dtype tf.string).
        """
        feature_spec = transform_output.raw_feature_spec()
        # Remove the label spec; it is not present at serving time.
        feature_spec.pop("label")

        # Deserialize the image tensor.
        parsed_features = tf.io.parse_example(serialized, feature_spec)

        # Preprocess the example using the outputs of the Transform pipeline.
        transformed_features = model.tft_layer(parsed_features)
        outputs = model(transformed_features)
        return {"outputs": outputs}

    return serve_tf_examples_fn
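For context, the signature can be exercised locally with a sketch like the one below; the export path and the "image_raw" feature key are assumptions (the real key must match transform_output.raw_feature_spec()), and the image shape is a placeholder:

import numpy as np
import tensorflow as tf

# Hypothetical grayscale image and feature key.
image = np.random.rand(28, 28).astype(np.float32)
example = tf.train.Example(features=tf.train.Features(feature={
    "image_raw": tf.train.Feature(bytes_list=tf.train.BytesList(
        value=[tf.io.serialize_tensor(image).numpy()])),
}))

# Load the exported SavedModel and call the serving signature directly,
# passing a batch of serialized tf.Example protos as the "examples" input.
loaded = tf.saved_model.load("/tmp/saved_model/1")  # placeholder export path
serve_fn = loaded.signatures["serving_default"]
outputs = serve_fn(examples=tf.constant([example.SerializeToString()]))
print(outputs)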

The code segment above receives a serialized tensor of a grayscale image (simply a 2-D ndarray) and restores it to a 2-D tensor. After that, the model runs inference on this 2-D tensor.
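To illustrate just the deserialization step (in the code above the actual restoration happens inside model.tft_layer; the feature key and spec below are hypothetical stand-ins for transform_output.raw_feature_spec()):

import tensorflow as tf

# Hypothetical raw feature spec with a single bytes feature holding the image.
feature_spec = {"image_raw": tf.io.FixedLenFeature([], tf.string)}

# Build one serialized tf.Example containing a serialized 2-D tensor.
image_bytes = tf.io.serialize_tensor(tf.zeros([28, 28], tf.float32)).numpy()
example = tf.train.Example(features=tf.train.Features(feature={
    "image_raw": tf.train.Feature(bytes_list=tf.train.BytesList(value=[image_bytes])),
})).SerializeToString()

# parse_example turns a batch of serialized tf.Example protos into feature tensors;
# parse_tensor then recovers the original 2-D tensor from the stored bytes.
parsed = tf.io.parse_example(tf.constant([example]), feature_spec)
restored = tf.io.parse_tensor(parsed["image_raw"][0], out_type=tf.float32)
print(restored.shape)  # (28, 28)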

I would like to know how to send a serialized tensor to the REST port of the serving server.

Any help would be appreciated.
