SageMaker endpoint with TensorFlow container ignores the inference.py file
I'm using a TensorFlow model which is saved as follows:
tf-model/
  00000123/
    assets/
    variables/
      variables.data-00000-of-00001
      variables.index
    keras_metadata.pb
    saved_model.pb
The TF model is getting picked up and it's working as an endpoint: when I associate a predictor object inside of SageMaker with the endpoint and run it, it returns what I expect.
However, I want to run inference on it with a POST of JSON and get JSON back, the same as with sklearn, XGBoost, or PyTorch endpoints.
I tried to implement this on the inference.py which I'm passing as an entry point, but no matter what I try, the endpoint just seems to ignore the inference.py script.
I use the script almost exactly as is given at the end of this page: https://sagemaker.readthedocs.io/en/stable/frameworks/tensorflow/using_tf.html
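The script on that docs page defines a pair of handlers along these lines (a condensed sketch of the documented example, not my exact file): `input_handler` runs before the request is forwarded to TensorFlow Serving, and `output_handler` runs on the way back.

```python
# Sketch of the paired handlers from the SageMaker TensorFlow Serving docs.
# The container calls input_handler on the raw request before forwarding it
# to TF Serving, and output_handler on TF Serving's response.

def input_handler(data, context):
    """Pre-process the incoming request body."""
    if context.request_content_type == "application/json":
        body = data.read().decode("utf-8")
        # Pass through: TF Serving already accepts {"instances": [...]} JSON.
        return body if len(body) else ""
    raise ValueError(
        "Unsupported content type: {}".format(context.request_content_type)
    )

def output_handler(data, context):
    """Post-process the TF Serving response."""
    if data.status_code != 200:
        raise ValueError(data.content.decode("utf-8"))
    response_content_type = context.accept_header
    prediction = data.content
    return prediction, response_content_type
```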
I've tried both versions (separate input/output handlers and a single handler), I've tried leaving it in the SageMaker environment, I've tried packaging it up in the tar.gz file, and I've put it in an S3 bucket and set up pointers and environment variables (via TensorFlowModel kwargs) to it, but no matter what I try it just ignores the inference.py.
I know it ignores it because I have made minor edits to the application/json branch of the input handler and none of these edits show up; even when I change it so it only accepts text/csv or something else, the changes are not reflected.
What am I doing wrong? How can I get the TensorFlow serverless environment to return a POST of the output instead of saving the output to S3 as per its default behaviour?
Answers (1)
Your untarred model artifacts should look something like this (where inference.py is located in the code/ folder):
model.tar.gz
  00000123/
    saved_model.pb
    variables/
  code/
    inference.py
    requirements.txt
Kindly see this link for more information
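If your artifacts are not laid out that way, one fix is to pass the script explicitly when constructing the model, so the SDK repackages it into the code/ folder for you. A minimal sketch, where the S3 URI, role, and framework_version are placeholders you would replace with your own:

```python
# Sketch: creating the model with an explicit entry_point so the SDK
# repackages inference.py into the code/ folder of model.tar.gz.
# The S3 URI, role, instance type, and framework_version are placeholders.

def deploy_with_entry_point(role, model_data_s3_uri):
    # Imported lazily so the sketch loads without the sagemaker SDK installed.
    from sagemaker.tensorflow import TensorFlowModel

    model = TensorFlowModel(
        model_data=model_data_s3_uri,  # s3://<bucket>/.../model.tar.gz
        role=role,
        framework_version="2.8",       # match the version used to train
        entry_point="inference.py",    # shipped into code/ on deploy
    )
    return model.deploy(
        initial_instance_count=1,
        instance_type="ml.m5.large",
    )
```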
That being said, application/json is supported by default for requests and responses by the SageMaker TensorFlow Serving container. Thus, you can send JSON in (with the correct tensor shape your model expects) and receive a JSON response.
This question/answer here explains how to make the POST requests using Postman.