Rendering a NumPy array in FastAPI
I found How to return a numpy array as an image using FastAPI?, but I am still struggling to display the image; it shows up as just a white square.
I read an array into `io.BytesIO` like so:

```python
import io
import numpy as np

def iterarray(array):
    output = io.BytesIO()
    np.savez(output, array)   # savez writes an .npz (zip) archive to the buffer
    yield output.getvalue()
```
In my endpoint, I return `StreamingResponse(iterarray(), media_type='application/octet-stream')`.

When I leave `media_type` empty so that it is inferred, a zip file is downloaded.

How do I display the array as an image?

- Option 1 - Return image as bytes
- Using PIL
- Server side:
- Client side:
- Using OpenCV
- Server side:
- Client side:
- Useful Information
- Option 2 - Return image as JSON-encoded numpy array
- Using PIL
- Server side:
- Client side:
- Using OpenCV
- Server side:
- Client side:
Option 1 - Return image as bytes
The below examples show how to convert an image loaded from disk, or an in-memory image (in the form of a numpy array), into bytes (using either the `PIL` or `OpenCV` library) and return them using a custom `Response` directly. For the purposes of this demo, the below code is used to create the in-memory sample image (numpy array), which is based on this answer.
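A minimal sketch of such a sample image (the exact gradient pattern here is illustrative, not necessarily the original's):

```python
import numpy as np

# In-memory sample image: a 512x512 RGB uint8 array with simple gradients.
arr = np.zeros((512, 512, 3), dtype=np.uint8)
arr[:, :, 0] = np.linspace(0, 255, 512, dtype=np.uint8)            # red: left to right
arr[:, :, 1] = np.linspace(0, 255, 512, dtype=np.uint8)[:, None]   # green: top to bottom
```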
Using PIL
Server side:
You can load an image from disk using `Image.open`, or use `Image.fromarray` to load an in-memory image (Note: For demo purposes, when the case is loading the image from disk, the below demonstrates that operation inside the route. However, if the same image is going to be served multiple times, one could load the image only once at `startup` and store it to the `app` instance, as described in this answer and this answer). Next, write the image to a buffered stream, i.e., `BytesIO`, and use the `getvalue()` method to get the entire contents of the buffer. Even though the buffered stream is garbage collected when it goes out of scope, it is generally better to call `close()` or use the `with` statement, as shown here and in the example below.
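A minimal sketch of the server side (the `/image` path is an assumption for this demo):

```python
import io
import numpy as np
from PIL import Image
from fastapi import FastAPI, Response

app = FastAPI()

@app.get('/image')
def get_image():
    # In-memory demo image; use Image.open('path/to/image.png') to load from disk instead.
    arr = np.zeros((512, 512, 3), dtype=np.uint8)
    arr[:, :, 0] = np.linspace(0, 255, 512, dtype=np.uint8)
    im = Image.fromarray(arr)
    # Write the image to a buffered stream and get the buffer's entire contents.
    with io.BytesIO() as buf:
        im.save(buf, format='PNG')
        im_bytes = buf.getvalue()
    # 'inline' asks the browser to display the image (see Useful Information below).
    headers = {'Content-Disposition': 'inline; filename="image.png"'}
    return Response(im_bytes, headers=headers, media_type='image/png')
```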
Client side:
The below demonstrates how to send a request to the above endpoint using the Python requests module, and write the received bytes to a file, or convert the bytes back into a PIL `Image`, as described here.
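A sketch of the client (host/port and output filename are assumptions):

```python
import io
import requests
from PIL import Image

url = 'http://127.0.0.1:8000/image'  # assumed host/port and endpoint path
r = requests.get(url)

# Either write the received bytes to a file...
with open('image.png', 'wb') as f:
    f.write(r.content)

# ...or convert the bytes back into a PIL Image.
im = Image.open(io.BytesIO(r.content))
im.show()
```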
Using OpenCV
Server side:
You can load an image from disk using the `cv2.imread()` function, or use an in-memory image, which, if it is in `RGB` order (as in the example below), needs to be converted first, since OpenCV uses `BGR` as its default colour order for images. Next, use the `cv2.imencode()` function, which compresses the image data (based on the file extension you pass that defines the output format, i.e., `.png`, `.jpg`, etc.) and stores it in an in-memory buffer that is used to transfer the data over the network.
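A minimal sketch, mirroring the PIL example above (endpoint path assumed):

```python
import cv2
import numpy as np
from fastapi import FastAPI, Response

app = FastAPI()

@app.get('/image')
def get_image():
    # In-memory demo image in RGB order; cv2.imread('path/to/image.png')
    # would load from disk already in BGR order, needing no conversion.
    arr = np.zeros((512, 512, 3), dtype=np.uint8)
    arr[:, :, 0] = np.linspace(0, 255, 512, dtype=np.uint8)
    arr_bgr = cv2.cvtColor(arr, cv2.COLOR_RGB2BGR)  # OpenCV expects BGR
    # Compress the array into PNG format in an in-memory buffer.
    success, im_png = cv2.imencode('.png', arr_bgr)
    headers = {'Content-Disposition': 'inline; filename="image.png"'}
    return Response(im_png.tobytes(), headers=headers, media_type='image/png')
```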
Client side:
On the client side, you can write the raw bytes to a file, or use the `numpy.frombuffer()` function and the `cv2.imdecode()` function to decompress the buffer into an image format (similar to this); `cv2.imdecode()` does not require a file extension, as the correct codec will be deduced from the first bytes of the compressed image in the buffer.
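A sketch of the client side (URL assumed as before):

```python
import cv2
import numpy as np
import requests

url = 'http://127.0.0.1:8000/image'  # assumed host/port and endpoint path
r = requests.get(url)

# Either write the raw bytes to a file...
with open('image.png', 'wb') as f:
    f.write(r.content)

# ...or decompress the buffer back into an image; no file extension is
# needed, as the codec is deduced from the first bytes of the buffer.
arr = np.frombuffer(r.content, dtype=np.uint8)
img = cv2.imdecode(arr, cv2.IMREAD_UNCHANGED)
cv2.imshow('image', img)
cv2.waitKey(0)
cv2.destroyAllWindows()
```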
Useful Information
Since you noted that you would like the image displayed similar to a `FileResponse`, using a custom `Response` to return the bytes should be the way to do this, instead of using `StreamingResponse` (as shown in your question). To indicate that the image should be viewed in the browser, the HTTP response should include the following `Content-Disposition` header, as described here and as shown in the above examples (the quotes around the `filename` are required if the `filename` contains special characters).
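With `image.png` as an example filename:

```
Content-Disposition: inline; filename="image.png"
```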
Whereas, to have the image downloaded rather than viewed, use `attachment` instead of `inline`.
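Again with an example filename:

```
Content-Disposition: attachment; filename="image.png"
```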
If you would like to display (or download) the image using a JavaScript interface, such as the Fetch API or Axios, have a look at the answers here and here.
As for `StreamingResponse`: if the entire numpy array/image is already loaded into memory, `StreamingResponse` would not be necessary at all (and certainly should not be the preferred choice for returning data that is already loaded in memory to the client). `StreamingResponse` streams by iterating over the chunks provided by your `iter()` function. As shown in the implementation of the `StreamingResponse` class, if the iterator/generator you passed is not an `AsyncIterable`, a thread from the external threadpool (see this answer for more details on that threadpool) will be spawned to run the synchronous iterator you passed, using Starlette's `iterate_in_threadpool()` function, in order to avoid blocking the event loop. It should also be noted that the `Content-Length` response header is not set when using `StreamingResponse` (which makes sense, since `StreamingResponse` is supposed to be used when you don't know the size of the response beforehand), unlike the other `Response` classes of FastAPI/Starlette, which set that header for you, so that the browser will know where the data ends. It should be kept that way: if the `Content-Length` header is included (its value must match the overall response body size in bytes), then to the server a `StreamingResponse` would look the same as a `Response`, as the server would not use `transfer-encoding: chunked` in that case (even though at the application level the two would still differ); take a look at Uvicorn's documentation on response headers and MDN's documentation on `Transfer-Encoding: chunked` for further details. Even in cases where you know the body size beforehand but still need a `StreamingResponse` (as it allows you to load and transfer data in a chunk size of your choice, unlike `FileResponse`; see later on for more details), you should ensure not to set the `Content-Length` header on your own, e.g., `StreamingResponse(iterfile(), headers={'Content-Length': str(content_length)})`, as this would result in the server not using `transfer-encoding: chunked` (regardless of the application delivering the data to the web server in chunks, as shown in the relevant implementation).

As described in this answer:
Even if you would like to stream an image file that is saved on the disk, file-like objects, such as those created by `open()`, are normal iterators; thus, you could return them directly in a `StreamingResponse`, as described in the documentation and as shown below (if you find `yield from f` being rather slow when using `StreamingResponse`, please have a look at this answer on how to read the file in chunks with a chunk size of your choice, which should be set based on your needs, as well as your server's resources). It should be noted that using `FileResponse` would also read the file contents into memory in chunks, instead of the entire contents at once. However, as can be seen in the implementation of the `FileResponse` class, the chunk size used is pre-defined and set to 64KB. Thus, based on one's requirements, one should decide which of the two `Response` classes to use.
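A minimal sketch of that approach (file path and media type are assumptions):

```python
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.get('/file')
def get_file():
    def iterfile():
        # File-like objects are normal iterators, yielding the file line
        # by line; see the linked answer for fixed-size chunks instead.
        with open('image.png', mode='rb') as f:
            yield from f
    return StreamingResponse(iterfile(), media_type='image/png')
```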
Or, if the image was already loaded into memory instead, and then saved into a `BytesIO` buffered stream, since `BytesIO` is a file-like object (like all the concrete classes of the io module), you could return it directly in a `StreamingResponse` (or, preferably, simply call `buf.getvalue()` to get the entire image bytes and return them using a custom `Response` directly, as shown earlier). In case of returning the buffered stream, as shown in the example below, please remember to call `buf.seek(0)`, in order to rewind the cursor to the start of the buffer, as well as call `close()` inside a background task, in order to discard the buffer once the response has been sent to the client.
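A sketch of returning the buffered stream (the demo image and endpoint path are assumptions):

```python
import io
import numpy as np
from PIL import Image
from fastapi import BackgroundTasks, FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.get('/image')
def get_image(background_tasks: BackgroundTasks):
    # Image already loaded in memory, saved into a BytesIO buffered stream.
    arr = np.zeros((512, 512, 3), dtype=np.uint8)
    buf = io.BytesIO()
    Image.fromarray(arr).save(buf, format='PNG')
    buf.seek(0)  # rewind the cursor to the start of the buffer
    # Discard the buffer once the response has been sent to the client.
    background_tasks.add_task(buf.close)
    return StreamingResponse(buf, media_type='image/png')
```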
Thus, in your case scenario, the most suited approach would be to return a custom `Response` directly, including your custom `content` and `media_type`, as well as setting the `Content-Disposition` header, as described earlier, so that the image is viewed in the browser.

Option 2 - Return image as JSON-encoded numpy array
The below should not be used for displaying the image in the browser; it is rather added here for the sake of completeness, showing how to convert an image into a numpy array (preferably, using the `asarray()` function), then return the data in JSON format, and finally, convert the data back into an image on the client side, as described in this and this answer. For faster alternatives to the standard Python `json` library, see this answer.
Using PIL
Server side:
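A minimal sketch (the file path and endpoint are assumptions; the `data` key is arbitrary):

```python
import numpy as np
from PIL import Image
from fastapi import FastAPI

app = FastAPI()

@app.get('/image')
def get_image():
    # Convert the image into a numpy array (loading it from disk here).
    arr = np.asarray(Image.open('image.png'))
    # Nested lists are JSON-serializable, so return the array as a list.
    return {'data': arr.tolist()}
```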
Client side:
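And the matching client (URL and key as assumed above):

```python
import numpy as np
import requests
from PIL import Image

url = 'http://127.0.0.1:8000/image'  # assumed host/port and endpoint path
r = requests.get(url)

# Convert the JSON-encoded data back into a numpy array, then an image.
arr = np.asarray(r.json()['data'], dtype=np.uint8)
im = Image.fromarray(arr)
im.show()
```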
Using OpenCV
Server side:
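A minimal sketch (same assumptions as the PIL version):

```python
import cv2
from fastapi import FastAPI

app = FastAPI()

@app.get('/image')
def get_image():
    # cv2.imread() already returns a numpy array (in BGR colour order).
    arr = cv2.imread('image.png', cv2.IMREAD_UNCHANGED)
    return {'data': arr.tolist()}
```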
Client side:
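And the matching client:

```python
import cv2
import numpy as np
import requests

url = 'http://127.0.0.1:8000/image'  # assumed host/port and endpoint path
r = requests.get(url)

# Rebuild the numpy array from the JSON data and display it (BGR order).
arr = np.asarray(r.json()['data'], dtype=np.uint8)
cv2.imshow('image', arr)
cv2.waitKey(0)
cv2.destroyAllWindows()
```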