Streaming stdio to a website (e.g. Buildbot)

So I am trying to stream stdio into a webapp, similar to Buildbot. Does anyone know how Buildbot deals with stdio? It appears to be streamed, and that is exactly what I need for this program.

Does anyone know how to do this?

Any help would be greatly appreciated.

I am using Python with Django, by the way.

2 Answers

執念 2024-12-08 04:51:27

Buildbot uses Twisted, which has a rather different programming style from Django.

I'd say you'd have to write a service in Twisted to do what you're after. Communicate with it from Django and have it do the streaming part you need.

(In Django, as in most web apps, you have one thread/call per request that blocks on db/io calls. In Twisted, you defer async function calls to the library and continue in callbacks once the io is done. It feels a bit weird at first, but it's actually quite nice, especially for apps that aren't just web servers.)
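For reference, here is a minimal sketch of what such a Twisted service could look like, assuming you just want to relay one command's stdout to whoever connects over TCP. The port number, the example command, and all class names are illustrative assumptions, not Buildbot's actual code:

from twisted.internet import protocol, reactor

class StdoutRelay(protocol.ProcessProtocol):
    """Forwards the child process's stdout to every connected client."""
    def __init__(self, clients):
        self.clients = clients

    def outReceived(self, data):
        # Called asynchronously whenever the child writes to stdout.
        for client in self.clients:
            client.transport.write(data)

class StreamClient(protocol.Protocol):
    def connectionMade(self):
        self.factory.clients.append(self)

    def connectionLost(self, reason):
        self.factory.clients.remove(self)

class StreamFactory(protocol.Factory):
    protocol = StreamClient

    def __init__(self):
        self.clients = []

factory = StreamFactory()
reactor.listenTCP(8007, factory)  # Django (or anything else) can connect here
relay = StdoutRelay(factory.clients)
# env=None inherits the parent environment; the command is just an example
reactor.spawnProcess(relay, "/bin/sh", ["/bin/sh", "-c", "ping -c 5 localhost"], env=None)
reactor.run()

Your Django view would then connect to that port (or you could expose the stream over HTTP with twisted.web) and pass the data through to the browser.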

薄凉少年不暖心 2024-12-08 04:51:27

Django allows you to return iterators from views. You could return an iterator that incrementally reads the stream and yields it back to Django (see the Django docs on passing iterators: request-response/#passing-iterators).

Here is the simplest possible solution:

from django.core.servers.basehttp import FileWrapper  # use wsgiref.util.FileWrapper on newer Django
from django.contrib.auth.decorators import user_passes_test
from django.http import HttpResponse
from subprocess import Popen, PIPE

@user_passes_test(lambda u: u.is_superuser)  # SECURE THIS: it runs arbitrary shell commands
def stdout_cmd(request, command):
    # bufsize=1 makes the pipe line-buffered
    process = Popen(command, shell=True, bufsize=1, stdout=PIPE)
    # FileWrapper turns the stdout pipe into an iterator
    wrapper = FileWrapper(process.stdout)
    # text/plain gives monospaced output in the browser
    return HttpResponse(wrapper, content_type='text/plain')

As long as you can create a file-like object, you can wrap it in FileWrapper.
Be aware that if the output of your shell command is sporadic, you will want to create your own iterator that reads a line at a time, e.g.

wrapper = (line for line in process.stdout)  # a generator expression that yields one line at a time

NOTE: This will tie up worker threads, and too many concurrent long-poll pages will block other people from connecting to your server. Consider using gevent/greenlets for high-traffic environments.
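For completeness, here is a sketch of how a view like stdout_cmd might be wired up in urls.py, using the url() helper that existed through Django 3.x. The URL pattern and the myapp module name are assumptions, not part of the original answer:

# urls.py -- hypothetical wiring for the stdout_cmd view above
from django.conf.urls import url
from myapp.views import stdout_cmd

urlpatterns = [
    # everything after /stdout/ is captured and passed to the view as `command`
    url(r'^stdout/(?P<command>.+)$', stdout_cmd),
]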
