Streaming API vs REST API?

Posted 2024-11-06 05:11:35


The canonical example here is Twitter's API. I understand conceptually how the REST API works: essentially it's just a query to their server for your particular request, to which you then receive a response (JSON, XML, etc.), great.

However, I'm not exactly sure how a streaming API works behind the scenes. I understand how to consume it: with Twitter, for example, you listen for a response, then listen for data events in which the tweets arrive in chunks. You build the chunks up in a string buffer and wait for a line feed, which signifies the end of a tweet. But what are they doing to make this work?
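For illustration, a minimal sketch of that buffering pattern in Node.js, assuming a hypothetical newline-delimited JSON stream at a local URL rather than Twitter's real, authenticated endpoint:

```js
// Consume a newline-delimited JSON stream: accumulate chunks in a buffer
// and split out complete messages whenever a line feed arrives.
const http = require('http');

// Hypothetical local endpoint; Twitter's actual stream requires auth.
http.get('http://localhost:8080/stream', (res) => {
  let buffer = '';
  res.on('data', (chunk) => {
    buffer += chunk.toString('utf8');
    let newline;
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline); // one complete message
      buffer = buffer.slice(newline + 1);    // keep the remainder
      if (line.trim()) console.log(JSON.parse(line));
    }
  });
});
```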

Let's say I had a bunch of data and I wanted to set up a streaming API locally for other people on the net to consume (just like Twitter). How is this done, and with what technologies? Is this something Node.js could handle? I'm just trying to wrap my head around what they are doing to make this thing work.


Comments (1)

心房敞 2024-11-13 05:11:35


Twitter's streaming API is essentially a long-running request that's left open; data is pushed into it as and when it becomes available.

The repercussion of that is that the server has to be able to deal with lots of concurrent open HTTP connections (one per client). A lot of existing servers don't manage that well: for example, Java servlet engines assign one thread per request, which can (a) get quite expensive and (b) quickly hit the normal max-threads setting, preventing subsequent connections.

As you guessed, the Node.js model fits the idea of a streaming connection much better than, say, a servlet model does. Requests and responses are both exposed as streams in Node.js, but don't occupy an entire thread or process, which means you can keep pushing data into the stream for as long as it remains open without tying up excessive resources (although this is subjective). In theory you could have a lot of concurrent open responses connected to a single process and write to each one only when necessary.
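A minimal sketch of that idea, assuming newline-delimited JSON over plain HTTP and an in-memory set of open connections (an illustration, not Twitter's actual implementation):

```js
// Minimal streaming endpoint using only Node's built-in http module.
// Each client that connects is kept open; broadcast() pushes a line of
// JSON to every open connection as data becomes available.
const http = require('http');

const clients = new Set(); // open responses, one per connected client

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  clients.add(res);
  req.on('close', () => clients.delete(res)); // drop disconnected clients
});

// Push one newline-delimited JSON message to all open connections.
function broadcast(obj) {
  const line = JSON.stringify(obj) + '\n';
  for (const res of clients) res.write(line);
}

// Demo: emit a message every second, like tweets arriving.
setInterval(() => broadcast({ text: 'hello', at: Date.now() }), 1000);

server.listen(8080);
```

Because no Content-Length is set, Node falls back to chunked transfer encoding, so each response can stay open indefinitely while messages trickle out to it.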

If you haven't looked at them already, the HTTP docs for Node.js might be useful.

I'd also take a look at technoweenie's Twitter client to see what the consumer end of that API looks like with Node.js, the stream() function in particular.
