Architecture for executing long-running jobs for a non-blocking node.js application in a distributed environment

Published 2024-12-05 01:14:38

I'm building an HTTP proxy in node.js. When an incoming request meets certain conditions, a long-running job is executed. While that job runs, all subsequent requests must wait for it to end (due to Node's single-threaded event loop):

function proxy(request, response) {
    if(isSpecial(request)) {
        // Long running job
    }
    // Proxy request
}
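To make the blocking concrete, here is a small self-contained sketch (not the proxy itself; the busy loop just stands in for the long-running job) showing that nothing else on the event loop runs until the synchronous work finishes:

```javascript
// A 10 ms timer cannot fire while synchronous work holds the event
// loop: it only runs once the ~100 ms busy loop below has finished.
var start = Date.now();
var firedAfter = -1;

setTimeout(function () {
  firedAfter = Date.now() - start;  // ~100 ms, not ~10 ms
  console.log('timer fired after ' + firedAfter + ' ms');
}, 10);

// Stand-in for the long-running job: ~100 ms of synchronous work.
while (Date.now() - start < 100) { /* busy wait */ }
```

Every request handled by the same process is delayed the same way, which is exactly the problem described above.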

This is not good.
So let's say the long-running job can be implemented in Java, and for this purpose I've built a Java server application that executes the long-running job in a separate thread every time the node application makes a request.

So, when the conditions are met, node.js makes a connection (TCP, HTTP, whatever) to the Java server. The Java server initializes a new Thread for the request, executes the long running job in this separate thread, and returns back, let's say, a JSON response (can be binary, whatever) that node can easily, asynchronously, handle:

var javaServer = initJavaServer(); // pseudo-code

function proxy(request, response) {
    var special = isSpecial(request);
    if (special) {
        var jobResponse;
        javaServer.request( ... );
        javaServer.addListener("data", function(chunk) {
            // Accumulate the response
            // jobResponse = ...
        });
        javaServer.addListener("end", function(jobResult) {
            doProxy(jobResponse, request, response);
        });
    } else {
        doProxy(null, request, response);
    }
}

In this way, I can execute long-running jobs for those requests that meet the conditions, without blocking the whole node application.

So here the requirements are:

  1. Speed
  2. Scalability of both apps (the node proxy runs on a cluster, and the Java app on another one)

Maybe a message broker like RabbitMQ could help (node pushes messages, Java subscribes to them and pushes the responses back).

Thoughts?


Comments (1)

风铃鹿 2024-12-12 01:14:38


Take a look at Q-Oper8 (https://github.com/robtweed/Q-Oper8), which is designed to provide a native Node.js solution to situations such as this.
