Why is this JavaScript blocking in Node.js?
I have the following simple http server using Node.js:
var http = require('http');
var server = http.createServer(function(req, res) {
    var counter = 0;
    for (var i = 1; i <= 30; i++) {
        http.get({ host: "www.google.com" }, function(r) {
            counter++;
            res.write("Response " + counter + ": " + r.statusCode + "\n");
            if (counter == 30) res.end();
        });
    }
});
server.listen(8000);
When I curl into my local host on port 8000, I do get the expected result of:
Response 1: 200
Response 2: 200
Response 3: 200
...
Response 30: 200
But when I curl from another terminal while the first request is still in progress, that second request hangs and waits for the first one to finish entirely before it starts receiving the same output.
My understanding was that since this is async code using callbacks, Node could handle multiple requests concurrently by processing them on the next tick of the event loop. In fact, I even watched a video of Ryan Dahl doing something similar with a hello world example. What's in my code that's making the server block?
3 Answers
Your issue doesn't have anything to do with blocking calls; it has to do with the fact that you are only able to open a certain number of connections at a time to a single host. Once you hit the maximum number of open connections, the other asynchronous calls to http.get have to wait until the number of open connections falls again, which happens when the other requests complete and their callbacks are fired. Since you're creating new requests faster than they drain, you get your seemingly blocking results.

Here is a modified version of your program I created to test this. (Note that there is an easier way to solve your problem, as indicated by mtomis; more on this below.) I added some console.log logging so it is easier to tell what order things were being processed in; I also reject all requests for anything other than /, so that favicon.ico requests are ignored. Finally, I make requests to many different websites.
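A minimal sketch of what such a modified server might look like, based only on the description above (the host list, logging format, and 404 handling are illustrative assumptions; the point is simply that requests go to several different hosts and their ordering is logged):

var http = require('http');

// Illustrative host list; the answer only says "many different websites".
var hosts = ["www.google.com", "www.yahoo.com", "www.bing.com",
             "www.wikipedia.org", "www.amazon.com", "www.craigslist.org"];

var server = http.createServer(function(req, res) {
    // Ignore favicon.ico and anything else that isn't "/".
    if (req.url !== "/") {
        res.writeHead(404);
        return res.end();
    }

    var counter = 0;
    hosts.forEach(function(host) {
        console.log("Starting request to " + host);
        http.get({ host: host }, function(r) {
            counter++;
            console.log("Received response " + counter + " from " + host +
                        " with status " + r.statusCode);
            res.write("Response " + counter + ": " + r.statusCode + "\n");
            if (counter === hosts.length) res.end();
        });
    });
});

server.listen(8000);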
I ran this program and very quickly visited the page in two different browsers (I also flushed my DNS cache, as the test was running too quickly to get good output otherwise). In the resulting output, other than the period of time it took me to hit Alt+Tab Enter, the callbacks were completely intermingled: asynchronous, non-blocking I/O at its finest.

[Edit] As mtomis mentioned, the maximum number of connections you can have open per host is configurable via the global http.globalAgent.maxSockets. Simply set this to the number of concurrent connections you want to be able to handle per host, and the issue you observed disappears.
Node.js has a limit on client connections per host (by default 5 connections per host), as documented here: http://nodejs.org/docs/v0.5.4/api/http.html#agent.maxSockets

The reason your second curl process hangs until the first one is finished is that the first process queues 30 requests, only 5 of which can be handled at the same time, so the 30 requests from the second process cannot be handled until the first ones are complete. In your example, if you set http.globalAgent.maxSockets = 60; then the calls will be handled concurrently.
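A minimal sketch of how that setting could be applied to the server from the question (the value 60 comes from this answer; the 5-per-host default applies to the Node versions discussed here):

var http = require('http');

// Raise the per-host connection limit before issuing any requests;
// in the Node versions discussed here the default was 5 per host.
http.globalAgent.maxSockets = 60;

var server = http.createServer(function(req, res) {
    var counter = 0;
    for (var i = 1; i <= 30; i++) {
        http.get({ host: "www.google.com" }, function(r) {
            counter++;
            res.write("Response " + counter + ": " + r.statusCode + "\n");
            if (counter == 30) res.end();
        });
    }
});

server.listen(8000);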
Well, I don't think you're actually spawning the requests off into something that can be called back. You've only got one event handler, and it's running a loop all in a row.

Can you find where Ryan Dahl gave that talk?