Will Node.js Promises consume too much memory?
I'm trying to analyze how efficiently Node.js handles async functions.
I have the Node.js script below initiate 10 million Promises, each of which sleeps for 2 seconds to simulate an intensive backend API call. The script ran for a while (~30 s), consumed up to 4096 MB of RAM, and then threw a "JavaScript heap out of memory"
error.
- Do Promises really consume that much memory?
- How can Node.js be good for I/O-intensive operations if it uses this much memory?
- Golang handles 100 million goroutines with only about 10 MB of memory; is Golang even better than Node.js at handling I/O-intensive operations?
const sleep = async (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const fakeAPICall = async (i) => {
  await sleep(2000);
  return i;
};

const NUM_OF_EXECUTIONS = 1e7;

console.time(`${NUM_OF_EXECUTIONS} executions:`);

[...Array(NUM_OF_EXECUTIONS).keys()].forEach((i) => {
  fakeAPICall(i).then((r) => {
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions:`);
    }
  });
});
ERROR
<--- Last few GCs --->
[41215:0x10281b000] 36071 ms: Mark-sweep (reduce) 4095.5 (4100.9) -> 4095.3 (4105.7) MB, 5864.0 / 0.0 ms (+ 1.3 ms in 2767 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 7190 ms) (average mu = 0.296, current mu = 0.[41215:0x10281b000] 44534 ms: Mark-sweep (reduce) 4096.3 (4104.7) -> 4096.3 (4105.7) MB, 8461.4 / 0.0 ms (average mu = 0.140, current mu = 0.000) allocation failure scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: MarkCompactCollector: young object promotion failed Allocation failed - JavaScript heap out of memory
1: 0x100098870 node::Abort() [/usr/local/opt/node@14/bin/node]
2: 0x1000989eb node::OnFatalError(char const*, char const*) [/usr/local/opt/node@14/bin/node]
3: 0x1001a6d55 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/local/opt/node@14/bin/node]
4: 0x1001a6cff v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/local/opt/node@14/bin/node]
5: 0x1002dea5b v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/usr/local/opt/node@14/bin/node]
6: 0x100316819 v8::internal::EvacuateNewSpaceVisitor::Visit(v8::internal::HeapObject, int) [/usr/local/opt/node@14/bin/node]
1 Answer
Node.js has a default heap memory limit, which can be raised with the
--max_old_space_size=<memory in MB>
node option.
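For example, to run a script with an 8 GB heap (script.js is just a placeholder filename):

node --max_old_space_size=8192 script.js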
As for whether Promises really consume that much memory: your estimate of 10 million isn't even close. There are about 50 million of them.
In each iteration you actually create at least five promises and one generator (roughly: the promise returned by fakeAPICall, the promise returned by the async sleep wrapper, the explicit new Promise inside it, the internal promise allocated by await, and the promise returned by .then()), so you end up with about 50 million promises plus a huge number of other objects in memory. That is a lot: promises are plain JS objects implemented in JS, and they naturally consume more memory than the lightweight primitives of low-level, precompiled languages. Node.js is not about low memory consumption, and in your case memory becomes the bottleneck.
Promises are built for ease of use; if you need to optimize memory, plain callbacks can be cheaper.
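For illustration (a sketch, not code from the answer), a callback-only variant of the benchmark above allocates no promise objects at all, although each pending timer and closure still costs memory:

// Callback-based equivalent of fakeAPICall: only timers and plain closures,
// no promise objects are created.
const NUM_OF_EXECUTIONS = 1e7;

const fakeAPICallCb = (i, done) => {
  setTimeout(() => done(i), 2000);
};

console.time(`${NUM_OF_EXECUTIONS} executions:`);
for (let i = 0; i < NUM_OF_EXECUTIONS; i++) {
  fakeAPICallCb(i, (r) => {
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions:`);
    }
  });
}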
Here we create 10M promises:
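A minimal sketch of such a measurement (an assumed reconstruction, not necessarily the original snippet; process.memoryUsage() is the standard Node.js heap-statistics API):

// Allocate 10 million pending promises and report V8 heap usage.
// May need a raised heap limit, e.g.: node --max_old_space_size=4096 promises.js
const NUM = 1e7;
const promises = new Array(NUM);

for (let i = 0; i < NUM; i++) {
  // The executor does nothing, so each promise stays pending forever;
  // we only care about its allocation cost.
  promises[i] = new Promise(() => {});
}

const mb = process.memoryUsage().heapUsed / 1024 / 1024;
console.log(`${NUM} promises allocated, heap used: ${mb.toFixed(0)} MB`);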
Memory: ~2.6 GB.