Do Node.js promises consume too much memory?

Posted 2025-02-06 16:15:57

I'm trying to analyze how effective Node.js is at handling async functions.

I have the Node.js script below, which initiates 10 million Promises that each sleep for 2 seconds to simulate intensive backend API calls. The script ran for a while (~30 s), consumed up to 4096 MB of RAM, and threw a JavaScript heap out of memory error.

  1. Do Promises really consume that much memory?
  2. How can Node.js be good for I/O-intensive operations when it uses this much memory?
  3. Golang only uses about 10 MB of memory to handle 100 million goroutines; is Golang better than Node.js at handling I/O-intensive operations?
const sleep = async (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const fakeAPICall = async (i) => {
  await sleep(2000);
  return i;
};

const NUM_OF_EXECUTIONS = 1e7;
console.time(`${NUM_OF_EXECUTIONS} executions:`);

[...Array(NUM_OF_EXECUTIONS).keys()].forEach((i) => {
  fakeAPICall(i).then((r) => {
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions:`);
    }
  });
});

ERROR

<--- Last few GCs --->

[41215:0x10281b000]    36071 ms: Mark-sweep (reduce) 4095.5 (4100.9) -> 4095.3 (4105.7) MB, 5864.0 / 0.0 ms  (+ 1.3 ms in 2767 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 7190 ms) (average mu = 0.296, current mu = 0.
[41215:0x10281b000]    44534 ms: Mark-sweep (reduce) 4096.3 (4104.7) -> 4096.3 (4105.7) MB, 8461.4 / 0.0 ms  (average mu = 0.140, current mu = 0.000) allocation failure scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: MarkCompactCollector: young object promotion failed Allocation failed - JavaScript heap out of memory
 1: 0x100098870 node::Abort() [/usr/local/opt/node@14/bin/node]
 2: 0x1000989eb node::OnFatalError(char const*, char const*) [/usr/local/opt/node@14/bin/node]
 3: 0x1001a6d55 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/local/opt/node@14/bin/node]
 4: 0x1001a6cff v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/local/opt/node@14/bin/node]
 5: 0x1002dea5b v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/usr/local/opt/node@14/bin/node]
 6: 0x100316819 v8::internal::EvacuateNewSpaceVisitor::Visit(v8::internal::HeapObject, int) [/usr/local/opt/node@14/bin/node]

Answered by 四叶草在未来唯美盛开 on 2025-02-13 16:15:57


Node.js has a default memory limit, which can be changed with the --max_old_space_size=<memory in MB> node option.
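
As a quick reference, here is a minimal sketch (my addition, not part of the original answer) that prints the configured heap limit using the built-in v8 module; you could run it with default settings and again with a raised limit, e.g. node --max_old_space_size=8192 check-heap.js, to confirm the change took effect:

const v8 = require('v8');

// heap_size_limit is reported in bytes; convert to MB for readability
const limitMb = v8.getHeapStatistics().heap_size_limit / 1024 / 1024;
console.log(`V8 heap limit: ${limitMb.toFixed(0)} MB`);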

"I have the Node.js script below to initiate 10 million Promises"

Not even close. There are about 50 million of them.

const sleep = async (ms) => { // redundant async - Promise#1
  return new Promise((resolve) => setTimeout(resolve, ms)); // Promise#2
}

const fakeAPICall = async (i) => { // async - Promise#3
  await sleep(2000); // await - Promise#4
  return i;
};

const NUM_OF_EXECUTIONS = 1e7;

console.time(`${NUM_OF_EXECUTIONS} executions:`);

for (let i = 0; i < NUM_OF_EXECUTIONS; i++) {
  fakeAPICall(i).then((r) => { // then - Promise#5
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions:`);
    }
  });
}

In each iteration you actually create at least 5 promises and one generator, so you end up with about 50 million promises and a huge number of other objects in memory. That is a lot, because these are plain JS objects and of course they consume more memory than the constructs of low-level precompiled languages. Node.js is not about low memory consumption, and in your case memory becomes the bottleneck.
Promises are made for ease of use; if you need to optimize memory, plain callbacks can be cheaper, as sketched below.
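
For illustration, a hypothetical callback-only version (not from the original answer) allocates one timer and one callback per iteration, but no Promise objects at all:

const NUM_OF_EXECUTIONS = 1e7;

console.time(`${NUM_OF_EXECUTIONS} executions`);

// Plain callback version: setTimeout forwards `i` straight to the callback,
// so no Promise, async wrapper or .then() chain is created per call.
const fakeAPICall = (i, cb) => setTimeout(cb, 2000, i);

for (let i = 0; i < NUM_OF_EXECUTIONS; i++) {
  fakeAPICall(i, (r) => {
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions`);
    }
  });
}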

Here we create only 10 million promises (5,000,000 iterations, two promises each):

const NUM_OF_EXECUTIONS = 5_000_000;

console.log(`Start `, NUM_OF_EXECUTIONS);

const sleep = (ms, i) => new Promise((resolve) => setTimeout(resolve, ms, i)); // Promise#1

console.time(`${NUM_OF_EXECUTIONS} executions`);

for (let i = 0; i < NUM_OF_EXECUTIONS; i++) {
  sleep(2000, i).then((r) => { // then - Promise#2
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions`);
    }
  });
}

Memory (2.6 GB):

Start  5000000
{
  rss: '2.72 GB',
  heapTotal: '2.68 GB',
  heapUsed: '2.6 GB',
  external: '308 kB',
  arrayBuffers: '10.4 kB'
}
5000000 executions: 24.776s

Process finished with exit code 0
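
The report above looks like formatted process.memoryUsage() output; a small helper along these lines (my own sketch, assuming that is how the numbers were gathered) reproduces that kind of summary:

// Crude human-readable formatting, enough to mirror the report above.
const formatBytes = (bytes) => {
  if (bytes >= 1e9) return `${(bytes / 1e9).toFixed(2)} GB`;
  if (bytes >= 1e6) return `${(bytes / 1e6).toFixed(2)} MB`;
  return `${(bytes / 1e3).toFixed(1)} kB`;
};

const reportMemory = () => {
  const usage = process.memoryUsage(); // rss, heapTotal, heapUsed, external, arrayBuffers (bytes)
  const formatted = Object.fromEntries(
    Object.entries(usage).map(([key, value]) => [key, formatBytes(value)])
  );
  console.log(formatted);
};

reportMemory(); // call after scheduling the promises to capture usage at that point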
