How a Next.js application behaves under heavy traffic

Published on 2025-01-16 19:16:12


I am developing a Next.js application. It will be deployed on Azure, where I will have Node.js installed and will run the next start -p 4000 command.

What I would like to know is: how does Next.js handle heavy traffic? Namely, if something like 20k users are going through my site, is that something Next.js can handle out of the box, or should I dockerize and orchestrate multiple Node.js Docker images running multiple Next.js applications?

Or does Next.js serve static files to my CDN, so that I do not have to care about the traffic load on the Node.js process where the Next.js server is running?

Hope my question makes sense.


Comments (1)

不及他 2025-01-23 19:16:12


No magic number

There is no fixed capacity number that can be pulled out of a hat. Next.js apps, and Node.js apps in general, are pretty efficient at handling many simultaneous connections, but how heavy your load is depends on your site. How many simultaneous connections you can "handle" also depends on how much latency you find acceptable. For example, your server may be able to handle 40k simultaneous requests with 1 second of latency, but only 5k simultaneous requests with 100 ms of latency.

Factors affecting capacity

How much traffic your server can handle will depend on things like:

  • Amount of IO your server does. This includes data being sent to browsers, as well as data being read from disk or from a database. If you have a lot of static content (e.g. large images, videos) being served, this will probably start to limit you.
  • Amount of processing your server does. This is how much code needs to run on every API call. Usually this is pretty low and most servers are IO-bound, but sometimes there is a lot of processing (e.g. retrieving a large data set from the database and transforming it).
  • Processing capacity of the machine upon which your server runs. All of your processing will be slower on a slower machine (fewer gigahertz means slower), so the processing you do (described above) will take longer to run, which means you will block new connections for longer, which lowers the capacity of your server (see the sketch after this list).
  • IO speed of the machine upon which your server runs. This includes disk speed if your server does any disk access, otherwise it's mostly about network speed. It's 2022 so network speed will rarely be what's limiting your app anymore, so unless you're doing disk access, then ignore this point.
  • Number of connections supported by your OS. Every OS has a built-in hard limit (the maximum that cannot be changed) and sometimes also a soft limit (a default limit which can be increased).
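
To make the processing point above concrete, here is a minimal sketch (plain Node.js HTTP rather than Next.js; the route name and loop size are made up for illustration) of how CPU-bound work in one request handler stalls every other connection on the single JavaScript thread:

```ts
// Minimal sketch: while the synchronous loop below runs, the single
// JavaScript thread cannot accept or answer any other request, so one
// CPU-heavy handler drags down the latency of every concurrent connection.
import { createServer } from 'node:http';

const server = createServer((req, res) => {
  if (req.url === '/cpu') {
    let sum = 0;
    for (let i = 0; i < 1_000_000_000; i++) sum += i; // blocks the event loop
    res.end(`cpu done: ${sum}`);
  } else {
    res.end('fast path'); // queues up behind any /cpu request in progress
  }
});

server.listen(4000);
```

Requests to the fast path simply wait until the /cpu loop finishes, which is exactly the capacity loss described in that bullet.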

Estimating capacity

Your dev machine should, theoretically, be slower than your production server, so you can get a lower bound on the capacity of your server by load testing it. You can use tools like Autocannon or Loadtest to ballpark your capacity. If you start with only a few simultaneous connections and ramp up, you should reach a point where you see the latency suddenly increase (latency should be more or less consistent until then). This is when you are starting to hit a limit.
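
As one way to run that ramp-up, autocannon also has a programmatic API; the sketch below steps through increasing connection counts (the URL, counts, and step duration are placeholders for your own setup, and the printed fields come from autocannon's result object):

```ts
// Rough capacity ramp-up using autocannon's programmatic API
// (npm install autocannon). All numbers are illustrative.
import autocannon from 'autocannon';

async function ramp() {
  for (const connections of [10, 50, 100, 250, 500, 1000]) {
    const result = await autocannon({
      url: 'http://localhost:4000', // the next start -p 4000 server
      connections,                  // simultaneous connections for this step
      duration: 30,                 // seconds per step
    });
    console.log(
      `${connections} conns -> avg ${result.latency.average} ms, ` +
        `p99 ${result.latency.p99} ms, ${result.requests.average} req/s`
    );
  }
}

ramp();
```

The step at which average or p99 latency jumps is the limit described above.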

Expanding your capacity

Threadpool

Node.js runs your JavaScript on a single thread, but certain asynchronous operations (file system access, DNS lookups, some crypto work, etc.) are handed off to the libuv thread pool behind the scenes. When that pool is full, Node.js has to wait for a thread to become available before another such task can be started, which slows everything down.
The default thread pool size in Node.js is quite small (4 threads), so increasing it via the UV_THREADPOOL_SIZE environment variable can be quite beneficial.
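
One way to see the pool cap in action (illustrative only) is to fire more pool-bound tasks than there are threads; crypto.pbkdf2 is one of the operations that runs on the libuv pool, so a script like the following behaves very differently when started with UV_THREADPOOL_SIZE=4 versus UV_THREADPOOL_SIZE=16:

```ts
// With the default pool of 4 threads the tasks complete in batches of 4;
// a larger UV_THREADPOOL_SIZE lets more of them run in parallel.
import { pbkdf2 } from 'node:crypto';

const start = Date.now();
for (let i = 1; i <= 16; i++) {
  pbkdf2('password', 'salt', 100_000, 64, 'sha512', (err) => {
    if (err) throw err;
    console.log(`task ${i} finished after ${Date.now() - start} ms`);
  });
}
```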

Other concerns

Because you specifically mentioned Docker in your question, remember that Docker is only a deployment strategy, and does not by itself help alleviate any load. If you're bound by a threadpool limit, then load balancing to multiple Docker instances on the same machine will speed up your process until you hit one of the other caps. If you're already CPU-bound or IO-bound, then multiple instances running on the same server won't help. At this point you'll need to either vertically scale your server machine or add more machines.
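
If you do end up running several Node.js processes on one machine, Docker is not the only route. As a rough sketch (not specific to next start, and only worth doing if you are not already CPU- or IO-bound), Node's built-in cluster module forks one worker per core behind a shared listening socket; process managers such as PM2 or separate containers behind a load balancer achieve the same effect:

```ts
// Sketch: the primary process forks one worker per CPU core; the workers
// share the listening socket, so connections are spread across processes.
import cluster from 'node:cluster';
import { cpus } from 'node:os';
import { createServer } from 'node:http';

if (cluster.isPrimary) {
  for (let i = 0; i < cpus().length; i++) cluster.fork();
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} exited, starting a new one`);
    cluster.fork();
  });
} else {
  // In a real deployment this worker would start your app server (for
  // Next.js that means a custom server); a plain HTTP server stands in here.
  createServer((req, res) => {
    res.end(`handled by worker ${process.pid}`);
  }).listen(4000);
}
```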
