Increasing the PHP memory limit. At what point does it become crazy?

Posted 2024-08-04 22:56:17


In a system I am currently working on, there is one process that loads a large amount of data into an array for sorting/aggregating/whatever. I know this process needs optimising for memory usage, but in the short term it just needs to work.

Given the amount of data loaded into the array, we keep hitting the memory limit. It has been increased several times, and I am wondering: is there a point where increasing it becomes a generally bad idea, or is it only a matter of how much RAM the machine has?

The machine has 2GB of RAM and the memory_limit is currently set at 1.5GB. We can easily add more RAM to the machine (and will anyway).

Have others encountered this kind of issue? And what were the solutions?


3 Answers

你的往事 2024-08-11 22:56:17


The configuration for the memory_limit of PHP running as an Apache module to serve webpages has to take into consideration how many Apache processes you can have at the same time on the machine -- see the MaxClients configuration option for Apache.

If MaxClients is 100 and you have 2,000 MB of RAM, a very quick calculation shows that you should not use more than 20 MB for the memory_limit value (because 20 MB * 100 clients = 2 GB of RAM, i.e. the total amount of memory your server has).

And this is without considering that there are probably other things running on the same server, like MySQL or the system itself... and that Apache is probably already using some memory for itself.

Of course, this is also a "worst case scenario", which assumes that each PHP page uses the maximum amount of memory it can.
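As a quick back-of-the-envelope sketch of that calculation (the reserved figure below is a made-up allowance, not a measurement):

<?php
// Worst-case memory budget per Apache/PHP process (illustrative numbers only).
$totalRamMb = 2000;  // total RAM on the machine, in MB
$reservedMb = 500;   // hypothetical allowance for MySQL, the OS, Apache itself, ...
$maxClients = 100;   // Apache's MaxClients setting

// Worst case assumes every client runs PHP at its full memory_limit.
$budgetMb = ($totalRamMb - $reservedMb) / $maxClients;
printf("memory_limit should stay below roughly %dM per process\n", $budgetMb);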

In your case, if you need such a big amount of memory for only one job, I would not increase the memory_limit for PHP running as an Apache module.

Instead, I would launch that job from the command line (or via a cron job), and specify a higher memory_limit specifically in this one and only case.

This can be done with the -d option of php, like:

$ php -d memory_limit=1GB temp.php
string(3) "1GB"

Considering, in this case, that temp.php only contains:

var_dump(ini_get('memory_limit'));

In my opinion, this is way safer than increasing the memory_limit for the PHP module for Apache -- and it's what I usually do when I have a large dataset, or some really heavy stuff I cannot optimize or paginate.
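For the cron variant, a hypothetical crontab entry (the script path and the 1.5GB figure are made up to match the question) could look like this:

# Run the heavy batch nightly at 2am, with a raised memory_limit for this job only
0 2 * * * php -d memory_limit=1536M /path/to/heavy_batch.php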

If you need to define several values for the PHP CLI execution, you can also tell it to use another configuration file instead of the default php.ini, with the -c option:

php -c /etc/phpcli.ini temp.php

That way, you have:

  • /etc/php.ini for Apache, with a low memory_limit, a low max_execution_time, ...
  • and /etc/phpcli.ini for batches run from the command line, with virtually no limits

This ensures your batches will be able to run -- and you'll still have security for your website (memory_limit and max_execution_time being security measures).
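A minimal sketch of what such a /etc/phpcli.ini could contain (the exact values are assumptions; note that memory_limit = -1 means no limit, and max_execution_time = 0 is already the CLI default):

; /etc/phpcli.ini -- used only for command-line batch jobs
memory_limit = -1          ; no memory cap for batches
max_execution_time = 0     ; no time limit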

Still, if you have the time to optimize your script, you should; for instance, in that kind of situation where you have to deal with lots of data, pagination is a must-have ;-)
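To make the pagination idea concrete, here is a minimal sketch (the table, columns, and credentials are hypothetical) that processes a big table in fixed-size batches instead of loading every row into one array:

<?php
// Hypothetical example: aggregate a large table 1,000 rows at a time.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'password');
$batchSize = 1000;
$offset = 0;
$total = 0;

do {
    // Integers are formatted with %d before interpolation, so this stays safe.
    $sql = sprintf('SELECT amount FROM orders ORDER BY id LIMIT %d OFFSET %d',
                   $batchSize, $offset);
    $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        $total += $row['amount'];  // aggregate this batch, then let it go
    }

    $offset += $batchSize;
} while (count($rows) === $batchSize);

echo "Total: $total\n";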

伪装你 2024-08-11 22:56:17


Have you tried splitting the dataset into smaller parts and processing only one part at a time?

If you fetch the data from a file on disk, you can use the fread() function to load smaller chunks, or some sort of unbuffered query in the case of a database.
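A minimal sketch of the fread() approach (the file name and chunk size are arbitrary); only one chunk is ever held in memory at a time:

<?php
// Hypothetical example: stream a large file in 8KB chunks instead of
// loading the whole thing with file_get_contents().
$handle = fopen('/path/to/big-data.csv', 'rb');
if ($handle === false) {
    die("Could not open file\n");
}

while (!feof($handle)) {
    $chunk = fread($handle, 8192);  // at most 8KB in memory at once
    // ... parse/aggregate $chunk here, writing intermediate results out
}

fclose($handle);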

I haven't checked up on PHP since v3-something, but you could also use some form of cloud computing. A 1GB dataset seems big enough to be worth processing on multiple machines.

云归处 2024-08-11 22:56:17


Given that you know there are memory issues with your script that need fixing, and that you are only looking for short-term solutions, I won't address how to go about profiling and solving your memory issues. It sounds like you're going to get to that.

So, I would say the main things you have to keep in mind are:

  • Total memory load on the system
  • OS capabilities

PHP is only one small component of the system. If you allow it to eat up a vast quantity of your RAM, then the other processes will suffer, which could in turn affect the script itself. Notably, if you are pulling a lot of data out of a database, then your DBMS might require a lot of memory in order to create result sets for your queries. As a quick fix, you might want to identify any queries you are running and free the results as soon as possible, to give yourself more memory for a long job run.
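As a sketch of that quick fix (assuming mysqli; the connection details and query are placeholders), free each result set as soon as you are done with it rather than keeping it around for the rest of the job:

<?php
// Hypothetical example: release result-set memory as soon as possible.
$db = new mysqli('localhost', 'user', 'password', 'app');

$result = $db->query('SELECT id, amount FROM orders');
while ($row = $result->fetch_assoc()) {
    // ... keep only the aggregates you need, not the raw rows
}
$result->free();  // hand the buffered result-set memory back immediately
unset($result);   // and drop the PHP reference too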

In terms of OS capabilities, you should keep in mind that 32-bit systems, which you are likely running on, can only address up to 4GB of RAM without special handling. Often the limit can be much less depending on how it's used. Some Windows chipsets and configurations can actually have less than 3GB available to the system, even with 4GB or more physically installed. You should check to see how much your system can address.

You say that you've increased the memory limit several times, so obviously this job is growing larger and larger in scope. If you're up to 1.5GB, then even installing 2GB more RAM sounds like it will just be a short reprieve.

Have others encountered this kind of issue? And what were the solutions?

I think you probably already know that the only real solution is to break down and spend the time to optimize the script soon, or you'll end up with a job that will be too big to run.
