My shared server's global php.ini limits cron jobs' max_execution_time to 30 seconds, so I can't import a large file

Posted 2024-10-08 11:29:45


I am trying to set up a cron job that imports products and categories into my database. I already have a script that does this job, but my hosting company has limited the GLOBAL max_execution_time for cron to 30 seconds. This does not apply to the USER php.ini (used for browser requests), where max_execution_time is already set to 4000.
For this reason, when I run the script via the browser I can successfully import up to 5000 products without any timeout, but if I use cron it stops after 30 seconds and imports just 400 products.
I tried adding ini_set('max_execution_time', ...), but it doesn't work; the server limitation kills the script after 30 seconds.
I cannot split the file into many small ones, because the file is updated every night, and doing that every day would be crazy.
I could run several jobs (30+30+30 seconds), but since the script HAS to replace the old product quantities and details with new ones, I cannot tell it not to override the existing products.
I am going crazy over this. It is extremely important that the products are updated every day, and we cannot do this manually.
Since we won't buy a VPS or a dedicated server just to load a 2 MB file, what other kind of solution could we try? How can we bypass this limitation? Can I schedule cronimport.php some other way instead of using cron, or maybe call it with cron through another script? Really confused :)
I am sorry for my poor English.

Thank you!
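One thing the question itself hints at ("calling it with cron through another script") can be sketched as a crontab entry that fetches the import script over HTTP instead of running it through the PHP CLI. Because the request then goes through the web server, the USER php.ini (max_execution_time = 4000) applies rather than the 30-second cron cap. The URL, schedule, and token below are assumptions for illustration, not values from the post:

```shell
# Hypothetical crontab entry: call cronimport.php through the web server
# every night at 03:00 so the browser-side time limit applies.
# A secret token keeps random visitors from triggering the import.
0 3 * * * curl -s --max-time 4200 "https://example.com/cronimport.php?token=CHANGE_ME" > /dev/null 2>&1
```

Whether this works depends on the host also enforcing a request timeout at the web-server level (e.g. FastCGI or proxy timeouts), so it is worth a quick test before relying on it.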


Comments (1)

耳根太软 2024-10-15 11:29:45


If this is mission-critical functionality, first and foremost, I'd look for somewhere to host it that doesn't impose such limitations.

Where are the products and categories coming from? What's the essence of your script's functionality? Can you summarise?

When a cron job has a limited window to execute, the general rule of thumb is to only feed it a little bit at a time, i.e. give it ten rows to process, and mark those rows as processed, so the next cycle picks up where it left off. If you can't extend your time limit, you'll need to find a way to do this. Tell us what your code is actually doing, and perhaps we can suggest a way.
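The batching idea above could look roughly like this in PHP: record how far into the file the last run got, and have each cron run resume from there and stop before the 30-second cap. File names, `importRow()`, and the time budget are assumptions for the sketch, not details from the thread:

```php
<?php
// Sketch of a resumable, time-boxed import for a cron job capped at
// 30 seconds. Each run resumes at the byte offset saved by the last run.

function runBatch(string $dataFile, string $stateFile, int $budgetSeconds): bool
{
    $deadline = time() + $budgetSeconds;   // stop well under the 30 s cap
    $offset   = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

    $fh = fopen($dataFile, 'r');
    fseek($fh, $offset);

    while (time() < $deadline && ($line = fgets($fh)) !== false) {
        importRow(rtrim($line));           // idempotent upsert per row
    }

    $done = feof($fh);
    // Save where we stopped; reset to the start once the file is finished.
    file_put_contents($stateFile, $done ? '0' : (string) ftell($fh));
    fclose($fh);
    return $done;
}

function importRow(string $line): void
{
    // In the real script this would be an upsert, e.g.
    // INSERT ... ON DUPLICATE KEY UPDATE, so existing products get
    // their quantities/details overwritten rather than duplicated.
}
```

Since the import must overwrite existing products anyway, an upsert makes each row idempotent, so it does not matter if a run is cut short and a later run reprocesses a row near the boundary.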

Cheers,

n.
