Script causing timeouts

Posted 2024-12-08 12:01:45

I have a script that is causing timeouts, some of the time. It takes a while to run, and I'll explain what it does:
We have a fairly small number of users (put it at 20), and we manage inventory for all of those users. Inventories are sent via FTP by a third party every morning (say 6:00 AM or so) as .csv files. Inventories include item descriptions, and then a variable-length list of URLs for images.
Our system is required to download any images that it doesn't already have (which 99.9% of the time only happens when a new item is in the inventory feed). Usually the inventory feeds are 95% the same, since most of the inventory doesn't sell from one day to the next.
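
To make the feed structure concrete, here is a minimal sketch of reading such a file, assuming each row is an item identifier and description followed by a variable-length tail of image URLs; the column layout, the $feedPath value, and the field names are assumptions, not the actual format:

```php
<?php
// Assumed row layout: sku, description, then zero or more image URLs.
$feedPath = '/data/feeds/inventory.csv'; // hypothetical path

$items = [];
if (($fh = fopen($feedPath, 'r')) !== false) {
    while (($row = fgetcsv($fh)) !== false) {
        if (count($row) < 2) {
            continue; // skip blank or malformed lines
        }
        $sku         = $row[0];
        $description = $row[1];
        $imageUrls   = array_filter(array_slice($row, 2)); // the variable-length URL tail
        $items[$sku] = ['description' => $description, 'images' => $imageUrls];
    }
    fclose($fh);
}
```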

The tricky part is that every morning our system looks at every inventory item and cross-checks each item's image list against the new feed. If images don't exist, it brings the new ones over using a CURL operation.
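
The actual script isn't posted here, but a stripped-down sketch of that cross-check-and-download step might look like the following. The directory layout, file naming, and the downloadImage() helper are illustrative assumptions, and $items is the array built from the feed as sketched above:

```php
<?php
// Assumed convention: images live under /data/images/<sku>/<basename-of-url>.
function localPathFor(string $sku, string $url): string {
    return '/data/images/' . $sku . '/' . basename((string) parse_url($url, PHP_URL_PATH));
}

// Fetch one image with cURL, streaming it straight to disk.
function downloadImage(string $url, string $dest): bool {
    $ch = curl_init($url);
    $fp = fopen($dest, 'wb');
    if ($ch === false || $fp === false) {
        return false;
    }
    curl_setopt_array($ch, [
        CURLOPT_FILE           => $fp,   // write to the file handle, not into memory
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_CONNECTTIMEOUT => 10,    // illustrative timeouts,
        CURLOPT_TIMEOUT        => 60,    // so one slow image can't hang the whole run
    ]);
    $ok   = curl_exec($ch);
    $err  = curl_errno($ch);
    $code = (int) curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    fclose($fp);

    if ($ok === false || $err !== 0 || $code !== 200) {
        @unlink($dest);                  // don't leave a truncated file behind
        return false;
    }
    return true;
}

foreach ($items as $sku => $item) {
    foreach ($item['images'] as $url) {
        $dest = localPathFor($sku, $url);
        if (file_exists($dest)) {
            continue;                    // most of the feed: image already on disk
        }
        if (!is_dir(dirname($dest))) {
            mkdir(dirname($dest), 0755, true);
        }
        if (!downloadImage($url, $dest)) {
            error_log("image download failed for $sku: $url"); // log and keep going
        }
    }
}
```

The point of the sketch is that a single failed or slow transfer gets logged and skipped rather than being allowed to stop the whole loop.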

As you can imagine, depending on the day, this could be a rather time-consuming operation. I have it on a cron job. If I run it manually, it takes anywhere between 1-5 minutes depending on the load, and sometimes (as in, once in every 5 tries) it gives an "internal server error" with no explanation whatsoever.

I am using the set_time_limit(0) directive first thing in the file, so I'm wondering if there's something else I need to do to ensure that it's not going to time out? Or do you guys think there's a possibility that failed transfers could be causing issues and making the script die on certain occasions? Like maybe a failed transfer that's handled badly -- I don't know. Rather than post all the code, I'm wondering if I could get some ideas first, since the script is pretty involved and I don't want to waste anyone's time.
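
For context, set_time_limit(0) only lifts PHP's own execution-time limit; other limits (web-server or FastCGI timeouts if the job is triggered over HTTP rather than by CLI cron, and the PHP memory limit) are separate. A sketch of the defensive settings worth checking at the top of such a script, with illustrative values rather than recommendations for this specific setup:

```php
<?php
set_time_limit(0);               // lifts PHP's time limit, but not web-server/FastCGI timeouts
ignore_user_abort(true);         // keep running if an HTTP client disconnects mid-run
ini_set('memory_limit', '256M'); // illustrative; buffering whole images in memory can exhaust the default

// Running the job with the PHP CLI binary instead of over HTTP avoids web-server timeouts entirely.
```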

Any ideas are welcome. I can't think of why it's intermittently not working.
For the record, if I manually run it twice, it always works the second time, but I think that's because the first run already handled most of the downloads...

Comments (2)

绝情姑娘 2024-12-15 12:01:45

Check the logs to see what the exact ISE (internal server error) is. Depending on the system setup, that may lead you to check the PHP logs to get a more precise error. Guessing at issues without any logs is only as good as a best guess.

What are the file sizes of the images? Somehow, I think it might be due to a transfer failing, myself.
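
If it isn't obvious where that error lands, a minimal sketch of forcing PHP to log everything from inside the script, plus capturing the cron job's own output (the log paths are hypothetical):

```php
<?php
// Make sure errors are logged somewhere readable instead of being swallowed.
error_reporting(E_ALL);
ini_set('display_errors', '0');
ini_set('log_errors', '1');
ini_set('error_log', '/var/log/inventory-sync.php.log'); // hypothetical path, must be writable by PHP

// Redirect the cron job's stdout/stderr as well, e.g.:
//   0 6 * * * /usr/bin/php /path/to/sync_images.php >> /var/log/inventory-sync.cron.log 2>&1
```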

橘寄 2024-12-15 12:01:45

Some hosts kill long-running tasks regardless of set_time_limit. Try contacting them and asking about it.
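
If the host does enforce a hard cap, a common workaround (beyond asking the host, as this answer suggests) is to keep each run short and resumable: cap the work done per invocation and let cron fire the script again to pick up the remainder. A rough sketch, where $missingImages and downloadImage() are the assumed list of still-missing images and the download helper sketched earlier:

```php
<?php
// Illustrative caps: stop after 200 downloads or ~4 minutes, whichever comes first,
// and let the next cron run handle whatever is still missing.
$maxDownloads = 200;
$deadline     = time() + 4 * 60;
$done         = 0;

foreach ($missingImages as [$url, $dest]) {
    if ($done >= $maxDownloads || time() >= $deadline) {
        break;                         // resume on the next scheduled run
    }
    if (downloadImage($url, $dest)) {
        $done++;
    }
}
```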
