Is there some kind of threading available in PHP?
I have a PHP script which queries a list of clients from a MySQL database, goes to each client's IP address, and picks up some information, which is then displayed on the web page.

But it takes a long time if the number of clients is too high. Is there any way I can send those URL requests (file_get_contents) in parallel?
I would use something like Gearman to assign them as jobs in a queue for workers to come along and complete if this needs to scale.
As another option, I have also written a PHP wrapper for the Unix at queue (http://unixhelp.ed.ac.uk/CGI/man-cgi?at), which might be a fit for this problem. It allows you to schedule the requests so that they can run in parallel. I have used this method successfully in the past to handle the sending of bulk email, which has similar blocking problems to your script.
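To illustrate the queue idea, here is a minimal sketch of submitting one background job per client with the pecl gearman extension. The worker function name `probe_client`, the IP list, and the payload format are all made up for this example; a real setup also needs a running gearmand server and a registered worker.

```php
<?php
// Hypothetical client IPs to probe; in the real script these come from MySQL.
$clientIps = ['203.0.113.10', '203.0.113.11'];

// One JSON payload per client, handed to the worker as the job workload.
$jobs = [];
foreach ($clientIps as $ip) {
    $jobs[] = json_encode(['ip' => $ip]);
}

// Guarded so the sketch degrades gracefully where the extension is absent.
if (class_exists('GearmanClient')) {
    $gm = new GearmanClient();
    $gm->addServer('127.0.0.1', 4730); // default gearmand port
    foreach ($jobs as $payload) {
        // 'probe_client' is a worker function you would register elsewhere;
        // doBackground() returns immediately, so jobs run concurrently.
        $gm->doBackground('probe_client', $payload);
    }
}
```

The web page can then poll for results (e.g. from a table the workers write to) instead of blocking while every client is contacted in sequence.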
Lineke Kerckhoffs-Willems wrote a good article about Multithreading in PHP with CURL. You can use that instead of file_get_contents() to get the needed information.
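A minimal sketch of that approach, assuming the curl extension is available: `fetch_all_parallel` is a hypothetical helper name, not from the article, and it returns each URL's body keyed by URL. All transfers are driven by one curl_multi handle, so the slow clients overlap instead of queuing behind each other.

```php
<?php
// Fetch several URLs concurrently with curl_multi instead of
// sequential file_get_contents() calls.
function fetch_all_parallel(array $urls): array
{
    $multi = curl_multi_init();
    $handles = [];

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture body instead of echoing
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // one slow client must not hang the page
        curl_multi_add_handle($multi, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($multi, $active);
        if ($active) {
            curl_multi_select($multi, 1.0); // wait for socket activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);

    return $results;
}
```

With this, the total wall-clock time is roughly that of the slowest client rather than the sum of all of them.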