How do I open multiple URLs at the same time with cURL in PHP?
Here is my current code:
$SQL = mysql_query("SELECT url FROM urls") or die(mysql_error()); //Query the urls table
while($resultSet = mysql_fetch_array($SQL)){ //Put all the urls into one variable
    // Now for some cURL to run it.
    $ch = curl_init($resultSet['url']); //load the urls
    curl_setopt($ch, CURLOPT_TIMEOUT, 2); //No need to wait for it to load. Execute it and go.
    curl_exec($ch); //Execute
    curl_close($ch); //Close it off
} //While loop
I'm relatively new to cURL. By relatively new, I mean this is my first time using cURL. Currently it loads one URL for two seconds, then the next for two seconds, then the next. However, I want it to load ALL of them at the same time. I'm sure it's possible, I'm just unsure how. If someone could point me in the right direction, I'd appreciate it.
Comments (1)
You set up each cURL handle in the same way, then add them to a curl_multi_ handle. The functions to look at are the curl_multi_* functions documented in the PHP manual. In my experience, though, there were issues with trying to load too many URLs at once (though I can't find my notes on it at the moment), so the last time I used curl_multi_, I set it up to do batches of 5 URLs at a time.

edit: Here is a reduced version of the code I have using curl_multi_:

edit: Slightly rewritten, with lots of added comments, which hopefully will help. Given that you don't need anything back from the URLs, you probably don't need a lot of what's there, but this is how I chunked the requests into blocks of BLOCK_SIZE, waited for each block to run before moving on, and caught errors from cURL.
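A minimal sketch along those lines (not the answerer's original code): a plain $urls array stands in for the database query, and BLOCK_SIZE, the sample URLs, and the 10-second timeout are illustrative values, not taken from the answer.

<?php
// Sketch: run URLs in concurrent batches with the curl_multi_* functions.
// BLOCK_SIZE and $urls are placeholders; in the question, $urls would be
// filled from the "SELECT url FROM urls" result set instead.
define('BLOCK_SIZE', 5);

$urls = array(
    'http://example.com/one',
    'http://example.com/two',
    'http://example.com/three',
);

foreach (array_chunk($urls, BLOCK_SIZE) as $block) {
    $mh = curl_multi_init();
    $handles = array();

    // Set up each easy handle the same way and attach it to the multi handle.
    foreach ($block as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // keep responses out of the output
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // illustrative timeout
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Drive all transfers in this block at the same time.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status == CURLM_OK);

    // Catch errors, then detach and close each handle before the next block.
    foreach ($handles as $ch) {
        if (curl_errno($ch) !== 0) {
            echo 'cURL error on ' . curl_getinfo($ch, CURLINFO_EFFECTIVE_URL)
                . ': ' . curl_error($ch) . PHP_EOL;
        }
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }

    curl_multi_close($mh);
}

Since nothing is needed back from the URLs, CURLOPT_RETURNTRANSFER just keeps the responses from being echoed, and curl_multi_select() blocks until at least one transfer has activity so the loop doesn't spin at full CPU while a block of requests runs in parallel.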