Sending 2,300 requests from PHP to MySQL takes 2 hours: how can I optimize it?
I am working on a newspaper project that has no RSS feed, so I was compelled to generate its feed programmatically using PHP. That means about 2,300 jobs, each of which processes a page and inserts the results into MySQL.
The technique I used is to process every single page and then insert its contents into MySQL. It works well, but sometimes I get "MySQL server has gone away".
I tried processing 30 pages and inserting them in one request, but it stops after some time.
So I am asking: is there any way to optimize this processing to reduce the time it takes?
Thanks a lot.
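The "30 pages in one request" idea can be sketched as a single multi-row INSERT, which cuts 30 round trips down to one. This is a minimal sketch, not the asker's actual code: the table `news_items` and its `title`/`body` columns are assumptions, and the escaper is passed in so the query-building logic stays testable.

```php
<?php
// Sketch only: `news_items`, `title` and `body` are placeholder names.
// Builds one multi-row INSERT so a batch of pages costs a single round trip.
function buildBatchInsert(array $rows, callable $escape): string
{
    $values = [];
    foreach ($rows as $row) {
        $title = $escape($row['title']);
        $body  = $escape($row['body']);
        $values[] = "('{$title}', '{$body}')";
    }
    return 'INSERT INTO `news_items` (`title`, `body`) VALUES '
         . implode(', ', $values);
}

// In production, escape with the live connection, e.g.:
//   $escape = fn(string $s) => $mysqli->real_escape_string($s);
```

Keep the batch size moderate (e.g. 30-100 rows) so the resulting query stays under MySQL's packet limit.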
Comments (2)
Your batch-insert approach is correct and likely to help. You need to find out why it stops after some time, as you say.
It is likely the PHP script timing out. Look for max_execution_time in your php.ini file and make sure it is high enough to allow the script to finish.
Also, make sure your MySQL configuration allows a large enough packet (max_allowed_packet), because the batches you are sending may be large.
Hope that helps!
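The two settings mentioned above live in different config files; the values below are illustrative, not recommendations:

```ini
; php.ini: raise the script time limit (in seconds)
max_execution_time = 600
```

```ini
# my.cnf, [mysqld] section: allow larger packets for big batch INSERTs
[mysqld]
max_allowed_packet = 64M
```

Restart PHP-FPM/Apache and mysqld respectively after changing them; you can verify the MySQL side with `SHOW VARIABLES LIKE 'max_allowed_packet';`.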
There are plenty of reasons for "MySQL server has gone away". Take a look.
Anyway, it is strange that you load the WHOLE pages. Usually an RSS feed implies that you put there just a subject and a short text snippet. I would create the RSS feed as a simple XML file, so there is no need to load data from MySQL on EVERY hit from a user. You create news -> regenerate the RSS XML file; you write a new article -> regenerate the RSS XML file.
If you still want to prepare your data for insertion, just create a file with ALL the inserts and then load the data from this file.
Yes! All 2,300 at a time ;)
Backticks are necessary in the last line!
So, as you can see, it works perfectly.
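The "regenerate a static file" idea can be sketched like this; the feed title, link, and item fields are placeholders, and a real feed would also need dates, GUIDs, and proper escaping of special characters:

```php
<?php
// Sketch: rebuild a static rss.xml whenever an article is created or edited,
// so serving the feed never touches MySQL. Feed title/link are placeholders.
function regenerateRssFile(array $items, string $path): void
{
    $xml = new SimpleXMLElement('<rss version="2.0"><channel/></rss>');
    $channel = $xml->channel;
    $channel->addChild('title', 'My Newspaper');        // placeholder
    $channel->addChild('link', 'https://example.com/'); // placeholder
    foreach ($items as $item) {
        $node = $channel->addChild('item');
        $node->addChild('title', $item['title']);
        $node->addChild('description', $item['snippet']); // snippet, not the whole page
    }
    $xml->asXML($path); // overwrite the static file; the web server serves it directly
}
```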
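The file of inserts described above might look like the following; the table and column names are placeholders, and note the backticks around the identifiers:

```sql
-- all_inserts.sql (placeholder names), replayed in one client session with:
--   mysql -u user -p newspaper_db < all_inserts.sql
INSERT INTO `news_items` (`title`, `body`) VALUES
  ('First processed page',  'extracted text ...'),
  ('Second processed page', 'extracted text ...');
```

Loading one prepared file this way uses a single connection and a single pass, instead of opening a round trip per page.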