Some kind of PHP flush
First, let me describe what I've built:
I have to import a large amount of data from different XML files into my database, and because it takes a long time I had to add a progress bar. I did it like this: I split the whole import into many tiny AJAX requests and import a little data at a time (when an AJAX request completes, the progress bar advances a bit). The whole idea works great, but the data just keeps getting bigger and bigger, and I can't optimize the code any further (it's as optimized as it gets).
The problem is that every time I make an AJAX call, I lose a lot of time on framework overhead (model initialization and so on) and on the browser handling the URL. So I was wondering if I could use PHP's flush function instead.
But I've been reading that the flush function doesn't work well in all browsers (which seems odd, since it's a server-side function). If I used flush, I would just write <script>increase_progressbar</script> (or whatever I wanted) into the output and the browser would execute it.
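The approach described above can be sketched as a single long-running script that streams `<script>` tags to the browser as each file is imported. This is a minimal sketch, not the poster's actual code: `increase_progressbar()` is the hypothetical JavaScript function on the page, `import_xml()` stands in for the existing import routine, and the `imports/*.xml` path is assumed. The padding is needed because many browsers buffer small chunks before rendering them, which is the real source of the "doesn't work in all browsers" reputation:

```php
<?php
// Sketch: stream progress updates from one long-running import script.
// Assumptions: increase_progressbar() exists in the page's JS, and
// import_xml() is the existing per-file import routine.

// Turn off buffering and compression so each chunk reaches the browser.
ini_set('zlib.output_compression', 'off');
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);

$files = glob('imports/*.xml');   // assumed location of the XML files
$total = count($files);

foreach ($files as $i => $file) {
    import_xml($file);            // the existing import logic

    $percent = (int) round(($i + 1) / $total * 100);
    echo '<script>increase_progressbar(' . $percent . ');</script>';
    // Browsers often wait for a minimum number of bytes before rendering,
    // so pad each chunk; without this, updates may appear all at once.
    echo str_repeat(' ', 4096);
    flush();
}
```

Note that flush itself is server-side and always "works"; the inconsistency is in when browsers decide to render partially received output, which is why the padding hack is commonly needed.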
So, any opinions on the flush function? I've been testing it on small scripts, but I'd like to know if anyone has really used it with big ones. I'm also open to any other suggestions for accomplishing what I want :)
1 Answer
I won't give you direct advice, but I'll tell you how I did it in one of my projects. In my case, I needed to upload Excel files and then parse them. The data exceeded 3,000 rows, and I had to check every column of each row for certain data. When I parsed a file directly after the upload, the parser often crashed somewhere along the way, and it really wasn't safe.
So how did I do it? I split the upload process into two parts:
Physically upload the file (a regular upload field and submit). When the button is clicked, some CSS and JS "magic" hides the form and a nice loading bar appears on the screen. When the upload is done, the page refreshes and the form appears again for the next file.
Start parsing the data in the background using php-cli, as @Dragon suggested, with exec().
In the database I had a table that stored information about the files, with a boolean field called "parsed". When the parser finishes its job, its last task is to update that field to true.
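The two steps above can be sketched as follows. This is a hypothetical reconstruction, not the answerer's actual code: the `files` table with its `parsed` column comes from the description, but the table layout, connection details, and the `parse.php` script name are all assumptions:

```php
<?php
// Sketch: record the upload, then launch a background CLI parser.
// Assumptions: a `files` table with (id, path, parsed) columns, and a
// parse.php CLI script that does the heavy parsing work.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// 1. Record the uploaded file with parsed = 0 (false).
$stmt = $pdo->prepare('INSERT INTO files (path, parsed) VALUES (?, 0)');
$stmt->execute([$uploadedPath]);
$fileId = (int) $pdo->lastInsertId();

// 2. Launch the parser in the background. Redirecting output and
//    appending & lets exec() return immediately, so the web request
//    finishes without waiting for the parse to complete.
$cmd = sprintf('php parse.php %d > /dev/null 2>&1 &', $fileId);
exec($cmd);

// 3. parse.php processes the file and, as its final step, runs:
//    UPDATE files SET parsed = 1 WHERE id = :id
// The page can then poll (or simply redisplay) the files table to show
// which uploads have finished parsing.
```

The key design point is that the web request only records the file and fires off the process; a crash in the parser no longer takes the upload page down with it.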
So here is the whole process from the user's point of view: they pick a file and submit it, the form disappears and a loading bar shows while the upload runs, then the page refreshes with the form ready for the next file; meanwhile the parser works in the background, and once it finishes, the file's "parsed" flag flips to true.
In my project I didn't have a requirement to show extra details about the imports, but you can always go wild with additional data.
Hope this helps you with your project.