I have created a PHP script that reads an Excel sheet containing thousands of rows and adds all of the data into a MySQL database through INSERT queries. The issue is that the data in the Excel sheet has grown from thousands of rows to millions, so the script now times out while running.
I want to know what would be the best way to run all of the queries without hitting a timeout.
One more thing: I first ran this script on web hosting, where it timed out after very few records, so I then ran it on my local WAMP server. It stored more records than the hosting did, but it still timed out. Ideally, the solution should work both on the hosting and on the local server.
In my script I use this:
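(The snippet that followed did not survive in this copy of the answer. A minimal sketch of the usual idea, assuming the intent is simply to lift PHP's execution time and memory limits for a long-running import, would be:)

<?php
// Assumption: the missing snippet raised PHP's limits along these lines.
set_time_limit(0);                   // 0 = no script execution time limit
ini_set('max_execution_time', '0');  // same effect via the ini setting
ini_set('memory_limit', '512M');     // hypothetical value; spreadsheet
                                     // libraries can need a lot of RAM

Note that many shared hosts disable or cap these settings, which would match the behaviour described in the question (the hosting timing out sooner than the local WAMP server).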
You can always try raising the execution time limit [docs], or setting a more appropriate value.
Alternatively, farm more of the work out to SQL, or set up a cron job that launches the script in smaller batches until the import is complete...
Also see the page below for help making your SQL more efficient if you're running INSERTs, and take a look at LOAD DATA INFILE: http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html
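As a rough illustration of "farming more out to SQL", here is a minimal sketch of grouping rows into multi-row INSERT statements, one of the techniques the insert-speed page above describes. The PDO connection details, the table name records, the columns (col1, col2), the batch size, and the $rows variable holding the spreadsheet data are all assumptions made for this example:

<?php
// Sketch: send one multi-row INSERT per batch instead of one query per row.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'user', 'pass');

$batchSize = 1000;   // rows per INSERT statement (tune to taste)
$batch = [];

function flushBatch(PDO $pdo, array $batch) {
    if (!$batch) {
        return;
    }
    // One "(?, ?)" placeholder group per row, values bound in order.
    $placeholders = implode(',', array_fill(0, count($batch), '(?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO records (col1, col2) VALUES $placeholders");
    $stmt->execute(array_merge(...$batch));
}

foreach ($rows as $row) {            // $rows: rows already read from the Excel sheet
    $batch[] = [$row['col1'], $row['col2']];
    if (count($batch) >= $batchSize) {
        flushBatch($pdo, $batch);
        $batch = [];
    }
}
flushBatch($pdo, $batch);            // flush the final partial batch

A cron-driven variant of the same idea would record the last imported row (for example in a progress table) and exit after each batch, so every run stays well under the host's time limit.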
If you have millions of rows, you might not be using the right tool.
If possible, I'd suggest using LOAD DATA INFILE (docs), which can accept a CSV file directly. That way you would skip the PHP layer for the insertion entirely. If you still need to process the file before adding it to the database, you could try converting the data into the needed format, saving it as CSV, and then using the above-mentioned statement.
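A minimal sketch of that approach, assuming a hypothetical two-column target table records, a $rows variable holding the processed spreadsheet data, and a MySQL setup that allows LOAD DATA LOCAL INFILE:

<?php
// Sketch: write the processed rows to a CSV file, then let MySQL bulk-load it.
$csvPath = __DIR__ . '/import.csv';
$fh = fopen($csvPath, 'w');
foreach ($rows as $row) {                     // $rows: processed spreadsheet data
    fputcsv($fh, [$row['col1'], $row['col2']]);
}
fclose($fh);

$pdo = new PDO(
    'mysql:host=localhost;dbname=test;charset=utf8',
    'user',
    'pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true]    // required for LOCAL INFILE
);
$pdo->exec(
    "LOAD DATA LOCAL INFILE " . $pdo->quote($csvPath) . "
     INTO TABLE records
     FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'
     (col1, col2)"
);

Bear in mind that local_infile has to be enabled on the MySQL server and in the client, which shared hosting providers often do not allow; in that case, batched multi-row INSERTs as sketched in the previous answer are the fallback.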