Efficiently importing a .csv into several related MySQL tables
Could you please tell me how to efficiently let users import their data into MySQL?
The problem is that the data generally needs to be inserted into several related tables.
Importing a .csv of several tens or hundreds of thousands of lines takes a long time and puts a heavy load on the database. Currently I parse the .csv, generate the inserts (possibly several inserts per row if we need to set attributes in a related table), and insert the data into the database in a loop.
How do you handle such imports?
Maybe upload the file to the server and have the server periodically insert the data in small batches?
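A rough sketch of that server-side idea, using MySQL's LOAD DATA INFILE into a staging table and then set-based INSERT ... SELECT statements to fan the rows out into the related tables (the table and column names here are made up for illustration):

```sql
-- Bulk-load the raw CSV into a staging table in one statement.
-- (staging_orders and all column names below are placeholders.)
LOAD DATA INFILE '/tmp/upload.csv'
INTO TABLE staging_orders
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(customer_name, product_code, quantity);

-- Populate the related tables with set-based inserts
-- instead of one round-trip per row.
INSERT IGNORE INTO customers (name)
SELECT DISTINCT customer_name FROM staging_orders;

INSERT INTO orders (customer_id, product_code, quantity)
SELECT c.id, s.product_code, s.quantity
FROM staging_orders s
JOIN customers c ON c.name = s.customer_name;
```

Note that plain LOAD DATA INFILE reads from the database server's filesystem (subject to the secure_file_priv setting); LOAD DATA LOCAL INFILE reads from the client instead.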
All ideas are appreciated.
Thank you.
If you really need to insert all this data, I don't think you have much of a choice.
I would recommend inserting multiple rows with a single INSERT to reduce the number of round-trips between the application and the database:
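For example, one multi-row INSERT replaces many individual statements (the table and column names here are hypothetical):

```sql
-- One statement, one round-trip, many rows.
-- (orders and its columns are placeholder names.)
INSERT INTO orders (customer_id, product_code, quantity) VALUES
  (1, 'A-100', 2),
  (1, 'B-250', 1),
  (2, 'A-100', 5);
```

In practice you would batch a few hundred to a few thousand rows per statement, keeping each statement under the server's max_allowed_packet limit.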
If you are inserting the data for the first time, you can create the indexes after all the data has been inserted. Of course, if you are doing this online (i.e. your insertion process runs concurrently with other operations), you can't do that, because the indexes are shared between all processes.
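A sketch of that first-load pattern (the index and table names are hypothetical): build the table without secondary indexes, bulk-load, then create the indexes once at the end, since building an index over the finished table is cheaper than maintaining it on every insert.

```sql
-- Drop the secondary index before the bulk load (if it exists).
ALTER TABLE orders DROP INDEX idx_customer;

-- ... run the bulk inserts here ...

-- Rebuild the index once, over the full table.
ALTER TABLE orders ADD INDEX idx_customer (customer_id);
```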