Best practices: transferring millions of entries to another MySQL database
I have the following task assigned to me:
Take a current database, with multiple tables and hundreds of thousands of entries, and write a script that will transfer much of this data to a separate database with a different structure.
Basically, a new database has been created with a different table structure, and I need (after confirming that all the datatypes will match up) to write a script to copy the data over to the correct tables/columns.
Ultimately I was wondering: for such a massive data transfer, are there any best practices, preferred languages, or hints/tips that someone could offer?
1 Answer
To start, I think the most important takeaway is that you should avoid, at all costs, writing scripts that transfer data directly from one production server to another. The potential dangers of working with live data during a migration are reason enough.
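In that spirit, the safest starting point is a snapshot: dump the production database once and restore it into a local MySQL instance, then run the migration entirely there. A minimal sketch, assuming the standard mysqldump and mysql command-line clients are installed; every host, user, password, and database name below is a placeholder:

```python
# Snapshot the production database and restore it into a local MySQL
# instance, so the migration script never touches the live server.
# All hosts, credentials, and database names are placeholders.
import subprocess

DUMP_FILE = "prod_snapshot.sql"

# 1. Dump the source database. --single-transaction gives a consistent
#    read of InnoDB tables without locking them.
with open(DUMP_FILE, "w") as dump:
    subprocess.run(
        ["mysqldump", "--host=prod.example.com", "--user=readonly",
         "--password=secret", "--single-transaction", "source_db"],
        stdout=dump, check=True)

# 2. Create a local working copy and load the snapshot into it.
subprocess.run(
    ["mysql", "--host=localhost", "--user=root", "--password=secret",
     "-e", "CREATE DATABASE IF NOT EXISTS source_db_copy"],
    check=True)
with open(DUMP_FILE) as dump:
    subprocess.run(
        ["mysql", "--host=localhost", "--user=root", "--password=secret",
         "source_db_copy"],
        stdin=dump, check=True)
```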
That said, I can recommend a few practices that I think are well suited to such a task.
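For the copy itself, if both the old and new schemas live on the same local MySQL server, most of the work can be pushed server-side with INSERT ... SELECT, which avoids pulling hundreds of thousands of rows into the client. A hypothetical sketch, assuming the mysql-connector-python driver; every schema, table, and column name here is invented for illustration:

```python
# Copy and reshape data server-side with INSERT ... SELECT.
# Schema, table, and column names are invented for illustration.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret")
cur = conn.cursor()

# Map old columns onto the new structure in the SELECT list; any
# per-column transformation (here, concatenating a name) happens
# inside MySQL, so no rows travel to the client at all.
cur.execute("""
    INSERT INTO new_db.customers (id, full_name, email)
    SELECT id, CONCAT(first_name, ' ', last_name), email
    FROM old_db.users
""")
conn.commit()

cur.close()
conn.close()
```

For transformations that genuinely need application code, stream rows in batches instead (cursor.fetchmany on the source plus executemany on the target) so memory stays bounded; but prefer the pure-SQL route wherever the column mapping allows it.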
I'm sure there are a number of other things to be looking out for (one more check is sketched below), but I hope these are enough to help you move from one schema to the next. Don't bog yourself down with remote servers if a local one is feasible; migrate locally, then import the result. Keep it simple!
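For instance, since the question mentions confirming that the datatypes will match up, that check is worth automating before any data moves. A sketch, again assuming mysql-connector-python, with invented schema names; note that it only compares columns that keep the same table and column name across the two schemas, so anything renamed still needs a manual mapping:

```python
# Compare column datatypes between the old and new schemas via
# information_schema before copying anything. Schema names and
# credentials are placeholders; only same-named columns are compared.
import mysql.connector

def column_types(cur, schema):
    """Return {(table, column): column_type} for one schema."""
    cur.execute("""
        SELECT table_name, column_name, column_type
        FROM information_schema.columns
        WHERE table_schema = %s
    """, (schema,))
    return {(table, col): coltype for table, col, coltype in cur.fetchall()}

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret")
cur = conn.cursor()

old = column_types(cur, "old_db")
new = column_types(cur, "new_db")
for key in sorted(old.keys() & new.keys()):
    if old[key] != new[key]:
        print(f"type mismatch in {key}: {old[key]} vs {new[key]}")

cur.close()
conn.close()
```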