Bulk inserts from Java into Oracle
I need to insert many small rows (five fields each) into Oracle rapidly.
With MySQL, I break the inserts into groups of 100, then use one insert statement for every group of 100 inserts.
But with Oracle, user feedback is that the mass inserts (anywhere from 1000-30000) are too slow.
Is there a similar trick I can use to speed up the programmatic inserts from Java into Oracle?
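For reference, the MySQL-side grouping I described looks roughly like the sketch below (table and column names are simplified, and the values are assumed to be numeric); as far as I know, this multi-row VALUES syntax does not carry over directly to Oracle:

```java
import java.sql.Connection;
import java.sql.Statement;
import java.util.List;

public class GroupedMySqlInsert {

    // Builds one multi-row INSERT per group of up to 100 rows (MySQL syntax).
    // Table/column names are placeholders; values are assumed numeric.
    static void insertInGroups(Connection conn, List<String[]> rows) throws Exception {
        final int groupSize = 100;
        try (Statement stmt = conn.createStatement()) {
            for (int start = 0; start < rows.size(); start += groupSize) {
                int end = Math.min(start + groupSize, rows.size());
                StringBuilder sql = new StringBuilder(
                        "INSERT INTO my_table (c1, c2, c3, c4, c5) VALUES ");
                for (int i = start; i < end; i++) {
                    String[] r = rows.get(i);
                    if (i > start) {
                        sql.append(", ");
                    }
                    sql.append("(")
                       .append(r[0]).append(", ").append(r[1]).append(", ")
                       .append(r[2]).append(", ").append(r[3]).append(", ")
                       .append(r[4]).append(")");
                }
                stmt.executeUpdate(sql.toString());
            }
        }
    }
}
```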
4 Answers
You can use Spring's DAO module to batch insert many rows.
An example that inserts a collection of Order objects into the database in one update:
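A minimal sketch of that idea, assuming a JdbcTemplate wired to your Oracle DataSource; the ORDERS table, its columns, and the Order fields shown here are made-up examples:

```java
import java.math.BigDecimal;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

import org.springframework.jdbc.core.BatchPreparedStatementSetter;
import org.springframework.jdbc.core.JdbcTemplate;

public class OrderBatchDao {

    // Minimal Order shape assumed for the example; your real class will differ.
    public static class Order {
        long id;
        long customerId;
        String item;
        int quantity;
        BigDecimal price;
    }

    private final JdbcTemplate jdbcTemplate;

    public OrderBatchDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Sends the whole collection to Oracle as a single JDBC batch.
    public int[] insertOrders(final List<Order> orders) {
        final String sql =
            "INSERT INTO orders (id, customer_id, item, quantity, price) VALUES (?, ?, ?, ?, ?)";
        return jdbcTemplate.batchUpdate(sql, new BatchPreparedStatementSetter() {
            @Override
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                Order o = orders.get(i);
                ps.setLong(1, o.id);
                ps.setLong(2, o.customerId);
                ps.setString(3, o.item);
                ps.setInt(4, o.quantity);
                ps.setBigDecimal(5, o.price);
            }
            @Override
            public int getBatchSize() {
                return orders.size();
            }
        });
    }
}
```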
You don't say how you are passing these records to the database. The best way is to use an array, as this allows the use of many of Oracle's nifty FORALL bulk operations.
This example package has two procedures: one that populates a collection of T23 records (T23 being a table of five numeric columns), and one that bulk inserts the records into that table using an array.
In sample runs, inserting 100,000 records took about 0.15 seconds. I think that should please all but the most demanding of users. So, the question is, how do you approach your inserts?
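On the Java side, the usual way to feed a FORALL-based procedure like this is to bind the columns as SQL collection types through a CallableStatement using the Oracle JDBC driver. A rough sketch, assuming a nested table type NUM_TAB (a table of NUMBER) and a procedure bulk_pkg.load_t23 that takes five such arrays; all of these names are assumptions, not the actual package:

```java
import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.Connection;

import oracle.jdbc.OracleConnection;

public class BulkCaller {

    // Passes five column arrays to a PL/SQL procedure that performs the FORALL insert.
    // NUM_TAB and bulk_pkg.load_t23 are assumed names, e.g.
    //   CREATE TYPE num_tab AS TABLE OF NUMBER;
    static void loadT23(Connection conn, Number[] c1, Number[] c2, Number[] c3,
                        Number[] c4, Number[] c5) throws Exception {
        OracleConnection oraConn = conn.unwrap(OracleConnection.class);
        Array a1 = oraConn.createOracleArray("NUM_TAB", c1);
        Array a2 = oraConn.createOracleArray("NUM_TAB", c2);
        Array a3 = oraConn.createOracleArray("NUM_TAB", c3);
        Array a4 = oraConn.createOracleArray("NUM_TAB", c4);
        Array a5 = oraConn.createOracleArray("NUM_TAB", c5);
        try (CallableStatement cs = conn.prepareCall("{call bulk_pkg.load_t23(?, ?, ?, ?, ?)}")) {
            cs.setArray(1, a1);
            cs.setArray(2, a2);
            cs.setArray(3, a3);
            cs.setArray(4, a4);
            cs.setArray(5, a5);
            cs.execute();
        }
    }
}
```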
Nowadays MySQL is owned by Oracle, so perhaps a simpler solution might be to stay on MySQL...
If not, then make sure a transaction is started before you begin each group of inserts; once the group is finished, commit it and start a new transaction for the next group.
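A minimal sketch of that commit-per-group pattern with plain JDBC; the table, columns, and group size of 1000 are assumptions, and each group is also sent as one JDBC batch to cut down round trips:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.List;

public class GroupedCommit {

    // Batches inserts and commits once per group rather than once per row.
    static void insert(Connection conn, List<Object[]> rows) throws Exception {
        final int groupSize = 1000;
        conn.setAutoCommit(false);
        String sql = "INSERT INTO my_table (c1, c2, c3, c4, c5) VALUES (?, ?, ?, ?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int inGroup = 0;
            for (Object[] row : rows) {
                for (int col = 0; col < 5; col++) {
                    ps.setObject(col + 1, row[col]);
                }
                ps.addBatch();
                if (++inGroup == groupSize) {
                    ps.executeBatch();   // send the group to Oracle
                    conn.commit();       // one commit per group
                    inGroup = 0;
                }
            }
            if (inGroup > 0) {
                ps.executeBatch();
                conn.commit();
            }
        }
    }
}
```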
Also check for unnecessary index definitions that may be slowing the insert time.
Updated...
Bulk inserting is essentially the Load step of ETL (Extract, Transform, Load), so have you considered using a Java-based ETL tool such as Pentaho Kettle or Talend Studio?
Pentaho describes its Oracle bulk loading facilities here.
A quick Google search also turns up some preliminary evidence that Talend has support for Oracle bulk loading.
Try it.