Loading large amounts of data into an Oracle SQL database

Posted 2024-09-04 19:30:55


I was wondering if anyone had any experience with what I am about to embark on. I have several CSV files, each around a GB in size, that I need to load into an Oracle database. While most of my work after loading will be read-only, I will need to load updates from time to time. Basically I just need a good tool for loading several rows of data at a time into my db.

Here is what I have found so far:

  1. I could use SQL*Loader to do a lot of the work

  2. I could use Bulk-Insert commands

  3. Some sort of batch insert.

Using prepared statements somehow might be a good idea. I guess I was wondering what everyone thinks is the fastest way to get this insert done. Any tips?


Comments (3)

婴鹅 2024-09-11 19:30:55


I would be very surprised if you could roll your own utility that will outperform SQL*Loader Direct Path Loads. Oracle built this utility for exactly this purpose - the likelihood of building something more efficient is practically nil. There is also the Parallel Direct Path Load, which allows you to have multiple direct path load processes running concurrently.

From the manual:

Instead of filling a bind array buffer and passing it to the Oracle database with a SQL INSERT statement, a direct path load uses the direct path API to pass the data to be loaded to the load engine in the server. The load engine builds a column array structure from the data passed to it.

The direct path load engine uses the column array structure to format Oracle data blocks and build index keys. The newly formatted database blocks are written directly to the database (multiple blocks per I/O request using asynchronous writes if the host platform supports asynchronous I/O).

Internally, multiple buffers are used for the formatted blocks. While one buffer is being filled, one or more buffers are being written if asynchronous I/O is available on the host platform. Overlapping computation with I/O increases load performance.

There are cases where Direct Path Load cannot be used.
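As a rough sketch of what a direct path load looks like in practice (all table, column, and file names here are hypothetical), a minimal SQL*Loader control file and invocation might be:

```
-- employees.ctl: hypothetical control file for a comma-separated CSV
LOAD DATA
INFILE 'employees.csv'
APPEND
INTO TABLE employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(emp_id, emp_name, hire_date DATE 'YYYY-MM-DD', salary)
```

```
sqlldr scott/tiger control=employees.ctl direct=true
```

`direct=true` is what selects the direct path API; without it, sqlldr falls back to conventional (bind-array) INSERTs.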

羁客 2024-09-11 19:30:55


With that amount of data, you'd better be sure of your backing store - the dbf disks' free space.

sqlldr is script-driven, very efficient, and generally more efficient than a SQL script.
The only thing I wonder about is the magnitude of the data. I personally would consider several to many sqlldr processes and assign each one a subset of data and let the processes run in parallel.

You said you wanted to load a few records at a time? That may take a lot longer than you think. Did you mean a few files at a time?
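One way to sketch that split-and-parallelize approach (file and control-file names are hypothetical; `parallel=true` requires APPEND mode and a direct path load):

```
# split the big CSV into roughly equal chunks (chunk_aa, chunk_ab, ...)
split -l 1000000 big_file.csv chunk_

# run one direct path loader per chunk, concurrently
sqlldr scott/tiger control=load.ctl data=chunk_aa direct=true parallel=true &
sqlldr scott/tiger control=load.ctl data=chunk_ab direct=true parallel=true &
wait
```

One caveat worth checking in the Oracle docs: parallel direct path loads do not maintain indexes during the load, so plan to rebuild them afterwards.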

伏妖词 2024-09-11 19:30:55


You may be able to create an external table on the CSV files and load them in by SELECTing from the external table into another table. I'm not sure whether this method will be faster, but it may save you the hassle of getting SQL*Loader to work, especially when you have criteria for UPDATEs.
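A sketch of the external-table approach (all names are hypothetical, and it assumes the database can read the CSV location through a DIRECTORY object):

```sql
CREATE DIRECTORY csv_dir AS '/data/csv';

-- External table: the CSV is queried in place, never loaded into segments
CREATE TABLE employees_ext (
  emp_id   NUMBER,
  emp_name VARCHAR2(100),
  salary   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('employees.csv')
);

-- Initial bulk load: direct path insert from the external table
INSERT /*+ APPEND */ INTO employees SELECT * FROM employees_ext;

-- Later refreshes: MERGE handles insert-or-update in one pass
MERGE INTO employees e
USING employees_ext x ON (e.emp_id = x.emp_id)
WHEN MATCHED THEN UPDATE SET e.emp_name = x.emp_name, e.salary = x.salary
WHEN NOT MATCHED THEN INSERT (emp_id, emp_name, salary)
  VALUES (x.emp_id, x.emp_name, x.salary);
```

The MERGE is what makes this attractive for the "load updates from time to time" case, since SQL*Loader by itself can only insert.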
