Writing a BLOB to Oracle DB with Python cx_Oracle is very slow

Posted 2025-01-12 11:52:43


I have a very simple Python application (using cx_Oracle) that writes a BLOB to an Oracle database, but it is painfully slow when writing large files. My application first downloads an image file from another system using a SOAP API, and the download completes in a second or two. However, writing a file of, say, 100 KB to the DB takes a second or two, while a 1.5 MB file takes just over a minute.

My SQL is :-

INSERT INTO my_data (doc_id, doc_content) VALUES (:blobid, :blobdata)  -- with blobid=id_variable, blobdata=blob_variable

Is there a reason it's taking so long? Can I speed it up? It oughtn't take more than a few seconds I'd have thought?

(Disclaimer! I have never used cx_Oracle, or indeed Oracle itself, before. I am comparing performance to Postgres, which I have used for this kind of thing and whose performance is far better.)

The full code block is here :-

import cx_Oracle

# `client` is assumed to be an already-configured SOAP client (e.g. a suds/zeep client); its setup is omitted here.

def upload_docs(doc, blob, conn):
    sql = "INSERT INTO my_data (doc_id, doc_content) VALUES (:dociddata, :blobdata)"
    cursor = conn.cursor()
    cursor.execute(sql, dociddata=doc, blobdata=blob)
    conn.commit()
    cursor.close()

conn = cx_Oracle.connect(user="Myuser", password="MyPassword",
                         dsn="ocm.server.here.com:1527/some_name", encoding="UTF-8")
doc_csv = "/tmp/document_list.csv"
csv_file = open(doc_csv, 'r')

for line in csv_file:
    splitLineArray = line.split(',')
    documentId = splitLineArray[17]

    # Pull document down from SOAP API
    documentData = client.service.getDocument(int(documentId))
    upload_docs(documentId, documentData, conn)
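
As an aside (and, as the comment below shows, not the cause in this particular case), a tuning step that is often suggested when binding large BLOBs with cx_Oracle is to declare the bind type up front with Cursor.setinputsizes(), so the driver treats :blobdata as a BLOB rather than inferring a type from the Python bytes value. A minimal sketch of upload_docs() with that change, assuming cx_Oracle 8+ (older releases use cx_Oracle.BLOB instead of cx_Oracle.DB_TYPE_BLOB):

import cx_Oracle

def upload_docs(doc, blob, conn):
    sql = "INSERT INTO my_data (doc_id, doc_content) VALUES (:dociddata, :blobdata)"
    cursor = conn.cursor()
    # Declare the bind type explicitly so the payload is bound as a BLOB.
    cursor.setinputsizes(blobdata=cx_Oracle.DB_TYPE_BLOB)
    cursor.execute(sql, dociddata=doc, blobdata=blob)
    conn.commit()
    cursor.close()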


Comments (1)

指尖上的星空 2025-01-19 11:52:43


Although I haven't quite resolved this, I can now confirm the issue is not with cx_Oracle, nor is it an issue with the database itself. I am (or rather, was) connecting to the DB through Oracle Connection Manager, which was causing the delays. When I connect directly to the DB I can write large blobs in around a second, so the issue is definitely with OCM.
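
For anyone hitting the same symptom, the only thing that changes is where the DSN points: at the Connection Manager endpoint (ocm.server.here.com:1527 above) or at the database listener itself. A minimal sketch of a direct connection, using a hypothetical database host and the default listener port 1521 (the real host, port and service name depend on your environment):

import cx_Oracle

# Hypothetical direct DSN: point at the database listener itself rather than
# the Connection Manager endpoint (ocm.server.here.com:1527) used above.
direct_dsn = cx_Oracle.makedsn("db.server.here.com", 1521, service_name="some_name")

conn = cx_Oracle.connect(user="Myuser", password="MyPassword",
                         dsn=direct_dsn, encoding="UTF-8")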
