SQLAlchemy bulk insert statement throws AttributeError against a Postgres database
I am trying to bulk-insert rows into a Postgres database with Python SQLAlchemy using an insert statement. I need the insert statement rather than bulk_insert_mappings, because I want to silently ignore failed insertions of duplicate entries. (This requirement was not apparent before; I have added it now.)
The table is created as it should be. However, even a very simple insert operation via the statement API throws this error:
AttributeError: '_NoResultMetaData' object has no attribute '_indexes_for_keys'
Minimal Verifiable Example:
import os

import sqlalchemy
from sqlalchemy import (
    Column,
    INTEGER,
    TEXT
)
from sqlalchemy.dialects.postgresql import insert
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Test(Base):
    __tablename__ = 'test'

    id = Column(INTEGER, primary_key=True)
    data = Column(TEXT)

engine = sqlalchemy.create_engine(os.environ['DATABASE_CONNECTION'])
Session = sessionmaker(engine)
Base.metadata.create_all(engine, Base.metadata.tables.values(), checkfirst=True)

connection = engine.connect()

buffer = [
    {
        'data': "First test"
    },
    {
        'data': "Second test"
    }
]

insert_statement = insert(Test).values(buffer)
# Using insert statement instead of bulk_insert_mappings so I can do nothing when adding duplicate entries
insert_or_do_nothing = insert_statement.on_conflict_do_nothing(index_elements=[Test.id])
orm_statement = sqlalchemy.select(Test).from_statement(insert_or_do_nothing)

with Session() as session:
    session.execute(orm_statement).scalars()

connection.close()
Full stacktrace:
Traceback (most recent call last):
  File "/project/path/test.py", line 41, in <module>
    session.execute(orm_statement).scalars()
  File "/venv/path/sqlalchemy/orm/session.py", line 1715, in execute
    result = compile_state_cls.orm_setup_cursor_result(
  File "/venv/path/sqlalchemy/orm/context.py", line 354, in orm_setup_cursor_result
    return loading.instances(result, querycontext)
  File "/venv/path/sqlalchemy/orm/loading.py", line 89, in instances
    cursor.close()
  File "/venv/path/sqlalchemy/util/langhelpers.py", line 70, in __exit__
    compat.raise_(
  File "/venv/path/sqlalchemy/util/compat.py", line 208, in raise_
    raise exception
  File "/venv/path/sqlalchemy/orm/loading.py", line 69, in instances
    *[
  File "/venv/path/sqlalchemy/orm/loading.py", line 70, in <listcomp>
    query_entity.row_processor(context, cursor)
  File "/venv/path/sqlalchemy/orm/context.py", line 2627, in row_processor
    _instance = loading._instance_processor(
  File "/venv/path/sqlalchemy/orm/loading.py", line 715, in _instance_processor
    primary_key_getter = result._tuple_getter(pk_cols)
  File "/venv/path/sqlalchemy/engine/result.py", line 934, in _tuple_getter
    return self._metadata._row_as_tuple_getter(keys)
  File "/venv/path/sqlalchemy/engine/result.py", line 106, in _row_as_tuple_getter
    indexes = self._indexes_for_keys(keys)
AttributeError: '_NoResultMetaData' object has no attribute '_indexes_for_keys'
Am I misusing the statement interface? The ORM statement looks fine:
INSERT INTO test (data) VALUES (:data_m0), (:data_m1)
I am using:
- PostgreSQL 14.4
- psycopg2-binary 2.9.3
- SQLAlchemy 1.4.39
2 Answers
I found a solution that still uses the insert statement: avoid the ORM statement wrapper. For some reason, executing the plain Core statement does the job, whilst the ORM version throws the AttributeError. This is confusing, as the official documentation calls for ORM statements. But if you omit the ORM statement part, all is good.
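A minimal sketch of this Core-only approach, using the same Test model and buffer as the question. It is shown with SQLite's equivalent insert construct purely so it runs without a database server; against Postgres you would import insert from sqlalchemy.dialects.postgresql exactly as in the question.

```python
import sqlalchemy
from sqlalchemy import Column, INTEGER, TEXT
# With Postgres, import insert from sqlalchemy.dialects.postgresql instead;
# SQLite's insert is used here only so the sketch runs without a server.
from sqlalchemy.dialects.sqlite import insert
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Test(Base):
    __tablename__ = 'test'

    id = Column(INTEGER, primary_key=True)
    data = Column(TEXT)

engine = sqlalchemy.create_engine("sqlite://")  # in-memory stand-in for Postgres
Base.metadata.create_all(engine)
Session = sessionmaker(engine)

buffer = [{'data': "First test"}, {'data': "Second test"}]

# Build the same INSERT ... ON CONFLICT DO NOTHING statement, but execute it
# directly as a Core statement -- no select().from_statement() wrapper, so the
# ORM never tries to load entities from a cursor that returns no rows.
statement = insert(Test).values(buffer).on_conflict_do_nothing(
    index_elements=[Test.id]
)

with Session() as session:
    session.execute(statement)
    session.commit()
```

The key difference from the question's code is that the statement goes straight to session.execute() instead of being wrapped in an ORM select, so no result rows are expected back.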
Looking at the docs, you could try to use session.bulk_insert_mappings().
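A minimal sketch of that approach, reusing the Test model and buffer from the question (again with an in-memory SQLite engine as a stand-in so it runs without a server). Note this does not address the asker's duplicate-handling requirement: bulk_insert_mappings has no ON CONFLICT DO NOTHING equivalent and raises IntegrityError on duplicate keys.

```python
import sqlalchemy
from sqlalchemy import Column, INTEGER, TEXT
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Test(Base):
    __tablename__ = 'test'

    id = Column(INTEGER, primary_key=True)
    data = Column(TEXT)

engine = sqlalchemy.create_engine("sqlite://")  # in-memory stand-in for Postgres
Base.metadata.create_all(engine)
Session = sessionmaker(engine)

buffer = [{'data': "First test"}, {'data': "Second test"}]

with Session() as session:
    # Bulk-insert plain dictionaries mapped onto the Test class. Unlike
    # ON CONFLICT DO NOTHING, this raises on duplicate keys, so it only
    # suits batches known to be conflict-free.
    session.bulk_insert_mappings(Test, buffer)
    session.commit()
```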