How do I share a DB connection pool in Python multiprocessing?
I'm using multiprocessing.Pool to execute some function, and in that function I need to connect to a database (using SQLAlchemy). I tried to share the SQLAlchemy connection pool with the child processes by using multiprocessing.Queue, like this:
import sqlalchemy
from multiprocessing import Pool, Manager

def process(data, queue):
    db = queue.get()
    with db.connect() as connection:
        # execute some query
        ...

data_list = []  # list of data I'm going to deal with
pool = Pool(8)
manager = Manager()
queue = manager.Queue()
db = sqlalchemy.create_engine()
for data in data_list:
    queue.put(db)
    pool.apply_async(func=process, args=(data, db))     # This is the 1st way I tried
    pool.apply_async(func=process, args=(data, queue))  # This is the 2nd way I tried
I tried both ways, but each raises an error.
The first way raises a Broken Pipe error when executing with db.connect() as connection.
The second way raises a Can't pickle local object 'create_engine.<locals>.connect' error.
I searched for this problem and found people saying that sharing a connection pool with child processes is feasible, but how should I share the engine with multiprocessing?
As per @charchit's comment, here is the documentation on using SQLAlchemy with multiprocessing.
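The simplest pattern that documentation describes is to create one engine per worker process rather than passing an engine through a queue: engines and their pooled connections are not picklable, but a Pool initializer can build a fresh engine inside each child. A minimal sketch, assuming SQLAlchemy 2.x and a placeholder SQLite URL standing in for your real database:

```python
import multiprocessing
from sqlalchemy import create_engine, text

engine = None  # one engine (and thus one pool) per worker process

def init_worker(db_url):
    # Runs once in each child: build the engine after the process
    # starts, so nothing has to be pickled or inherited.
    global engine
    engine = create_engine(db_url)

def process(data):
    # hypothetical worker: run a trivial query per item
    with engine.connect() as connection:
        return connection.execute(text("SELECT 1")).scalar()

if __name__ == "__main__":
    db_url = "sqlite:///example.db"  # placeholder URL, an assumption
    with multiprocessing.Pool(2, initializer=init_worker, initargs=(db_url,)) as pool:
        print(pool.map(process, range(4)))
```

Only the URL string crosses the process boundary, so both the broken-pipe and the can't-pickle errors from the question disappear.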