Multiprocessing with Python and MySQL
I've Googled this topic a lot, but cannot find a solution that fits my needs :(
I have a MySQL DB with a table containing e-mail addresses (10,000+).
I would like to run a batch job on them every 5 minutes.
So I guess Python is a good choice for retrieving the result set from MySQL and then calling a command line with the e-mail addresses as arguments.
What is the best way to do this? I'm thinking of getting the entire result set from MySQL and then having a bunch of workers call the command line with the arguments until there are no more e-mail addresses. Can this be done in a simple, yet stable, way?
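For the every-5-minutes part, a cron entry is the usual simple-and-stable choice. A sketch, assuming a hypothetical script path (`/path/to/batch_job.py` is a placeholder):

```
# m   h  dom mon dow  command
*/5 * * * * /usr/bin/python /path/to/batch_job.py
```

This leaves the Python script responsible only for one batch run; cron handles the scheduling and restarts.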
You could use the multiprocessing module like this:
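A minimal sketch of the worker-pool idea: fan the addresses out over a `multiprocessing.Pool`, with each worker invoking the external command once per address. The command name is a stand-in (`echo` here so the sketch runs anywhere); in the real job the list would come from a `SELECT` against your MySQL table rather than a hard-coded list.

```python
import subprocess
from multiprocessing import Pool

# Stand-in for your real batch command -- replace "echo" with the
# actual command-line tool you want to run per address.
COMMAND = "echo"

def process_email(email):
    """Invoke the external command with one e-mail address as argument."""
    result = subprocess.run([COMMAND, email], capture_output=True, text=True)
    return email, result.returncode

def run_batch(emails, workers=4):
    """Distribute the addresses over a pool of worker processes.

    pool.map preserves input order and blocks until all workers finish.
    """
    with Pool(processes=workers) as pool:
        return pool.map(process_email, emails)

if __name__ == "__main__":
    # In the real job these would be fetched from the MySQL table.
    emails = ["a@example.com", "b@example.com", "c@example.com"]
    for email, code in run_batch(emails):
        print(email, code)
```

Keeping the pool size modest (a handful of workers) throttles how many external processes run at once, which matters when the table holds 10,000+ rows.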
As an alternative to using an ORM module, you could dump the e-mail addresses to a CSV file:
From: Dump a mysql database to a plaintext (CSV) backup from the command line
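One way to do such a dump is with the `mysql` command-line client in batch mode. A sketch, assuming hypothetical names (`mydb`, `subscribers`, `email` are placeholders for your schema) and a running server, so it cannot run standalone:

```shell
# --batch prints tab-separated rows; --skip-column-names drops the header.
mysql --batch --skip-column-names \
      -u youruser -p \
      -e "SELECT email FROM subscribers" mydb > emails.csv
```

With a single column there are no delimiters to worry about, so the resulting file is directly consumable by the CSV post-processing step below.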
And post-process the CSV file in Python:
CSV File Reading and Writing: http://docs.python.org/library/csv.html
When you're dealing with very large files, iterate over the file object line by line rather than calling file.readlines(), which would read the whole file into memory: