Is this the right approach for a relatively new programmer interested in using Celery?

Posted 2024-12-17 14:39:44


Essentially I have a large database of transactions and I am writing a script that will take some personal information and match a person to all of their past transactions.

So I feed the script a name and it returns all of the transactions that it has decided belong to that customer.

The issue is that I have to do this for almost 30k people and the database has over 6 million transaction records.

Running this on one computer would obviously take a long time. I am willing to admit that the code could be optimized, but I do not have time for that, so instead I want to split the work over several computers. Enter Celery:

My understanding of Celery is that I will have a boss computer sending names to worker computers, each of which runs the script and puts the customer ID in a column for each transaction it matches.
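For concreteness, here is a minimal sketch of what such a worker task could look like. Everything in it is an assumption for illustration only: the Redis broker URL, the `transactions` table, the `customer_id`/`customer_name` columns, and the trivial matching query would all be replaced by the real script's logic.

```python
# tasks.py -- minimal Celery worker sketch (broker URL, table and column names are assumptions)
from celery import Celery
import mysql.connector

app = Celery("matcher", broker="redis://localhost:6379/0")  # assumed Redis broker

@app.task
def match_customer(customer_id, name):
    """Tag every transaction that appears to belong to `name` with `customer_id`."""
    conn = mysql.connector.connect(
        host="db-host", user="worker", password="secret", database="transactions_db"
    )
    try:
        cur = conn.cursor()
        # Placeholder matching rule -- the real matching logic lives in the existing script.
        cur.execute(
            "UPDATE transactions SET customer_id = %s WHERE customer_name = %s",
            (customer_id, name),
        )
        conn.commit()
    finally:
        conn.close()
```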

Would there be a problem with multiple worker computers searching and writing to the same database?

Also, have I missed anything and/or is this totally the wrong approach?

Thanks for the help.


Comments (1)

你穿错了嫁妆 2024-12-24 14:39:44


No, there wouldn't be any problem with multiple worker computers searching and writing to the same database, since MySQL is designed to handle this. Your approach is good.
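To complement that, here is a rough illustration of the "boss" side, assuming the `match_customer` task sketched above: the dispatcher simply queues one task per person, and however many workers you start will pull from the queue. Since each task updates a different customer's rows, concurrent writes rarely contend, and (assuming InnoDB) MySQL's row-level locking handles the rest.

```python
# dispatch.py -- run on the "boss" machine; the names and IDs here are placeholders
from tasks import match_customer

# In practice the ~30k people would come from your own data source, not a hard-coded list.
people = [(1, "Alice Smith"), (2, "Bob Jones")]

for customer_id, name in people:
    match_customer.delay(customer_id, name)  # .delay() queues the task on the broker
```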
