Python - multiprocessing processes become copies of the main process when run from an executable

Posted on 2024-11-29 19:28:59


I just discovered a bizarre bug in my program related to its use of Python's multiprocessing module. Everything works fine when I run the program from the source on my machine. But I've been building it into an executable using pyinstaller, and for some reason the behavior of multiprocessing changes drastically when I run the executable built from my code. Specifically, when I try to run the multiprocessing part of my code, rather than do what it's supposed to, what appears to be a copy of my program's main window pops up, one for each process. Even worse, they reopen if they are closed manually, presumably because they are part of a multiprocessing.pool. No error messages are printed, and once created all the windows just sit there doing nothing. What could be happening to cause this?
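A minimal sketch of the pattern being described, assuming a Tkinter GUI and a multiprocessing.Pool (the function and widget names are illustrative, not taken from the asker's code): run from source this behaves normally, but frozen with PyInstaller and without freeze_support(), each worker process re-runs the entry point and opens its own copy of the main window.

```python
# Hypothetical minimal reproduction of the symptom described above; the
# function and widget names are assumptions, not the asker's actual code.
import multiprocessing
import tkinter as tk

def square(x):
    return x * x

def run_jobs():
    # Each worker is started as a fresh instance of the (frozen) executable.
    with multiprocessing.Pool(4) as pool:
        print(pool.map(square, range(8)))

if __name__ == "__main__":
    # Without multiprocessing.freeze_support() at the top of this block,
    # a frozen child process falls through to here and opens its own window.
    root = tk.Tk()
    root.title("Main window")
    tk.Button(root, text="Run jobs", command=run_jobs).pack(padx=20, pady=20)
    root.mainloop()
```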


Answered by 挽清梦 on 2024-12-06 19:28:59


On Windows, multiprocessing tries to emulate the Unix fork() system call by starting a new instance of your executable and executing its child-process routine (multiprocessing.forking.main()) there. With the standard Python interpreter (python.exe), multiprocessing can pass the -c parameter to run custom code. For a custom executable, however, this is not possible, since the executable most probably does not support the same command-line options as python.exe.
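The answer refers to the old multiprocessing.forking module (Python 2 era); in modern CPython the same machinery lives in multiprocessing.spawn. A quick way to see the difference it describes is to print the command line the parent process would use to start a child (the pipe_handle value below is just a placeholder):

```python
# Prints the command line multiprocessing would use to start a child process.
# In modern CPython this lives in multiprocessing.spawn rather than the old
# multiprocessing.forking module named in the answer.
import multiprocessing.spawn as spawn

if __name__ == "__main__":
    print(spawn.get_command_line(pipe_handle=0))
    # Normal interpreter, roughly: [python(.exe), '-c',
    #   'from multiprocessing.spawn import spawn_main; spawn_main(pipe_handle=0)',
    #   '--multiprocessing-fork']
    # Frozen executable (sys.frozen set), roughly: [your.exe,
    #   '--multiprocessing-fork', 'pipe_handle=0']
```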

The freeze_support() function sidesteps this problem by executing the child-process routine explicitly and then terminating the interpreter by calling sys.exit(). If you forget to call freeze_support(), the new process does not know that it is a child process and runs the main application logic instead. In your case, that means it pops up another main GUI window.
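The conventional fix, as a minimal sketch: call freeze_support() as the very first statement of the entry point, before any windows or pools are created. In a frozen child process it runs the child routine and exits; in the parent process, and when running from source, it does nothing.

```python
# Minimal sketch of the conventional fix; main() stands in for whatever the
# real application does (creating the GUI, starting the Pool, etc.).
import multiprocessing

def main():
    # ... create the main window, set up the Pool, run the event loop ...
    pass

if __name__ == "__main__":
    multiprocessing.freeze_support()   # must run before any other start-up code
    main()
```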

Since starting yet another child process from the newly created process would cause infinite recursion, multiprocessing tries to prevent this by checking the sys.frozen attribute and raising a RuntimeError if freeze_support() was not called. In your case, it seems that user interaction is required to spawn the processes, so there is no infinite recursion and no RuntimeError.

By convention, sys.frozen is only set for automatically generated executables such as those created by py2exe or PyInstaller. It is important to understand this logic, and to set sys.frozen to True, when you want to embed Python in a custom executable that should support multiprocessing under Windows.
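A hedged sketch of what that last point looks like from the embedded interpreter's side, assuming the host executable re-launches itself with its command line passed through to Python unchanged:

```python
# Sketch for a custom embedding host, per the answer's suggestion: mark the
# interpreter as frozen, then let freeze_support() intercept the child-process
# invocation of this executable.
import sys
sys.frozen = True                       # opt in to the "frozen" spawning path

import multiprocessing

if __name__ == "__main__":
    multiprocessing.freeze_support()    # in a child: run the worker and exit
    # ... parent-only application start-up continues here ...
```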
