How can I get the versions of Python dependencies/modules, including external applications called via subprocess.Popen?

Posted 2024-08-19 06:10:51

I have a set of Python scripts, which I run frequently on different machines, that depend on a few external libraries as well as on some other applications spawned via subprocess.Popen.

As expected, the output varies depending on the versions of the installed modules and applications. To address this I would like to keep track of which versions were in use at runtime.

In order to do this I've considered the following steps:

  1. Use modulefinder to collect dependencies.
  2. Try to call module.__version__, module.get_version() or other common ways of storing version information on each collected module.
  3. Collect all calls to subprocess.Popen and try to get a version number by parsing the output obtained with different arguments such as -version, -v, -?, -h, ... (a rough sketch of steps 2 and 3 follows this list).
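
A rough sketch of what steps 2 and 3 might look like; the attribute names and flags probed here are just common conventions rather than an exhaustive set, and the modules found in step 1 would still need to be imported before they can be probed this way:

    import subprocess

    def module_version(module):
        # Step 2: probe the attributes commonly used to store a version string.
        for attr in ("__version__", "version", "VERSION"):
            value = getattr(module, attr, None)
            if value is not None:
                return str(value)
        # Some packages expose a callable instead of a plain attribute.
        get_version = getattr(module, "get_version", None)
        if callable(get_version):
            return str(get_version())
        return None

    def external_version(executable):
        # Step 3: try the usual version flags and keep the first line of output.
        for flag in ("--version", "-version", "-v", "-V"):
            try:
                result = subprocess.run([executable, flag],
                                        capture_output=True, text=True, timeout=5)
            except (OSError, subprocess.TimeoutExpired):
                continue
            output = (result.stdout or result.stderr).strip()
            if output:
                return output.splitlines()[0]
        return None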

Steps 2 and 3 could be greatly improved by using distribution-specific tools (Debian in my case), such as dpkg, to get the versions of installed packages. The downside is that it becomes not only OS-specific but also distribution-specific; however, I do realize that the initial approach is extremely inefficient and error-prone, if functional at all.
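
For the dpkg route, the lookup itself is small; a minimal sketch (the package names in the comment are only illustrative):

    import subprocess

    def dpkg_version(package):
        # Ask dpkg-query for the installed version of a Debian package,
        # e.g. dpkg_version("ghostscript") or dpkg_version("python3-numpy").
        result = subprocess.run(["dpkg-query", "-W", "-f=${Version}", package],
                                capture_output=True, text=True)
        return result.stdout.strip() if result.returncode == 0 else None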

So my question is whether there is any package out there to address this, or whether anyone has a better suggestion on how to implement it?

Comments (2)

倦话 2024-08-26 06:10:51

Two years passed and some new tools came into play.

To take care of points 1 and 2 there is now pip freeze, especially when coupled with virtualenv.
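
Snapshotting the environment from inside the running script is little more than a wrapper around pip freeze; a minimal sketch (the output file name is arbitrary):

    import subprocess
    import sys

    def record_frozen_requirements(path="requirements-at-runtime.txt"):
        # Run `pip freeze` with the interpreter executing this script, so the
        # snapshot reflects the active virtualenv rather than the system Python.
        frozen = subprocess.run([sys.executable, "-m", "pip", "freeze"],
                                capture_output=True, text=True, check=True)
        with open(path, "w") as fh:
            fh.write(frozen.stdout)
        return frozen.stdout.splitlines()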

For external applications I had to resort to the package manager on the system (apt-file, dpkg, qlist, emerge, ... depending on the system), which came with all the expected pitfalls.
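
On a dpkg-based system, mapping an external executable back to its owning package and version could look roughly like this (the other package managers listed would need their own variants, and the sketch assumes the binary was installed through the package manager):

    import shutil
    import subprocess

    def debian_package_for(executable):
        # Resolve the executable on PATH, find its owning package with
        # `dpkg -S`, then query that package's installed version.
        path = shutil.which(executable)
        if path is None:
            return None
        owner = subprocess.run(["dpkg", "-S", path], capture_output=True, text=True)
        if owner.returncode != 0:
            return None
        package = owner.stdout.split(":", 1)[0]
        version = subprocess.run(["dpkg-query", "-W", "-f=${Version}", package],
                                 capture_output=True, text=True)
        return package, version.stdout.strip()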

I would still like to make a reference to Ross' answer as it produced the desired result. The disadvantage (from my point of view) is that it collected a lot more dependencies than strictly required.

玩物 2024-08-26 06:10:51

Instead of directly investigating your dependencies in Python, and if you are running on Linux/Unix, you can wrap your program with memoize.py. memoize.py uses strace, or "system trace", to watch all system read and write calls that your program generates and then stores these dependencies to a file. memoize.py may also be used as a library, so you can use its functions inside your program and get the dependencies in a Python data structure.
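
memoize.py's own library API is not reproduced here; the underlying idea it builds on can be sketched with a plain strace wrapper (exact flag spelling may vary between strace versions):

    import subprocess

    def run_with_file_trace(cmd, trace_path="deps.strace"):
        # Run the command under strace, following child processes (-f) and
        # logging file-related system calls to trace_path for later parsing.
        subprocess.run(["strace", "-f", "-e", "trace=%file", "-o", trace_path] + cmd)
        return trace_path

    # e.g. run_with_file_trace(["python3", "my_script.py"])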
