Python accelerator
I'm planning to use Python to develop a web application. Does anybody have any idea about an accelerator for Python (something like eAccelerator or APC for PHP)? If not, is there any way to cache the pre-compiled Python bytecode?
Any idea about the performance comparison between Python and PHP (assuming db/network latencies are the same)?
Thanks in advance.
Comments (4)
There's a trick to this. It's called mod_wsgi. The essence of it works like this.

For "static" content (.css, .js, images, etc.) put them in a directory so they're served by Apache, without your Python program knowing they were sent.

For "dynamic" content (the main HTML page itself) you use mod_wsgi to fork a "back-end" process that runs outside of Apache. This is faster than PHP because now several things are going on at once: Apache has dispatched the request to a backend process and then moved on to handle the next request while the backend is still running the first one.

Also, when you've sent your HTML page, the follow-on requests are handled by Apache without your Python program knowing or caring what's going on. This leads to huge speedups. Nothing to do with the speed of Python; everything to do with the overall architecture.
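A minimal sketch of what the back-end side of this might look like (the file name hello.wsgi and the Apache paths are my own illustrative assumptions, not part of the answer above): mod_wsgi looks for a callable named application and runs it in a long-lived process, while Apache itself serves everything under the static alias.

```python
# hello.wsgi -- a minimal WSGI application that mod_wsgi can run in a
# daemon ("back-end") process outside the main Apache workers.
# (File name and paths below are illustrative assumptions.)

def application(environ, start_response):
    # Only the "dynamic" HTML page is produced here; static files
    # (.css, .js, images) are served directly by Apache.
    body = b"<html><body>Hello from the WSGI backend</body></html>"
    start_response("200 OK", [
        ("Content-Type", "text/html"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

# A corresponding Apache configuration might look roughly like:
#   Alias /static/ /var/www/myapp/static/
#   WSGIDaemonProcess myapp processes=2 threads=15
#   WSGIScriptAlias / /var/www/myapp/hello.wsgi process-group=myapp
```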
As long as you do trivial amounts of work in your "main script" (the one you directly invoke with python and which gets a __name__ of __main__), you need not worry about "caching the pre-compiled Python bytecode": when you import foo, foo.py gets saved to disk (same directory) as foo.pyc, as long as that directory is writable by you, so the already-cheap compilation to bytecode happens once and "forever after" Python will load foo.pyc directly in every new process that does import foo -- within a single process, every import foo except the first one is just a fast lookup into a dictionary in memory (the sys.modules dictionary). A core performance idea in Python: make sure every bit of substantial code happens within def statements in modules -- don't have any at module top level, in the main script, or especially within exec and eval statements/expressions!-)

I have no benchmarks for PHP vs Python, but I've noticed that Python keeps getting optimized pretty noticeably with every new release, so make sure you compare a recent release (ideally 2.7, at least 2.6) if you want to see "the fastest Python". If you don't find it fast enough yet, cython (a Python dialect designed to compile directly into C, and thence into machine code, with some limitations) is today the simplest way to selectively optimize those modules which profiling shows need it.
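A small sketch of the layout described above (the module and function names are made up for illustration): the substantial code lives inside def statements in an importable module, and the main script only imports and dispatches.

```python
# foo.py -- put the real work inside functions, not at module top level.
# The first "import foo" compiles this file to bytecode and caches it on
# disk (foo.pyc on Python 2, __pycache__/foo.*.pyc on Python 3).

def handle_request(name):
    """All substantial work happens inside a def, never at top level."""
    return "Hello, %s!" % name


# main.py -- the "main script" stays trivial: import and dispatch only.
#
#   import foo
#
#   if __name__ == "__main__":
#       print(foo.handle_request("world"))
#
# Repeated "import foo" calls in the same process are just a dictionary
# lookup in sys.modules; new processes reload the cached bytecode.
```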
Others have mentioned Python bytecode files, but that is largely irrelevant. This is because hosting mechanisms for Python, with the exception of CGI, keep the Python web application in memory between requests. This is different from PHP, which effectively throws away the application between requests. As such, Python doesn't need an accelerator, because the way Python web hosting mechanisms work avoids the problem that PHP has.
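A hypothetical sketch of that difference (the module and variable names are my own): because the hosting process stays alive between requests, module-level state in a WSGI application persists from one request to the next, which a run-and-discard PHP script cannot do without an external cache.

```python
# counter_app.py -- hypothetical example: the process (and this module)
# stays loaded between requests, so module-level state persists.

import threading

_lock = threading.Lock()
_request_count = 0  # survives across requests within the same process

def application(environ, start_response):
    global _request_count
    with _lock:
        _request_count += 1
        count = _request_count
    body = ("This process has served %d requests" % count).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```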
The compiled Python bytecode is cached in .pyc files automatically in every environment I have seen. There is no need to do anything else as far as I know.
If you want to generate these files directly you can use: http://docs.python.org/library/py_compile.html
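For instance, a short sketch using the standard-library py_compile and compileall modules (the file and directory names here are placeholders):

```python
import py_compile
import compileall

# Compile a single source file to bytecode (writes foo.pyc, or
# __pycache__/foo.*.pyc on Python 3, next to the source).
py_compile.compile("foo.py")

# Or pre-compile every module under a directory tree in one go.
compileall.compile_dir("myproject/", quiet=1)
```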