WSGI servers for Python 3 (PEP 3333)

Posted on 2024-12-04 12:14:16

What WSGI servers are available for Python 3 and PEP 3333?


Comments (7)

无法言说的痛 2024-12-11 12:14:17

As pointed out by Gabriel, Apache/mod_wsgi 3.X supports Python 3. Other options are CherryPy WSGI server and uWSGI.

Waitress

Waitress is meant to be a production-quality pure-Python WSGI server with very acceptable performance. It has no dependencies except ones which live in the Python standard library. It runs on CPython on Unix and Windows under Python 2.6+ and Python 3.2. It is also known to run on PyPy 1.6.0 on UNIX. It supports HTTP/1.0 and HTTP/1.1.

Here is a quote from their website on why they wrote it:

Why?

At the time of the release of Waitress, there are already many pure-Python WSGI servers. Why would we need another?

Waitress is meant to be useful to web framework authors who require broad platform support. It's neither the fastest nor the fanciest WSGI server available but using it helps eliminate the N-by-M documentation burden (e.g. production vs. deployment, Windows vs. Unix, Python 3 vs. Python 2, PyPy vs. CPython) and resulting user confusion imposed by spotty platform support of the current (2012-ish) crop of WSGI servers. For example, gunicorn is great, but doesn't run on Windows. paste.httpserver is perfectly serviceable, but doesn't run under Python 3 and has no dedicated test suite that would allow someone who did a Python 3 port to know it worked after a port was completed. wsgiref works fine under most any Python, but it's a little slow and it's not recommended for production use as it's single-threaded and has not been audited for security issues.

At the time of this writing, some existing WSGI servers already claim wide platform support and have serviceable test suites. The CherryPy WSGI server, for example, targets Python 2 and Python 3 and it can run on UNIX or Windows. However, it is not distributed separately from its eponymous web framework, and requiring a non-CherryPy web framework to depend on the CherryPy web framework distribution simply for its server component is awkward. The test suite of the CherryPy server also depends on the CherryPy web framework, so even if we forked its server component into a separate distribution, we would have still needed to backfill for all of its tests. The CherryPy team has started work on Cheroot, which should solve this problem, however.

Waitress is a fork of the WSGI-related components which existed in zope.server. zope.server had passable framework-independent test coverage out of the box, and a good bit more coverage was added during the fork. zope.server has existed in one form or another since about 2001, and has seen production usage since then, so Waitress is not exactly "another" server, it's more a repackaging of an old one that was already known to work fairly well.
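A minimal sketch of what serving a PEP 3333 application with Waitress looks like (the host and port values are illustrative):

```python
# A bare-bones PEP 3333 application; pass it to Waitress's serve().
def app(environ, start_response):
    body = b"Hello from Waitress\n"  # PEP 3333: body chunks must be bytes
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

# To serve it (requires `pip install waitress`):
#   from waitress import serve
#   serve(app, host="127.0.0.1", port=8080)
```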

£噩梦荏苒 2024-12-11 12:14:17

Apparently, the latest version of mod_wsgi (3.3) is compatible with Python 3 according to PEP 3333:

http://code.google.com/p/modwsgi/

"The original WSGI specification (PEP 333) only supports Python 2.X. There is support in mod_wsgi for Python 3.X which is based on guesses as to what the WSGI specification would look like for Python 3.X. The new WSGI specification (PEP 3333) has finally now been accepted and although some tweaks need to be made to mod_wsgi to make it more strict, if you write your Python 3 WSGI application according to PEP 3333, it will work perfectly fine on mod_wsgi. If you wish to experiment with Python 3.X, you will need to use Python 3.1 or later."
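The practical upshot of PEP 3333 for Python 3 code, which the quote alludes to: the status line and headers are native str, while every body chunk must be bytes. A minimal compliant application might look like this (mod_wsgi looks for a module-level callable named `application` by default):

```python
# PEP 3333 on Python 3: status and header names/values are native str,
# but the iterable returned by the application must yield bytes.
def application(environ, start_response):
    status = "200 OK"                          # str, not bytes
    body = "héllo, PEP 3333".encode("utf-8")   # bytes on the wire
    headers = [("Content-Type", "text/plain; charset=utf-8"),
               ("Content-Length", str(len(body)))]
    start_response(status, headers)
    return [body]
```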

陪你到最终 2024-12-11 12:14:17

I like the Rocket web server, particularly because it is itself written in Python. It also has an API for creating other kinds of servers. I was able to adapt it into an XMLRPC server with a minimal amount of effort.

落叶缤纷 2024-12-11 12:14:17

wsgiref, which is a part of the standard library.
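A quick sketch using wsgiref from the standard library; `wsgiref.validate.validator` is also handy for checking that an app follows PEP 3333:

```python
from wsgiref.simple_server import make_server
from wsgiref.validate import validator

def app(environ, start_response):
    body = b"wsgiref works\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# validator() raises an error if the app violates PEP 3333
httpd = make_server("127.0.0.1", 0, validator(app))  # port 0: pick a free port
print("Listening on port", httpd.server_address[1])
# httpd.serve_forever()  # uncomment to serve until interrupted
```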

假装爱人 2024-12-11 12:14:17

Phusion Passenger supports PEP333 since version 3.0. Originally a Ruby app server, it now officially supports Python. Amongst the features are:

  • Integrates directly into the web server (similar to how mod_wsgi works) but runs all app processes outside the web server.
  • Apache and Nginx support.
  • Evented internal I/O architecture.
  • Multiprocess application worker I/O architecture. Phusion Passenger buffers all input and output in order to protect applications from slow clients.
  • Output buffering with real-time flushing.
  • Dynamic spawning and stopping of worker processes based on traffic.
  • Automatic user switching, a convenient security feature.
  • Rolling restarts.

Phusion Passenger is currently used by large organizations such as The New York Times, Airbnb, Pixar, and Symantec.
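For Python apps, Passenger conventionally loads a `passenger_wsgi.py` file at the application root that exposes a WSGI callable named `application`; a minimal sketch:

```python
# passenger_wsgi.py -- Passenger imports this module and serves the
# module-level `application` callable; no server loop is needed here,
# because Passenger itself runs inside Apache or Nginx.
def application(environ, start_response):
    body = b"Served by Phusion Passenger\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```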

一江春梦 2024-12-11 12:14:17

bjoern: Fast And Ultra-Lightweight HTTP/1.1 WSGI Server

Quoting the README:

Why It's Cool

bjoern is the fastest, smallest and most lightweight WSGI server out there, featuring

  • ~ 1000 lines of C code
  • Memory footprint ~ 600KB
  • Python 2 and Python 3 support
  • Single-threaded and without coroutines or other crap
  • Can bind to TCP host:port addresses and Unix sockets
  • Full persistent connection ("keep-alive") support in both HTTP/1.0 and 1.1, including support for HTTP/1.1 chunked responses

If you take a look at WSGI server benchmarks on the web (e.g. A Performance Analysis of Python WSGI Servers), the performance boost is indeed astonishing:

(benchmark chart omitted)

How to install

$ pip install bjoern

Prerequisites

You will need the gcc and libev packages installed before building bjoern:

  • Debian/Ubuntu:

    $ sudo apt install build-essential
    $ sudo apt install libev-dev
    
  • RHEL/CentOS:

    $ sudo yum groupinstall 'Development Tools'
    $ sudo yum install libev-devel
    
  • Fedora:

    $ dnf groupinstall 'Development Tools'
    $ sudo dnf install libev-devel
    
  • MacOS:

Install Xcode from the App Store to get gcc; for libev, either build from source or install via Homebrew:

    $ brew install libev
    
  • Windows

    Unfortunately, Windows is not supported because libev is not available.
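Once the prerequisites above are in place, usage is essentially a one-liner around a standard WSGI app (a sketch; the host and port are illustrative):

```python
def app(environ, start_response):
    body = b"Hello from bjoern\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Requires `pip install bjoern`; bjoern.run() blocks and serves forever:
#   import bjoern
#   bjoern.run(app, "127.0.0.1", 8000)
# Per its README, bjoern can also bind a Unix socket:
#   bjoern.run(app, "unix:/tmp/bjoern.sock")
```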

meinheld: a high performance asynchronous WSGI Web Server (based on picoev)

Quoting the official site:

Meinheld is a high-performance WSGI-compliant web server that takes advantage of greenlet and picoev to enable asynchronous network I/O in a light-weight manner.

Similar to bjoern, meinheld is mostly written in C for speed and is built around a high-performance event library (picoev, versus the libev used by bjoern). Because of that, you'll also need to have GCC installed in order to build meinheld's C extension. Installable on Linux, MacOS and FreeBSD.

Installation

$ pip install meinheld
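meinheld's documented entry point is the listen/run pair; a sketch (the address is illustrative):

```python
def app(environ, start_response):
    body = b"Hello from meinheld\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Requires `pip install meinheld`; server.run() blocks:
#   from meinheld import server
#   server.listen(("127.0.0.1", 8000))
#   server.run(app)
```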