Low-latency web server/load balancer for the non-Twitter world
Apache httpd has done me well over the years: just rock solid and highly performant in a legacy custom LAMP stack application I've been maintaining (read: trying to escape from).
My LAMP stack days are now numbered, and I am moving on to the wonderful world of polyglot development:
1) Scala REST framework on Jetty 8 (on the fence between Spray & Scalatra)
2) Load balancer/Static file server: Apache Httpd, Nginx, or ?
3) MySQL via ScalaQuery
4) Client-side: jQuery, Backbone, 320 & up or Twitter Bootstrap
Option #2 is the focus of this question. The benchmarks I have seen indicate that Nginx, Lighttpd, G-WAN (in particular) and friends blow away Apache in terms of performance, but this advantage appears to manifest mostly in high-load scenarios where the web server is handling many simultaneous connections. Given that our server pushes at most 100 GB of bandwidth per month and average load is around 0.10, the high-load scenario is clearly not at play.
Basically, I need the connection to the application server (Jetty) and static file delivery by the web server to be both reliable and fast. Finally, the web server should do double duty as a load balancer for the application server (SSL is not required; the server lives behind an ASA). I am not sure how fast Apache httpd is compared to the alternatives, but it's proven, road-warrior-tested software.
So, if I roll with Nginx or another Apache alternative, will there be any difference whatsoever in terms of visible performance? I assume not, but in the interest of achieving near-instant page loads, I'm putting the question out there ;-)
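For context, the setup being asked about (static files served directly, dynamic requests load-balanced across Jetty) would look roughly like this in nginx terms. This is a sketch under assumptions, not a tested configuration: the backend ports, server name, and document root are all hypothetical placeholders.

```nginx
# Hypothetical nginx front end: static files served off disk,
# everything else load-balanced across two Jetty instances.
upstream jetty_backend {
    # Assumed Jetty ports; adjust to the actual deployment.
    server 127.0.0.1:8080;
    server 127.0.0.1:8081;
}

server {
    listen 80;
    server_name example.com;   # placeholder

    # Static assets served directly by nginx.
    location /static/ {
        root /var/www;         # assumed document root
        expires 30d;
    }

    # Everything else proxied to the Jetty pool.
    location / {
        proxy_pass http://jetty_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

The equivalent in Apache httpd would use mod_proxy and mod_proxy_balancer; either server can express this topology, which is part of why the question comes down to performance rather than capability.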
2 Answers
Yes, mostly in terms of latency.
According to Google (who might know a thing or two about latency), latency matters for the user experience, for high search-engine rankings, and for surviving high loads (success, script kiddies, real attacks, etc.).
But scaling on multicore and/or using less RAM and CPU resources cannot hurt - and that's the purpose of these Web server alternatives.
The benchmarks show that even at low numbers of clients, some servers are faster than others; the comparison covers Apache 2.4, Nginx, Lighttpd, Varnish, Litespeed, Cherokee, and G-WAN.
Since this test was made by someone independent of the authors of those servers, these tests (made with virtualization and 1, 2, 4, and 8 CPU cores) have clear value.
There will be a massive difference. Nginx wipes the floor with Apache for anything over zero concurrent users. That's assuming you properly configure everything. Check out the following links for some help diving into it.
http://wiki.nginx.org/Main
http://michael.lustfield.net/content/dummies-guide-nginx
http://blog.martinfjordvald.com/2010/07/nginx-primer/
You'll see improvements in terms of requests per second, but you'll also see significantly lower RAM and CPU usage. One thing I like is the greater control over what's going on, with a simpler configuration.
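To make the "properly configure everything" point concrete, the handful of directives people usually start with look something like this. The values are illustrative assumptions, not recommendations; tune them for your own hardware and workload:

```nginx
# Illustrative nginx tuning directives (values are assumptions).
worker_processes auto;       # one worker per CPU core

events {
    worker_connections 1024; # per-worker connection cap
}

http {
    sendfile    on;          # kernel-level static file transfer
    tcp_nopush  on;          # send headers and file start together
    keepalive_timeout 30;    # reuse connections, drop idle ones
    gzip        on;          # compress text responses
}
```

Even with near-default settings nginx is light on resources, which is why the simpler configuration is a selling point rather than a liability.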
Apache made a claim that httpd 2.4 would offer performance as good as or better than nginx. It was a bold claim calling out nginx, and when they made that release it kind of bit them in the ass. Sure, they're closer now, but nginx still wipes the floor in almost every single benchmark.