Is it realistic for a single server to handle 2,000 HTTP requests per second?
I am building a Java-based web service (using JSON as a data encoding) that will need to handle up to 2,000 HTTP requests per second. The processing required for each request is almost negligible (a HashMap.put() method call); parsing the JSON would probably be the dominant overhead.
I am wondering whether a single High-Memory Quadruple Extra Large EC2 instance (68GB RAM, 8 cores, 64-bit) would be capable of handling as many as 2,000 HTTP requests per second?
I realize that an exact answer will be difficult; I'm just wondering whether this is within the bounds of possibility, or whether I'm smoking crack.
I'm currently using the SimpleWeb web framework, although it no longer seems to be maintained. Can anyone recommend alternative embeddable HTTP servers that would be well suited to this kind of high-volume usage?
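For a rough sanity check, the arithmetic can be done up front: 2,000 requests/s spread over 8 cores leaves about 4 ms of CPU time per request, a huge budget for a small JSON parse plus a map put. A sketch of that back-of-envelope calculation (the numbers come from the question; the figure assumes a perfectly even spread across cores and no I/O stalls):

```java
public class Budget {
    public static void main(String[] args) {
        int targetRps = 2_000;   // target from the question
        int cores = 8;           // the instance described in the question

        double perCoreRps = (double) targetRps / cores;   // 250 requests/s per core
        double budgetMicros = 1_000_000.0 / perCoreRps;   // CPU-time budget per request

        // 4000 us per request, versus microseconds for parsing a small JSON
        // message: the workload should be comfortably within reach.
        System.out.printf("budget: %.0f us per request per core%n", budgetMicros);
    }
}
```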
2 Answers
2000 requests per second (or 2 krps) should be well within the realm of possibility for a Java servlet, provided that you don't introduce huge bottlenecks and that the framework you are using doesn't suck too much. Given that apparently you are not accessing any backends, the task should be CPU-bound and scale very well.
The JSON serialization test of the Web Framework Benchmarks shows many Java frameworks with very good results; even with 20 database queries, the results are still well over 2 krps. On Amazon they use m1.large instances, which are smaller than the one you plan to use (a c3.4xlarge, I gather).
You might try Undertow which provides a convenient servlet API and is well maintained. Netty is another possibility, although it has its own API.
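As a minimal sketch of the embedded-server shape the question calls for (shown with the JDK's built-in com.sun.net.httpserver so it runs without extra dependencies; Undertow's builder API follows the same start-a-listener-and-register-a-handler pattern). The /put endpoint and the key=value body format are made up for illustration; in the real service the handler body would be the JSON parse plus the map insert:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;

public class IngestServer {
    // ConcurrentHashMap rather than the question's HashMap: handlers run
    // concurrently on the worker pool, and a bare HashMap is not thread-safe.
    static final ConcurrentHashMap<String, String> store = new ConcurrentHashMap<>();

    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/put", exchange -> {
            String body = new String(exchange.getRequestBody().readAllBytes(),
                                     StandardCharsets.UTF_8);
            // Stand-in for the JSON parse: a hypothetical "key=value" body.
            String[] kv = body.split("=", 2);
            if (kv.length == 2) store.put(kv[0], kv[1]);
            byte[] resp = "ok".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, resp.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(resp);
            }
        });
        server.setExecutor(Executors.newFixedThreadPool(8)); // roughly one worker per core
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        start(8080);
        System.out.println("listening on 8080");
    }
}
```

Note the map swap: as soon as requests are served from a thread pool, the single HashMap.put() the question mentions needs to become a ConcurrentHashMap.put() (or be externally synchronized).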
Note: I realize that the question is a bit old, but the problem should still be valid.
This is definitely possible: according to this question, Netty can handle well over 100,000 interactions per second. A JSON parser can convert the request string into a JSON object, or you could even use its binary variant, BSON, as described here (if the messages are long or very complex). From this question it looks like the number of connections a server with a recent operating system can handle is over 300,000, far more than your task would need.
Of course, this also depends on which actions you need to take to handle each request; those may be the limiting factor.
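One quick way to check whether the per-request action really is negligible is a crude timing loop. This is not a rigorous benchmark (use JMH for real numbers), but it gives the order of magnitude; a HashMap.put typically lands in the tens to hundreds of nanoseconds:

```java
import java.util.HashMap;

public class PutCost {
    // Returns the average cost of one put, in nanoseconds, over n iterations.
    static long nsPerPut(int n) {
        HashMap<String, Integer> map = new HashMap<>();
        long t0 = System.nanoTime();
        for (int i = 0; i < n; i++) {
            map.put("key" + (i % 10_000), i);  // ~10k distinct keys, like a live service
        }
        return (System.nanoTime() - t0) / n;
    }

    public static void main(String[] args) {
        nsPerPut(1_000_000);  // warm-up run so the JIT compiles the loop first
        System.out.println("~" + nsPerPut(1_000_000) + " ns per put (incl. key formatting)");
    }
}
```

If the figure that comes back is orders of magnitude below your per-request budget, it is the JSON parsing, not the map insert, that deserves attention.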