How to scale CherryPy?
I did a web search on how to scale a CherryPy server and didn't find much information, so I was wondering if there is a guideline on this subject. We are planning to run two CherryPy instances for a consumer-facing application. Backend caching and static-file caching are already handled; we just need to serve a large number of simple GET requests.

- How do we scale the front end?
- By default, cherrypy server.thread_pool is 10. When I increase it to 50 or 100 and run my load tests against it, it seems to freeze the server. Most resources I found use a number somewhere between 30 and 50. (A minimal configuration sketch follows below.)
- Are there other techniques for scaling to thousands of simultaneous users?

Thanks!
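For readers hitting the same thread_pool question, here is a minimal sketch of how that setting is applied. The specific numbers (30 threads, queue size 30), the port, and the extra keys are illustrative assumptions, not recommendations from the answer below.

```python
import cherrypy

class Root:
    @cherrypy.expose
    def index(self):
        # Stand-in for the simple GET handlers described above.
        return "hello"

if __name__ == "__main__":
    cherrypy.config.update({
        "server.socket_host": "0.0.0.0",
        "server.socket_port": 8080,     # illustrative port
        # Default is 10. Raise it gradually while load testing; a very
        # large pool can make things worse if handlers block on I/O.
        "server.thread_pool": 30,
        # Listen backlog for connections waiting on a free thread.
        "server.socket_queue_size": 30,
        # Typical production settings: no autoreload, no console logging.
        "engine.autoreload.on": False,
        "log.screen": False,
    })
    cherrypy.quickstart(Root())
```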
1 Answer
Here are two sites you should take a look at...
http://yougov.github.com/pycon/slides/
http://www.readmespot.com/question/f/100994/cherrypy--load-balancing--and-remote-failover
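Those links cover load balancing and remote failover. As a rough companion sketch (my own assumption about the setup, not taken from the slides), one common pattern is to run each CherryPy instance on its own port and put a reverse proxy such as nginx or HAProxy in front of both:

```python
import sys
import cherrypy

class Root:
    @cherrypy.expose
    def index(self):
        return "hello"

if __name__ == "__main__":
    # Hypothetical launcher: start the same app once per instance, e.g.
    #   python app.py 8080
    #   python app.py 8081
    # and point the load balancer at both ports.
    port = int(sys.argv[1]) if len(sys.argv) > 1 else 8080
    cherrypy.config.update({
        "server.socket_host": "127.0.0.1",  # only the proxy talks to us
        "server.socket_port": port,
        # Rebuild request.base from the X-Forwarded-* headers the proxy
        # sends, so redirects point at the public host, not 127.0.0.1.
        "tools.proxy.on": True,
    })
    cherrypy.quickstart(Root())
```

Adding capacity is then just another process on another port plus one more upstream entry in the proxy configuration.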