Website load testing with Google App Engine

Posted 2024-10-19 16:14:34

Azure, Amazon, and other instance-based cloud providers can be used to carry out website load tests (by spinning up numerous instances running programs that send requests to a set of URLs), and I was wondering if I would be able to do this with Google App Engine.

So far, however, it seems this is not the case. The only implementation I can think of at the moment is setting up the maximum number of cron jobs, each executing at the highest allowed frequency, with each task requesting a bunch of URLs and at the same time popping further tasks into the task queue, along the lines of the sketch below.
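A minimal sketch of that idea on the classic App Engine Python runtime; the /tasks/load path, the TARGET_URLS list, and the handler name are hypothetical placeholders, not details from the question:

    # Sketch only: the handler path and target URLs below are assumptions.
    from google.appengine.api import taskqueue, urlfetch
    from google.appengine.ext import webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    TARGET_URLS = ['http://example.com/a', 'http://example.com/b']

    class LoadTask(webapp.RequestHandler):
        def get(self):
            # Each invocation (from cron or the task queue) requests
            # a batch of URLs...
            for url in TARGET_URLS:
                urlfetch.fetch(url, deadline=10)
            # ...and pops a further task into the queue so load keeps
            # flowing between the one-minute cron ticks.
            taskqueue.add(url='/tasks/load', method='GET')

    application = webapp.WSGIApplication([('/tasks/load', LoadTask)])

    if __name__ == '__main__':
        run_wsgi_app(application)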

According to my calculations this is only enough to fire off a maximum of 25 concurrent requests (as an application can have a maximum of 20 cron tasks, each executing no more frequently than once a minute, and the default queue has a throughput rate of 5 task invocations per second).
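For context, those two ceilings live in the cron and queue configuration files; a sketch of both, with placeholder descriptions and handler path:

    # cron.yaml (sketch): up to 20 entries, each at the highest
    # allowed frequency of once a minute.
    cron:
    - description: load kicker 1
      url: /tasks/load
      schedule: every 1 minutes
    # ... repeated up to the 20-job limit

    # queue.yaml (sketch): the default queue's 5/s throughput cap.
    queue:
    - name: default
      rate: 5/s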

Any ideas if there is a way I could have more concurrent requests fetching URLs in an automated way?

Comments (2)

东北女汉子 2024-10-26 16:14:34

The taskqueue API allows 100 task invocations per second per queue, with the following maximum active-queue quotas:

Free: 10 active queues (not including the default queue)

Billing: 100 active queues (not including the default queue)

With a single UrlFetch per task, multiplying [max number of active queues] * [max task invocations per second] * [60 seconds], you can reach these nominal UrlFetch call rates:

Free: 11 * 100 * 60 = 66,000 UrlFetch calls/minute (10 named queues plus the default)

Billing: 101 * 100 * 60 = 606,000 UrlFetch calls/minute (100 named queues plus the default)

These rates are, however, capped by the allowed UrlFetch calls-per-minute quota:

Free: 3,000 calls/minute

Billing: 32,000 calls/minute

As you can see, the Taskqueue + UrlFetch APIs can be used effectively to suit your load-testing needs; a queue configuration along those lines is sketched below.
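A queue.yaml sketch of that setup (queue names and count are placeholders); the enqueuing code would then spread tasks across the named queues, e.g. via taskqueue.add(queue_name='load-1', ...):

    # queue.yaml (sketch): named queues at the 100/s ceiling, on top
    # of the default queue.
    queue:
    - name: load-1
      rate: 100/s
    - name: load-2
      rate: 100/s
    # ... up to 10 (free) or 100 (billed) named queues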

不寐倦长更 2024-10-26 16:14:34

Load testing against a public URL may not be as accurate as getting boxes attached directly to the same switch as your target server. There are so many uncontrollable network effects.

Depending on your exact circumstances, I would recommend borrowing a few desktop boxes for the purpose and using them. Any half-decent machine should be able to generate 2-3 thousand calls a minute, along the lines of the sketch below.
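A minimal desktop load generator, as a sketch only, assuming era-appropriate Python 2 with just the standard library; TARGET and THREADS are placeholder values:

    import threading
    import time
    import urllib2

    TARGET = 'http://example.com/'  # placeholder target URL
    THREADS = 10  # a handful of threads yields thousands of calls/minute

    def worker():
        # Hammer the target in a tight loop from each thread.
        while True:
            try:
                urllib2.urlopen(TARGET, timeout=10).read()
            except Exception:
                pass  # a real test would record failures and latencies

    for _ in range(THREADS):
        t = threading.Thread(target=worker)
        t.daemon = True
        t.start()

    time.sleep(60)  # generate load for one minute, then exit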

That said, it really depends on the target scale you wish to achieve.
