App Engine downtime



Does Google's App Engine have excessive downtime, specifically with regards to datastore writes?

Additionally, downtime seems to be scheduled during high traffic times, e.g., in the middle of the afternoon vs. 3:00AM in the morning. Is this normal? Will it improve as the technology matures?


Comments (1)

只是偏爱你 2024-09-18 00:29:44


Short Answers

  1. Afternoon vs. early-morning downtime. The datastore has been unavailable for writes about 20-30% more often in the afternoon than in the wee hours of the morning (Pacific time; includes put, update, and delete availability). A rough way to reproduce this comparison from the raw data is sketched after the Raw data section below.

    Note: I'm sure Google would like downtime to occur during off-peak hours. Thus I expect they'll continue to try to minimize downtime, or schedule it for off-peak hours whenever possible.

  2. Downtime trending. The number of 15-minute periods during which the datastore has been unavailable has been decreasing. Over the past 366 days, the datastore was unavailable for an average of 3.8 fifteen-minute periods per day. Over the past 200 days that has fallen to 2.3 per day, roughly a 60% drop from the rate in the preceding months. Write downtime over the past few months has actually been quite good: since March 1st there have been fewer than 0.25 fifteen-minute blocks of write downtime per day. Here's a graph of datastore write downtime:
    Downtime trending http://imagebin.ca/img/4wkHVQPc.png


Source of Answers

To answer your question, I wrote this script (http://gist.github.com/483827), which extracts downtime data from GAE's Datastore Status page.
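
The core idea is just fetching each day's status-detail page and counting the 15-minute slots that are not reported as healthy. The snippet below is a minimal sketch of that idea, not the linked gist itself: the URL pattern and the marker string it searches for are assumptions, so check the gist (or the actual page markup) before relying on it.

# Minimal sketch (NOT the author's gist): count 15-minute slots that a day's
# App Engine status detail page flags as degraded.
# ASSUMPTIONS: the URL pattern and the 'anomaly' marker are guesses standing in
# for whatever the real page uses; adjust them to match the actual markup.
import urllib2

STATUS_URL = "http://code.google.com/status/appengine/detail/datastore/%04d/%02d/%02d"

def downtime_slots(year, month, day, marker='anomaly'):
    # Fetch the detail page and count occurrences of the (hypothetical) marker
    # that flags a degraded 15-minute slot.
    html = urllib2.urlopen(STATUS_URL % (year, month, day)).read()
    return html.count(marker)

if __name__ == '__main__':
    print(downtime_slots(2010, 7, 20))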


Graphs

Datastore write downtime from 2009-Jul-20 to 2010-Jul-20 (4 hour bins):

alt text http://imagebin.ca/img/p9ScWTm.png

Datastore write downtime from 2009-Jul-20 to 2010-Jul-20 (1 hour bins):

alt text http://imagebin.ca/img/9FbLut2G.png

Datastore downtime from 2009-Jul-20 to 2010-Jul-20 (4 hour bins):

alt text http://imagebin.ca/img/t3XKLk.png

Datastore downtime from 2010-Jan-01 to 2010-Jul-20 (4 hour bins):

alt text http://imagebin.ca/img/k36T9h.png


Raw data

(you can tweak the variables at the top of the script if you'd like to collect your own data with slightly different parameters):

# RAW Data: Each element counts the number of days in which the datastore
# was unavailable for at least some portion of a given 15-minute window. The
# first element corresponds to the time chunk from 00:00 to 00:15, and so on.
RESULTS_SINCE_2010JAN01_BIN15 = [0, 0, 0, 0, 3, 11, 3, 3, 3, 3, 12, 3, 3, 3, 4, 14, 4, 4, 4, 4, 12, 2, 2, 2, 2, 14, 4, 4, 4, 4, 11, 2, 2, 2, 2, 11, 5, 5, 5, 5, 13, 4, 4, 4, 4, 14, 7, 5, 5, 5, 14, 4, 3, 3, 3, 13, 2, 2, 2, 2, 12, 5, 4, 4, 4, 14, 5, 3, 3, 3, 12, 7, 2, 2, 2, 5, 5, 0, 0, 0, 2, 9, 3, 2, 2, 2, 10, 1, 1, 1, 2, 9, 3, 3, 3, 15]
RESULTS_SINCE_2009JUL20_BIN15 = [0, 0, 0, 0, 11, 21, 5, 5, 5, 5, 29, 6, 6, 6, 7, 38, 11, 11, 11, 11, 37, 7, 7, 7, 7, 44, 12, 12, 12, 12, 37, 10, 10, 10, 10, 34, 7, 7, 7, 7, 46, 11, 11, 11, 11, 39, 15, 13, 13, 13, 44, 13, 12, 12, 12, 44, 5, 5, 5, 5, 34, 11, 10, 10, 10, 40, 13, 11, 11, 11, 31, 21, 12, 12, 11, 19, 21, 4, 4, 4, 13, 28, 10, 9, 9, 16, 36, 10, 10, 10, 12, 32, 7, 7, 6, 35]
RESULTS_WRITE_DOWNTIME_SINCE_2009JUL20_BIN15 = [0, 0, 0, 0, 4, 12, 4, 4, 4, 4, 22, 6, 6, 6, 7, 27, 7, 7, 7, 7, 21, 6, 6, 6, 6, 32, 9, 9, 9, 9, 26, 8, 8, 8, 8, 27, 7, 7, 7, 7, 30, 7, 7, 7, 7, 27, 10, 8, 8, 8, 28, 10, 9, 9, 9, 28, 4, 4, 4, 4, 21, 4, 4, 4, 4, 25, 6, 4, 4, 4, 18, 14, 9, 10, 9, 16, 17, 2, 2, 2, 8, 18, 7, 6, 6, 9, 19, 5, 5, 5, 6, 18, 5, 5, 4, 21]

# Results distilled from the raw 15-minute bins above (1-hour, 4-hour, and 8-hour bins)
RESULTS_SINCE_2010JAN01_BIN60 = [sum(RESULTS_SINCE_2010JAN01_BIN15[i*4:(i+1)*4]) for i in xrange(24)]
RESULTS_SINCE_2010JAN01_BIN240 = [sum(RESULTS_SINCE_2010JAN01_BIN60[i*4:(i+1)*4]) for i in xrange(6)]
RESULTS_SINCE_2010JAN01_BIN480 = [sum(RESULTS_SINCE_2010JAN01_BIN240[i*2:(i+1)*2]) for i in xrange(3)]
RESULTS_SINCE_2009JUL20_BIN60 = [sum(RESULTS_SINCE_2009JUL20_BIN15[i*4:(i+1)*4]) for i in xrange(24)]
RESULTS_SINCE_2009JUL20_BIN240 = [sum(RESULTS_SINCE_2009JUL20_BIN60[i*4:(i+1)*4]) for i in xrange(6)]
RESULTS_SINCE_2009JUL20_BIN480 = [sum(RESULTS_SINCE_2009JUL20_BIN240[i*2:(i+1)*2]) for i in xrange(3)]
RESULTS_WRITE_DOWNTIME_SINCE_2009JUL20_BIN60 = [sum(RESULTS_WRITE_DOWNTIME_SINCE_2009JUL20_BIN15[i*4:(i+1)*4]) for i in xrange(24)]
RESULTS_WRITE_DOWNTIME_SINCE_2009JUL20_BIN240 = [sum(RESULTS_WRITE_DOWNTIME_SINCE_2009JUL20_BIN60[i*4:(i+1)*4]) for i in xrange(6)]
RESULTS_WRITE_DOWNTIME_SINCE_2009JUL20_BIN480 = [sum(RESULTS_WRITE_DOWNTIME_SINCE_2009JUL20_BIN240[i*2:(i+1)*2]) for i in xrange(3)]