Is it possible to increase the response timeout in Google App Engine?

Posted 2024-09-03 10:17:47

On my local machine the script runs fine, but in the cloud it returns a 500 every time. This is a cron task, so I don't really mind if it takes 5 minutes...

<class 'google.appengine.runtime.DeadlineExceededError'>:

Any idea whether it's possible to increase the timeout?

Thanks,
rui

Comments (3)

铜锣湾横着走 2024-09-10 10:17:47

You cannot go beyond 30 seconds, but you can indirectly increase the timeout by employing task queues and writing tasks that gradually iterate through your data set and process it. Each such task run must, of course, itself fit within the timeout limit.

EDIT

To be more specific, you can use datastore query cursors to resume processing where the previous run left off:

http://code.google.com/intl/pl/appengine/docs/python/datastore/queriesandindexes.html#Query_Cursors

Query cursors were first introduced in SDK 1.3.1:

http://googleappengine.blogspot.com/2010/02/app-engine-sdk-131-including-major.html
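
As an illustration, here is a minimal sketch of that pattern; the model Foo, its processed flag, and the batch size are hypothetical stand-ins, and it uses the deferred library to chain task queue runs together through a query cursor:

from google.appengine.ext import db
from google.appengine.ext import deferred

BATCH_SIZE = 100  # keep each run comfortably within the request deadline

class Foo(db.Model):
  processed = db.BooleanProperty(default=False)

def process_chunk(cursor=None):
  query = Foo.all()
  if cursor:
    # Resume exactly where the previous task run stopped.
    query.with_cursor(cursor)
  entities = query.fetch(BATCH_SIZE)
  for entity in entities:
    entity.processed = True  # stand-in for the real per-entity work
    entity.put()
  if len(entities) == BATCH_SIZE:
    # There may be more work: enqueue a follow-up task from the new cursor.
    deferred.defer(process_chunk, query.cursor())

The cron handler then only needs to call deferred.defer(process_chunk) once; each run does a bounded amount of work and schedules its successor, so no single request has to survive the full five minutes.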

还如梦归 2024-09-10 10:17:47

The exact rules for DB query timeouts are complicated, but it seems that a query cannot live for more than about 2 minutes, and a batch cannot live for more than about 30 seconds. Here is some code that breaks a job into multiple queries, using cursors to avoid those timeouts.

from google.appengine.ext import db  # Foo is assumed to be a db.Model subclass

def make_query(start_cursor):
  # Foo.all() returns a db.Query over all Foo entities; calling Foo() would
  # create an entity instance rather than a query.
  query = Foo.all()

  if start_cursor:
    # Resume the query exactly where the previous batch left off.
    query.with_cursor(start_cursor)

  return query

batch_size = 1000
start_cursor = None

while True:
  query = make_query(start_cursor)
  results_fetched = 0

  for resource in query.run(limit=batch_size):
    results_fetched += 1

    # Do something with resource here.

    if results_fetched == batch_size:
      # The batch filled up, so there may be more results: save the cursor
      # and start a fresh query from it on the next pass.
      start_cursor = query.cursor()
      break
  else:
    # This else belongs to the for loop: it runs only when the loop was not
    # broken out of, i.e. the query returned fewer than batch_size results,
    # which means we are done.
    break

单挑你×的.吻 2024-09-10 10:17:47

Below is the code I use to solve this problem, breaking up a single large query into multiple small ones. It uses the google.appengine.ext.ndb library; fetch_page() is an ndb query method, so ndb is required for the code below to work.

(If you are not using ndb, consider switching to it. It is an improved version of the db library and migrating to it is easy. For more information, see https://developers.google.com/appengine/docs/python/ndb.)

from google.appengine.datastore.datastore_query import Cursor

def ProcessAll():
  curs = Cursor()
  while True:
    records, curs, more = MyEntity.query().fetch_page(5000, start_cursor=curs)
    for record in records:
      # Run your custom business logic on record.
      RunMyBusinessLogic(record)
    if more and curs:
      # There are more records; do nothing here so we enter the 
      # loop again above and run the query one more time.
      pass
    else:
      # No more records to fetch; break out of the loop and finish.
      break
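
One caveat: ProcessAll loops until the entire dataset has been processed, so if it is called directly from a frontend request it can still hit the same DeadlineExceededError. Here is a minimal sketch (the handler name and route are hypothetical) of kicking it off via the deferred library instead, so the work runs under the task queue's longer deadline:

import webapp2
from google.appengine.ext import deferred

class StartJobHandler(webapp2.RequestHandler):
  def get(self):
    # Enqueue ProcessAll as a task queue task so it runs with the task
    # queue's deadline instead of the interactive request's deadline.
    deferred.defer(ProcessAll)
    self.response.write('Job enqueued.')

app = webapp2.WSGIApplication([('/start_job', StartJobHandler)])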