Twilio client.messages.stream() returns infinite results?
So I know my Twilio account has sent upwards of 50,000 texts. However, when I run the following:
i = 0
lst = []
for item in client.messages.stream():
    i += 1
    lst.append([item.body.replace('|', ''), item.from_, item.date_sent])
    if i % 100 == 0:
        print(i)
it just keeps running and running. I was originally using client.messages.list, but that hit my 1-minute, then 5-minute, then 10-minute timeout in Lambda, so I decided to debug locally and run the above. I stopped it after it had reached 230,000, which is many multiples more messages than we've actually sent.
I don't quite understand why it's doing that; the docs don't say anything about this. Nor can I find anything in the docs about telling the stream to move on, if what it's doing is just returning the same page over and over.
It doesn't appear to be returning the same page, though: when I print the message body for every hundredth message, it changes every so often.
client.messages.stream() will return all the messages you have sent as well as received. You can filter the list by the number a message was sent from, if you are only looking to retrieve your outbound messages.

If you want to limit the results you get, you can set a limit. If you want to speed up a long request like this, you can increase the pageSize to 1000 (the default is 50).
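Putting that together, a minimal sketch of the processing loop, factored into a testable function. The Twilio call itself is shown only as a comment, and the phone number there is a hypothetical placeholder; `from_`, `limit`, and `page_size` are assumed to be the keyword arguments accepted by the Python helper library's `messages.stream()`:

```python
from types import SimpleNamespace


def collect_rows(messages, limit=None):
    """Accumulate [body, from_, date_sent] rows from an iterable of
    Twilio message resources, stripping '|' from bodies (so the rows
    can safely go into a pipe-delimited file) and stopping once
    `limit` rows have been collected (None means no cap)."""
    rows = []
    for msg in messages:
        rows.append([msg.body.replace('|', ''), msg.from_, msg.date_sent])
        if limit is not None and len(rows) >= limit:
            break
    return rows


# With a real Twilio client this would be driven as (untested sketch):
#   client = Client(account_sid, auth_token)
#   rows = collect_rows(
#       client.messages.stream(from_="+15551234567",  # outbound only
#                              page_size=1000),       # fewer HTTP round trips
#       limit=50_000)

# Demo with stand-in message objects (no Twilio account needed):
fake = [SimpleNamespace(body="hi|there", from_="+15550001111",
                        date_sent="2023-01-01")]
print(collect_rows(fake))
```

Capping with `limit` (client-side here, or via the `limit` argument to `stream()`) is what keeps the iteration from walking the entire message log.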