Google App Engine bulkloader "Unexpected thread death"

I am trying to upload a moderately sized CSV file to Google App Engine using the bulkloader functionality, and it appears to die partway through with the following result:

[INFO    ] Logging to bulkloader-log-20110328.181531
[INFO    ] Throttling transfers:
[INFO    ] Bandwidth: 250000 bytes/second
[INFO    ] HTTP connections: 8/second
[INFO    ] Entities inserted/fetched/modified: 20/second
[INFO    ] Batch Size: 10
[INFO    ] Opening database: bulkloader-progress-20110328.181531.sql3
[INFO    ] Connecting to notmyrealappname.appspot.com/_ah/remote_api
[INFO    ] Starting import; maximum 10 entities per post
...............................................................[INFO    ] Unexpected thread death: WorkerThread-7
[INFO    ] An error occurred. Shutting down...
.........[ERROR   ] Error in WorkerThread-7: <urlopen error [Errno -2] Name or service not known>

[INFO    ] 1740 entites total, 0 previously transferred
[INFO    ] 720 entities (472133 bytes) transferred in 32.3 seconds
[INFO    ] Some entities not successfully transferred

It uploads about 700 of the 19k entries I am trying to upload, and I am wondering why it fails. I have checked the CSV file for errors, such as additional commas that could throw off the Python csv reader, and non-ASCII characters have already been stripped out.
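
For what it's worth, a quick sanity check along these lines can confirm that the file parses cleanly before pointing the finger at the bulkloader. This is a minimal sketch in Python 2 (matching the SDK era); the filename data.csv is a placeholder, and the assumption that every row should match the header's field count is mine, not from the original post:

import csv

# Minimal sketch: flag rows whose field count differs from the header's,
# and any fields that still contain non-ASCII bytes.
# 'data.csv' is a placeholder filename.
with open('data.csv', 'rb') as f:
    reader = csv.reader(f)
    header = reader.next()
    for row in reader:
        if len(row) != len(header):
            print 'line %d: expected %d fields, got %d' % (
                reader.line_num, len(header), len(row))
        for field in row:
            try:
                field.decode('ascii')
            except UnicodeDecodeError:
                print 'line %d: non-ASCII bytes in %r' % (reader.line_num, field)

If this reports nothing, the failure is more likely on the transport side (note the urlopen "Name or service not known" error in the log) than in the data itself.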

Comments (1)

迷鸟归林 2024-11-03 16:28:43

Lifting the batch limit (batch_size) and the RPS limit (rps_limit) works; I use 1000 for the batch size and an RPS limit of 500:

appcfg.py upload_data --url= --application= --filename=  --email= --batch_size=1000 --rps_limit=500
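
Presumably this helps because a batch size of 1000 means far fewer HTTP posts overall, so a transient name-resolution failure like the one in the log above has fewer chances to kill a worker thread.

One more detail worth noting (not from the answer itself, and assuming the flag set of the appcfg.py of that era): the bulkloader records its progress in the .sql3 database shown in the log, so a failed run can be resumed, skipping already-transferred entities, by pointing --db_filename at the existing progress file:

appcfg.py upload_data --url= --application= --filename= --email= --db_filename=bulkloader-progress-20110328.181531.sql3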