Google App Engine Bulkloader "Unexpected thread death"
I am trying to upload a moderately sized CSV file to Google App Engine using the bulkloader functionality, and it appears to die partway through with the following result:
[INFO ] Logging to bulkloader-log-20110328.181531
[INFO ] Throttling transfers:
[INFO ] Bandwidth: 250000 bytes/second
[INFO ] HTTP connections: 8/second
[INFO ] Entities inserted/fetched/modified: 20/second
[INFO ] Batch Size: 10
[INFO ] Opening database: bulkloader-progress-20110328.181531.sql3
[INFO ] Connecting to notmyrealappname.appspot.com/_ah/remote_api
[INFO ] Starting import; maximum 10 entities per post
...............................................................[INFO ] Unexpected thread death: WorkerThread-7
[INFO ] An error occurred. Shutting down...
.........[ERROR ] Error in WorkerThread-7: <urlopen error [Errno -2] Name or service not known>
[INFO ] 1740 entites total, 0 previously transferred
[INFO ] 720 entities (472133 bytes) transferred in 32.3 seconds
[INFO ] Some entities not successfully transferred
It uploads about 700 of the 19k entries I am trying to upload, and I am wondering why it fails. I checked the CSV file for errors such as extra commas that could throw off the Python csv reader, and non-ASCII characters have already been stripped out.
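A minimal sketch of that kind of consistency check, using Python's csv module (the file name is a placeholder):

import csv

# Every row should parse to the same number of fields; a stray comma
# shows up here as a row with an unexpected field count.
with open('data.csv') as f:  # placeholder file name
    reader = csv.reader(f)
    expected = None
    for lineno, row in enumerate(reader, 1):
        if expected is None:
            expected = len(row)
        elif len(row) != expected:
            print('Row %d has %d fields, expected %d' % (lineno, len(row), expected))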
1 Answer
Lifting the batch limit (batch_size) and the rps limit (rps_limit) works; I used 1000 for the batch size and an rps limit of 500:
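A sketch of what that invocation might look like with the App Engine Python SDK's appcfg.py upload_data command (the config file, CSV file, kind, and app directory are placeholders; --batch_size and --rps_limit are the flags behind the two limits):

appcfg.py upload_data \
    --config_file=bulkloader.yaml \
    --filename=data.csv \
    --kind=MyKind \
    --url=http://notmyrealappname.appspot.com/_ah/remote_api \
    --batch_size=1000 \
    --rps_limit=500 \
    myapp/

The larger batch size means far fewer HTTP posts for the same 19k rows, and raising the rps limit stops the client from throttling itself to the default 20 entities per second shown in the log above.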