What is the best way to import CSV data into App Engine when the bulkloader takes too long or generates errors?
I have a 10 MB CSV file of geolocation data that I tried to upload to my App Engine datastore yesterday. I followed the instructions in this blog post and used the bulkloader/appcfg tool. The datastore indicated that the records were uploaded, but it took several hours and used up my entire CPU quota for the day. The process broke down with errors towards the end, before I had actually exceeded my quota. But needless to say, 10 MB of data shouldn't require this much time and effort.
So, is there some other way to get this CSV data into my App Engine datastore (for a Java app)?
I saw a post by Ikai Lan about using a mapper tool he created for this purpose, but it looks rather complicated.
Instead, what about uploading the CSV to Google Docs? Is there a way to transfer it to the App Engine datastore from there?
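For context, the bulkloader run from that blog post boils down to a command along these lines (the file, kind, and app names are placeholders; for a Java app the remote_api servlet also has to be mapped in web.xml so appcfg.py has an endpoint to talk to):

    appcfg.py upload_data \
      --config_file=bulkloader.yaml \
      --filename=geolocations.csv \
      --kind=Location \
      --url=http://myapp.appspot.com/remote_api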
2 Answers
I do daily uploads of 100,000 records (20 MB) through the bulkloader. Settings I played with:
- bulkloader.yaml config: set to auto-generate keys (a sketch of such a config follows below).
- Include a header row in the raw CSV file.
- Speed parameters set to max (not sure if reducing them would reduce the CPU consumed).
These settings burn through my 6.5 hours of free quota in about 4 minutes, but they do get the data loaded (the cost is probably from the indexes being generated).
(I autogenerate the upload command line with a script and use AutoHotkey to send my credentials.)
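For what it's worth, a minimal bulkloader.yaml along those lines might look like the sketch below. The kind and column names are made up; as far as I recall, "columns: from_header" is the CSV connector option that makes the header row in the raw file meaningful, and omitting a __key__ mapping is what lets the datastore auto-generate keys:

    transformers:
    - kind: Location                 # illustrative kind name
      connector: csv
      connector_options:
        encoding: utf-8
        columns: from_header         # take property names from the CSV header row
      property_map:
      - property: lat
        external_name: lat
        import_transform: float
      - property: lng
        external_name: lng
        import_transform: float
      # no __key__ entry, so keys are auto-generated

The "speed parameters" are the appcfg throttle flags (--num_threads, --batch_size, --rps_limit, --bandwidth_limit); I haven't verified whether lowering them reduces the CPU billed or just stretches the run out.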
I wrote this gdata connector to pull data out of a Google Docs spreadsheet and insert it into the datastore, but it uses the bulkloader, so it kind of takes you back to square one of your problem.
http://code.google.com/p/bulkloader-gdata-connector/source/browse/gdata_connector.py
What you could do, however, is take a look at the source to see how I pull data out of gdocs, and create a task (or tasks) that does that instead of going through the bulkloader.
You could also upload your document into the blobstore and similarly create a task that reads the CSV data out of the blobstore and creates the entities. (I think this would be easier and faster than working with gdata feeds.)
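Here is a minimal sketch of that blobstore approach on the Python runtime (the kind, properties, task URL, and CSV layout are all made up for illustration; a Java version against the low-level datastore and Blobstore APIs would have the same shape):

    import csv
    import itertools

    from google.appengine.api import taskqueue
    from google.appengine.ext import blobstore, db, webapp


    class Location(db.Model):
        # hypothetical kind; adjust the properties to your CSV columns
        name = db.StringProperty()
        lat = db.FloatProperty()
        lng = db.FloatProperty()


    class ImportCsvHandler(webapp.RequestHandler):
        """Task queue handler: loads one chunk of CSV rows from the blobstore."""

        def post(self):
            blob_key = self.request.get('blob_key')
            offset = int(self.request.get('offset', '0'))
            batch_size = 100

            # BlobReader is a file-like view of the blob, so the csv module
            # can stream rows without loading the whole file into memory.
            reader = csv.reader(blobstore.BlobReader(blob_key))
            reader.next()  # skip the header row
            chunk = list(itertools.islice(reader, offset, offset + batch_size))

            if chunk:
                # One batched put per chunk is much cheaper than a put per row.
                db.put([Location(name=row[0],
                                 lat=float(row[1]),
                                 lng=float(row[2])) for row in chunk])
                # Chain the next chunk so no single task hits the deadline.
                taskqueue.add(url='/tasks/import_csv',
                              params={'blob_key': blob_key,
                                      'offset': offset + batch_size})


    application = webapp.WSGIApplication([('/tasks/import_csv', ImportCsvHandler)])

Kick it off by enqueueing the first task from your blobstore upload handler, e.g. taskqueue.add(url='/tasks/import_csv', params={'blob_key': str(blob_key), 'offset': 0}).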