Speed up database writes
I have a ListActivity that uses a CursorAdapter to display some data. This data is the result of a web service call.
I am getting the response from the web service call, using the org.json.* library to parse it, and then writing the results to the app's SQLite3 database. The ListActivity's Cursor is then re-queried and the data shows in the list.
My problem is the database writing is excessively slow. The only thing I can think to do is to not use a CursorAdapter and just hold this data in memory. I was hoping someone had another suggestion to speed things up. Perhaps a bulk insert of some kind?
It should be noted that I'm using a ContentProvider to do my inserting. So I call getContentResolver().insert(...).
Here are some times from a test that retrieved 56 rows of data over the LAN and displayed them:
Time to response: 178ms
Time to parse json: 16ms
Time to write 56 rows to the database: 5714ms
I'd ultimately like the time for database writing to be under 1000ms for this amount of data.
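For reference, the "bulk insert of some kind" the question speculates about does exist in this API: a provider can override bulkInsert() and wrap all the rows in a single transaction, so the client makes one call instead of 56. The sketch below is an assumption about how such a provider might look, not the asker's actual code; the class name, the "items" table, and the db field are placeholders.

```java
import android.content.ContentProvider;
import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import android.net.Uri;

// Hypothetical provider; only bulkInsert() is shown. The db field would be
// initialized from a SQLiteOpenHelper in onCreate(), and the other required
// ContentProvider methods (query, insert, update, delete, getType) are omitted.
public class ItemsProvider extends ContentProvider {
    private SQLiteDatabase db;

    @Override
    public int bulkInsert(Uri uri, ContentValues[] values) {
        db.beginTransaction();
        try {
            for (ContentValues row : values) {
                db.insert("items", null, row); // "items" is a placeholder table name
            }
            db.setTransactionSuccessful(); // mark for commit; skipped on exception
            return values.length;
        } finally {
            db.endTransaction(); // commits if successful, otherwise rolls back
        }
    }
    // onCreate(), query(), insert(), update(), delete(), getType() omitted
}
```

The caller would then replace the per-row loop of getContentResolver().insert(...) with a single getContentResolver().bulkInsert(CONTENT_URI, valuesArray).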
2 Answers
I had the same problem once and found that calling beginTransaction() on your database first, then executing your insert queries, fixed it. Example code:
This reduced inserting 100 rows (the size of a batch in this situation) from about 30 s to around 300 ms.
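The answer's original snippet did not survive the page scrape; a minimal sketch of the pattern it describes follows. The helper variable, the rows list, and the "items" table name are placeholders, not from the original post.

```java
// helper: your SQLiteOpenHelper; rows: ContentValues parsed from the JSON response
SQLiteDatabase db = helper.getWritableDatabase();
db.beginTransaction();
try {
    for (ContentValues row : rows) {
        db.insert("items", null, row); // each insert joins the open transaction
    }
    db.setTransactionSuccessful(); // without this call, endTransaction() rolls back
} finally {
    db.endTransaction(); // single commit here instead of one fsync per row
}
```

The speedup comes from committing once: outside a transaction, SQLite implicitly wraps every insert in its own transaction and syncs to disk each time.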
Do you make a round trip for every row you write to the database, or can you batch all of them into a single round trip? I'd bet that network latency is the culprit if you're not batching.
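Since the question uses a ContentProvider, one way to collapse those round trips (a sketch under assumptions: CONTENT_URI, AUTHORITY, and the rows list are placeholders from the question's setup) is ContentProviderOperation with applyBatch():

```java
import android.content.ContentProviderOperation;
import android.content.ContentValues;
import java.util.ArrayList;

// Build one operation per parsed row, then send them all in a single
// provider call rather than 56 separate insert() round trips.
ArrayList<ContentProviderOperation> ops = new ArrayList<>();
for (ContentValues row : rows) {
    ops.add(ContentProviderOperation.newInsert(CONTENT_URI)
            .withValues(row)
            .build());
}
try {
    getContentResolver().applyBatch(AUTHORITY, ops);
} catch (android.os.RemoteException | android.content.OperationApplicationException e) {
    // handle the batch failure, e.g. log and retry
}
```

By default applyBatch() still executes the operations one by one inside the provider, so for the full win the provider should wrap the batch in a transaction (e.g. by overriding applyBatch() or bulkInsert()).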