SQLite vs CSV files on iPhone
We have about 10 SQLite files downloaded by our app, each containing about 4000 rows. We process that data and display it in a table view. We are running into speed and memory issues when scrolling through the table view.
We were wondering whether we could get better performance from CSV files or some other format instead of SQLite. I have read that XML or JSON won't help, since the number of records is huge and parsing time would go up.
Please advise.
3 Answers
First, don't assume that SQLite is your bottleneck. I made that same assumption in my own application and spent days trying to optimize database access, only to run Instruments against it and find that a slow string-processing routine in my interface was bogging things down.
Use Time Profiler and Object Allocations first to verify where the hotspots in your code actually are. SQLite is ridiculously fast.
That said, with 4000 rows you will probably run into memory issues at the very least if you try to load all of them into an array for display. My recommendation would be to import that data into a Core Data SQLite database and use an NSFetchedResultsController with its fetch request's batch size set slightly larger than the number of rows displayed onscreen.
Core Data will handle the loading and unloading of batched data this way, meaning that only a small part of the database is in memory at once. This can lead to a tremendous speedup (particularly on the initial load) and will significantly reduce memory usage, all with a trivial amount of code.
A properly indexed SQLite database will run circles around any flat file, especially if you have a lot of records. Also try consolidating those 10 files into 1 database, so you can perform joins on indexed columns and use clever tricks such as views. Right now it seems like you're pulling data from 10 different databases and manually comparing/processing them, which would of course take a lot of time and memory.
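As a rough illustration of the indexing and consolidation point, here is a hedged Python sketch using sqlite3 (the two-table schema is hypothetical, standing in for two of the downloaded files): once both datasets live in one database, an indexed join replaces the manual cross-file matching.

```python
import sqlite3

# Hypothetical schema: imagine two of the downloaded files held
# products and prices. Consolidated into one database, they can be
# joined on an indexed column instead of matched by hand in app code.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE prices   (product_id INTEGER, amount REAL);
    -- Index the join column so lookups don't scan the whole table.
    CREATE INDEX idx_prices_product ON prices(product_id);
""")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [(i, "product %d" % i) for i in range(4000)])
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(4000)])

# One indexed join instead of loading both datasets into memory and
# comparing rows manually.
row = conn.execute("""
    SELECT p.name, pr.amount
    FROM products p JOIN prices pr ON pr.product_id = p.id
    WHERE p.id = ?
""", (42,)).fetchone()
print(row)  # ('product 42', 21.0)
```

SQLite's query planner will use `idx_prices_product` for the join, which is the "run circles around any flat file" effect: the flat-file equivalent is a linear scan per lookup.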
It is going to depend on the application and on how you are using and querying the data. Profile it, confirm whether SQLite is or isn't the problem, then attack whatever the profiling turns up.
Profilers: Shark, or some other profiling solution.