RAM overflow and long load times on a large-data SQL query
I have an existing database from which I need to extract a single record that contains a total of 10 GB of data. I have tried to load the data with
conn = sqlite(databaseFile, 'readonly');
GetResult = [
'SELECT result1, result2 ...... FROM Result '...
'WHERE ResultID IN ......'
];
Data = fetch(conn, GetResult);
With this query, working memory usage grows until all 16 GB are full, and then the software crashes.
I also tried to limit the result with
'LIMIT 10000'
at the end of the query and page through the results by offset. This works, but it takes about 3 hours (extrapolated from 20 individual batches) to get all the results. (The database cannot be changed.)
Maybe one of you has an idea how to get the data faster, or in a single query.
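For reference, the LIMIT/OFFSET paging described above might look roughly like this (a sketch, not the poster's actual code; the batch size is illustrative and the column list and `ResultID` filter are abbreviated as in the question):

```matlab
% Sketch of paging through the result set with LIMIT/OFFSET.
% databaseFile and the WHERE clause placeholder are assumed from the question.
conn = sqlite(databaseFile, 'readonly');
batchSize = 10000;   % illustrative batch size
offset = 0;
Data = [];
while true
    GetResult = sprintf(['SELECT result1, result2 FROM Result ' ...
        'WHERE ResultID IN (...) ' ...
        'LIMIT %d OFFSET %d'], batchSize, offset);
    chunk = fetch(conn, GetResult);
    if isempty(chunk)
        break               % no more rows
    end
    Data = [Data; chunk];   %#ok<AGROW> grows on every iteration
    offset = offset + batchSize;
end
close(conn);
```

Two likely sources of the 3-hour runtime are visible in this pattern: SQLite must re-scan and discard `offset` rows on every iteration (OFFSET gets slower the deeper you page), and `Data` is grown by concatenation inside the loop, which repeatedly reallocates. Keyset pagination (e.g. `WHERE rowid > lastSeen ... LIMIT n`) and preallocating the output are common ways to reduce both costs.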