How do I split a result set in Java into chunks of 500 rows each without looping?
I have a query which returns millions of records held in a result set. I have to process (insert/update) these records. Instead of inserting all the records at once, I would like to split the result set into chunks of 500 records each, store each chunk in an ArrayList or Vector, and process those 500 records at a time.

How do I split the result set into chunks and store them in an ArrayList or Vector without looping through the millions of records?
I found the answer: use a CachedRowSet instead of a plain ResultSet, and call setPageSize:
CachedRowSet crs = new CachedRowSetImpl();
crs.setPageSize(500);
crs.execute(conHandle);
and then use
while (crs.nextPage()) {
    Collection<?> obj = crs.toCollection();
}
This would ensure that we can process the large data set in smaller chunks. But I have a doubt here: how does the crs get populated by just passing a connection object? Where do I specify the query string?
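For reference, the standard javax.sql.rowset API supplies the query string via setCommand before execute(Connection) runs it. A minimal sketch of the paging pattern, assuming a placeholder table my_table with columns id and name (not from the original post):

```java
import javax.sql.rowset.CachedRowSet;
import javax.sql.rowset.RowSetProvider;
import java.sql.Connection;
import java.sql.SQLException;

public class ChunkedRowSet {

    // Configure a CachedRowSet for paged fetching. The table and column
    // names are placeholders for illustration only.
    static CachedRowSet buildRowSet() throws SQLException {
        CachedRowSet crs = RowSetProvider.newFactory().createCachedRowSet();
        crs.setCommand("SELECT id, name FROM my_table"); // the query string goes here
        crs.setPageSize(500);                            // fetch 500 rows per page
        return crs;
    }

    static void process(Connection conn) throws SQLException {
        CachedRowSet crs = buildRowSet();
        crs.execute(conn); // runs the command set above; loads the first page
        do {
            while (crs.next()) {
                // handle one row of the current 500-row page here
            }
        } while (crs.nextPage()); // advance to the next 500 rows
    }
}
```

Note that execute() already loads the first page, so the rows are consumed before the first nextPage() call.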
Comments (2)
Depends on your SQL dialect. For example, in PostgreSQL SELECT has OFFSET and LIMIT clauses. You will still need some kind of loop to generate the queries that fetch all millions of your records.
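One way to generate those per-page queries is sketched below; the base query is illustrative, not from the answer. Note that LIMIT/OFFSET paging only gives stable pages when the query has a deterministic ORDER BY:

```java
import java.util.ArrayList;
import java.util.List;

public class PagedQueries {

    // Build one SELECT per page using PostgreSQL's LIMIT/OFFSET clauses.
    // baseQuery should include an ORDER BY so pages don't overlap or skip rows.
    static List<String> pagedQueries(String baseQuery, long totalRows, int pageSize) {
        List<String> queries = new ArrayList<>();
        for (long offset = 0; offset < totalRows; offset += pageSize) {
            queries.add(baseQuery + " LIMIT " + pageSize + " OFFSET " + offset);
        }
        return queries;
    }
}
```

Each generated query is then run in the loop, and its (at most) 500 rows processed before moving to the next offset.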
You have to fetch your data in a single query, because you will not get the same result across multiple queries in a multiuser environment.

If client memory is the issue, bcp out the query result to a file first and split the file with something like the unix split command. You can then parse the split data file by file, or bcp it into a working table, and load the data into your ArrayList.
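The splitting step can also be done in Java once the exported rows have been read from the file; a small sketch of chunking exported lines into 500-row batches (method and variable names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class FileChunks {

    // Split exported rows (e.g. lines of a bcp/COPY output file) into
    // fixed-size chunks for batch processing; the last chunk may be smaller.
    static List<List<String>> chunk(List<String> rows, int size) {
        List<List<String>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += size) {
            int end = Math.min(i + size, rows.size());
            chunks.add(new ArrayList<>(rows.subList(i, end)));
        }
        return chunks;
    }
}
```

Each chunk can then be loaded into an ArrayList and processed as one batch of 500.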