Spring Batch only works with chunk size set to 1, otherwise it causes an OptimisticLockingFailureException?
I am implementing Spring Batch using the chunk model: an ItemReader reads from a list, an ItemProcessor performs the business logic, and an ItemWriter finally writes to the database.
Batch processing works fine when the chunk size is configured as 1, but runs into an OptimisticLockingFailureException when the chunk size is increased:
org.springframework.dao.OptimisticLockingFailureException: Attempt to update step execution id=XXXXXX with wrong version (X), where current version is Y.
Here is my Spring Batch configuration.
@Bean(BatchJob.BATCH_JOB)
public Job importJob(@Qualifier(BatchJob.READER) ItemReader<Details> reader,
                     @Qualifier(BatchJob.WRITER) ItemWriter<Details> writer,
                     @Qualifier(BatchJob.PROCESSOR) ItemProcessor<Details, Details> processor,
                     @Qualifier(BatchJob.TASK_EXECUTOR) TaskExecutor taskExecutor) {
    final Step writeToDatabase = stepBuilderFactory.get(BatchJob.BATCH_STEP)
            .<Details, Details>chunk(chunkSize)
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .transactionManager(transactionManager)
            .taskExecutor(taskExecutor)
            .throttleLimit(throttleLimit)
            .build();
    return jobBuilderFactory.get(BatchJob.JOB_BUILDER_FACTORY)
            .incrementer(new RunIdIncrementer())
            .start(writeToDatabase)
            .build();
}
I suspect the ItemProcessor, because an empty no-op ItemProcessor works fine even with a higher chunk size. @Transactional(propagation = Propagation.REQUIRES_NEW) is used in our business logic.
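To make the setup concrete, here is a minimal sketch of how the processor and business logic interact; the class and method names (DetailsService, applyBusinessLogic, DetailsProcessor) are illustrative placeholders, not our actual code:

```java
import org.springframework.batch.item.ItemProcessor;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

// Illustrative sketch: the processor delegates to a service whose method
// suspends the chunk transaction and opens its own (REQUIRES_NEW).
@Service
class DetailsService {

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public Details applyBusinessLogic(Details details) {
        // ... business logic that reads/writes the database ...
        return details;
    }
}

class DetailsProcessor implements ItemProcessor<Details, Details> {

    private final DetailsService service;

    DetailsProcessor(DetailsService service) {
        this.service = service;
    }

    @Override
    public Details process(Details details) {
        // Each item is processed inside a NEW transaction, nested under the
        // chunk transaction that Spring Batch already opened for this chunk.
        return service.applyBusinessLogic(details);
    }
}
```

So for every chunk, Spring Batch opens one transaction, and each item's processing suspends it and commits separately.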
Can this be caused by a conflict between Spring Batch's transaction management and the transaction management used in our business logic, or is there some other possible reason?
Is there any advantage to using Spring Batch with chunk size set to 1 over performing the same business logic in a normal loop, one item at a time?
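For reference, the "normal loop" alternative I have in mind is essentially the following; this is a hypothetical sketch (PlainLoop and processOneByOne are made-up names), not production code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical sketch of the plain-loop alternative: process and "write"
// each item one at a time, the way a chunk size of 1 behaves.
public class PlainLoop {

    public static <T> List<T> processOneByOne(List<T> items, UnaryOperator<T> businessLogic) {
        List<T> written = new ArrayList<>();
        for (T item : items) {
            T processed = businessLogic.apply(item); // stands in for the ItemProcessor
            written.add(processed);                  // stands in for the ItemWriter
        }
        return written;
    }
}
```

This loop drops the chunk transaction boundary and the job-repository metadata (restartability, step execution tracking) that Spring Batch maintains, which is part of what I am weighing.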