I have three different datasources to work with in this batch job. Everything runs fine. The job repository is map-based and uses a ResourcelessTransactionManager. I configured it like this:
@Configuration
@EnableBatchProcessing
public class BatchConfigurer extends DefaultBatchConfigurer {
    @Override
    public void setDataSource(DataSource dataSource) {
        // intentionally empty: no DataSource is set, so the map-based job repository is used
    }
}
I also use a different PlatformTransactionManager than Spring Batch does (that was the issue), so I set spring.main.allow-bean-definition-overriding=true in my properties.
Now my problem: I want to update the status of a record that is essential for starting my batch, based on the job's exit status, so I use a JobExecutionListener to achieve that. But while everything works great locally, it throws an error on our remote server (a Kubernetes environment), which makes it more interesting.
The part where I try to update my record is:
@Slf4j
@Component
@Lazy
public class MyListener implements JobExecutionListener {

    @Autowired
    private MyRepo myRepo;

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // before job
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        myRepo.saveAndFlush(myUpdatedEntity); // I cannot share all code because of my company's policy but there is no issue in here
    }
}
The error I get is:
javax.persistence.TransactionRequiredException: no transaction is in progress
As far as I know, Spring Batch does not handle this transaction; I already have a transaction manager for it. Like I said, it works locally, so there shouldn't be any configuration issue. I tried adding @Transactional(transactionManager = "myTransactionManager") to make sure, but it didn't work. What do you think?
edit: I defined my transaction manager as
@Configuration
@EnableJpaRepositories(
    basePackages = "repo's package",
    entityManagerFactoryRef = "entityManagerFactory",
    transactionManagerRef = "transactionManager"
)
public class DatasourceConfiguration {

    @Bean(name = "transactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactory) { // I defined these (datasource etc.)
        return new JpaTransactionManager(entityManagerFactory);
    }
}
edit 2: Setting hibernate.allow_update_outside_transaction to true resolved the issue, but I have some concerns about it. Could it affect the rollback of a chunk when an error occurs? I suppose not, because the chunk has its own transaction, but I need to be sure. And I couldn't fully understand why this happens.
CodePudding user response:
Since you are using JPA, you need to configure the job repository as well as the step to use a JpaTransactionManager.
For the job repository, you need to override BatchConfigurer#getTransactionManager as mentioned in the documentation here: https://docs.spring.io/spring-batch/docs/4.3.7/reference/html/job.html#javaConfig.
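That override could look something like this (a sketch, assuming the JpaTransactionManager bean named transactionManager from the question's DatasourceConfiguration is injected; the constructor injection is illustrative):

```java
@Configuration
@EnableBatchProcessing
public class BatchConfigurer extends DefaultBatchConfigurer {

    private final PlatformTransactionManager jpaTransactionManager;

    public BatchConfigurer(@Qualifier("transactionManager") PlatformTransactionManager jpaTransactionManager) {
        this.jpaTransactionManager = jpaTransactionManager;
    }

    @Override
    public void setDataSource(DataSource dataSource) {
        // still no DataSource: keep the map-based job repository
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        // make the batch infrastructure use the JPA transaction manager
        return jpaTransactionManager;
    }
}
```

With this in place, the job repository uses the same JpaTransactionManager as your JPA repositories.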
For the step, you can set the transaction manager using the builder:
@Bean
public Step step(JpaTransactionManager transactionManager) {
    return this.stepBuilderFactory.get("step")
            // configure step type and other properties
            .transactionManager(transactionManager)
            .build();
}
EDIT: Add transaction details about JobExecutionListener
JobExecutionListener#afterJob is executed outside the transaction driven by Spring Batch for the step. So if you want to execute transactional code inside that method, you need to manage the transaction yourself. You can do that either declaratively by adding @Transactional(transactionManager = "jpaTransactionManager", propagation = Propagation.REQUIRES_NEW) on your repository method, or programmatically with a TransactionTemplate, something like:
@Override
public void afterJob(JobExecution jobExecution) {
    // transactionManager is the JpaTransactionManager; transactionAttribute is a
    // TransactionDefinition (e.g. a DefaultTransactionAttribute) held by the listener
    new TransactionTemplate(transactionManager, transactionAttribute).execute(new TransactionCallbackWithoutResult() {
        @Override
        protected void doInTransactionWithoutResult(TransactionStatus status) {
            myRepo.saveAndFlush(myUpdatedEntity);
        }
    });
}
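The declarative option could be sketched like this (a hypothetical StatusUpdateService wrapping the repository call so the @Transactional proxy is applied when the listener calls it; the bean and method names are illustrative, not from the original post):

```java
@Service
public class StatusUpdateService {

    @Autowired
    private MyRepo myRepo;

    // REQUIRES_NEW starts a fresh JPA transaction, since afterJob itself
    // runs outside any transaction driven by Spring Batch
    @Transactional(transactionManager = "transactionManager", propagation = Propagation.REQUIRES_NEW)
    public void updateStatus(MyEntity myUpdatedEntity) {
        myRepo.saveAndFlush(myUpdatedEntity);
    }
}
```

The listener then injects this service and calls updateStatus(...) from afterJob instead of touching the repository directly.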