I tried to implement an example of Spring Boot batch processing from a database to a CSV file. The issue I cannot solve is that the values in the CSV file are not sorted by id, and the column titles are not shown.
Here is the output in the CSV file (the first value is the id):
3,9,2013-04-15,Japan,[email protected],Jard,MALE,Heino,87cdda81-45d0-451a-a62f-f8450eae1b64
2,23,1999-01-25,Panama,[email protected],Natala,FEMALE,Loynes,24be24e6-525f-42de-855d-52d4fef21608
6,9,2013-02-16,China,[email protected],Roseline,FEMALE,Cossans,06f70b0d-2c98-4f46-b933-528499ab91b3
4,8,2014-05-09,Indonesia,[email protected],Jilleen,FEMALE,Carlaw,c722e6d5-9024-49c5-80e0-c2555f1eb9cc
1,22,2000-08-15,China,[email protected],Ginnie,FEMALE,Spearing,fa26fa96-97d3-4e8e-856a-fdf07499e13e
5,22,2000-03-18,Indonesia,[email protected],Rainer,MALE,Gillino,5302a199-f313-4a24-9550-d643001d9faf
I want all values to be sorted by id.
How can I do that?
Here is the batch configuration class:
@Configuration // Informs Spring that this class contains configurations
@EnableBatchProcessing // Enables batch processing for the application
@RequiredArgsConstructor
public class BatchConfiguration {

    private final JobBuilderFactory jobBuilderFactory;
    private final StepBuilderFactory stepBuilderFactory;
    private final UserRepository userRepository;

    Date now = new Date(); // java.util.Date, NOT java.sql.Date or java.sql.Timestamp!
    String format1 = new SimpleDateFormat("yyyy-MM-dd'-'HH-mm-ss-SSS", Locale.forLanguageTag("tr-TR")).format(now);
    private Resource outputResource = new FileSystemResource("output/customers_" + format1 + ".csv");

    @Bean
    public RepositoryItemReader<User> reader() {
        RepositoryItemReader<User> repositoryItemReader = new RepositoryItemReader<>();
        repositoryItemReader.setRepository(userRepository);
        repositoryItemReader.setMethodName("findAll");
        final HashMap<String, Sort.Direction> sorts = new HashMap<>();
        sorts.put("id", Sort.Direction.ASC);
        repositoryItemReader.setSort(sorts);
        return repositoryItemReader;
    }

    @Bean
    public FlatFileItemWriter<User> writer() {
        FlatFileItemWriter<User> writer = new FlatFileItemWriter<>();
        writer.setResource(outputResource);
        writer.setAppendAllowed(true);
        writer.setLineAggregator(new DelimitedLineAggregator<User>() {
            {
                setDelimiter(",");
                setFieldExtractor(new BeanWrapperFieldExtractor<User>() {
                    {
                        setNames(new String[]{"id", "age", "birthday", "country", "email", "firstName", "gender", "lastName", "personId"});
                    }
                });
            }
        });
        return writer;
    }

    @Bean
    public UserProcessor processor() {
        return new UserProcessor();
    }

    @Bean
    public UserJobExecutionNotificationListener stepExecutionListener() {
        return new UserJobExecutionNotificationListener(userRepository);
    }

    @Bean
    public UserStepCompleteNotificationListener jobExecutionListener() {
        return new UserStepCompleteNotificationListener();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("csv-step").<User, User>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .listener(stepExecutionListener())
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public Job runJob() {
        return jobBuilderFactory.get("importuserjob")
                .listener(jobExecutionListener())
                .flow(step1()).end().build();
    }

    @Bean
    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor asyncTaskExecutor = new SimpleAsyncTaskExecutor();
        asyncTaskExecutor.setConcurrencyLimit(10);
        return asyncTaskExecutor;
    }
}
Here is the link to the example: Link
CodePudding user response:
I fixed the issue after adding a FlatFileHeaderCallback to the FlatFileItemWriter.
Here is the code snippet:
writer.setHeaderCallback(new FlatFileHeaderCallback() {
    @Override
    public void writeHeader(Writer writer) throws IOException {
        // "headers" holds the column names, matching the field names given to the field extractor
        for (int i = 0; i < headers.length; i++) {
            if (i != headers.length - 1)
                writer.append(headers[i] + ",");
            else
                writer.append(headers[i]);
        }
    }
});
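For reference, here is a minimal sketch of the complete writer() bean with the header callback wired in. The headers array is an assumption; it simply reuses the same column names that are passed to the BeanWrapperFieldExtractor, so the header line matches the data columns.
// Sketch only: assumes the "headers" array below and the outputResource field from the configuration above.
private static final String[] headers =
        {"id", "age", "birthday", "country", "email", "firstName", "gender", "lastName", "personId"};

@Bean
public FlatFileItemWriter<User> writer() {
    FlatFileItemWriter<User> writer = new FlatFileItemWriter<>();
    writer.setResource(outputResource);
    writer.setAppendAllowed(true);
    // FlatFileHeaderCallback has a single writeHeader(Writer) method, so a lambda can be used
    writer.setHeaderCallback(w -> w.write(String.join(",", headers)));
    DelimitedLineAggregator<User> aggregator = new DelimitedLineAggregator<>();
    aggregator.setDelimiter(",");
    BeanWrapperFieldExtractor<User> extractor = new BeanWrapperFieldExtractor<>();
    extractor.setNames(headers); // same names as the header, so columns and titles stay aligned
    aggregator.setFieldExtractor(extractor);
    writer.setLineAggregator(aggregator);
    return writer;
}
One more note on the ordering part of the question: the reader's setSort only controls the order in which rows are read. Because step1() uses a SimpleAsyncTaskExecutor, chunks are processed and written concurrently, so the order of rows in the file is not guaranteed. If strict id ordering is required, one option is to run the step single-threaded by removing the .taskExecutor(taskExecutor()) call.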