
I have a Spring Boot app that uses Spring Batch to load a CSV file into an in-memory H2 database. If I start the batch when the application starts, everything runs smoothly. However, I need to upload the CSV file first and then start the batch process. When I do that, the application shuts down after job completion, and I lose the data because I am using an in-memory database.

I tried setting the TaskExecutor to be asynchronous, but the problem is still there.
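For reference, this is roughly how I configured the asynchronous launcher. This is a minimal sketch; the bean name `asyncJobLauncher` is my own, and the exact wiring in my project may differ slightly:

```java
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class BatchLauncherConfig {

    // Run jobs on a separate thread so the HTTP request returns
    // immediately instead of blocking until job completion.
    @Bean
    public JobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        launcher.afterPropertiesSet();
        return launcher;
    }
}
```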

jobLauncher.run(importUserJob, new JobParametersBuilder()
        .addString("fileName", multipartFile.getOriginalFilename())
        .toJobParameters());
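For context, that call lives in an upload endpoint roughly like the following. The class, mapping path, and method names here are placeholders of my own; only `jobLauncher`, `importUserJob`, and the `run(...)` call match my actual code:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class CsvUploadController {

    private final JobLauncher jobLauncher;
    private final Job importUserJob;

    public CsvUploadController(JobLauncher jobLauncher, Job importUserJob) {
        this.jobLauncher = jobLauncher;
        this.importUserJob = importUserJob;
    }

    @PostMapping("/upload")
    public String upload(@RequestParam("file") MultipartFile multipartFile) throws Exception {
        // The uploaded file name is passed to the job as a parameter;
        // the step's reader resolves the actual file from it.
        jobLauncher.run(importUserJob, new JobParametersBuilder()
                .addString("fileName", multipartFile.getOriginalFilename())
                .toJobParameters());
        return "started";
    }
}
```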



2019-06-18 23:07:59.008  INFO 89571 --- [nio-8080-exec-2] o.s.b.c.l.support.SimpleJobLauncher      : Job: [SimpleJob: [name=attendanceReadJob]] completed with the following parameters: [{fileName=medAttend.csv}] and the following status: [COMPLETED]
2019-06-18 23:08:00.416  INFO 89571 --- [       Thread-4] j.LocalContainerEntityManagerFactoryBean : Closing JPA EntityManagerFactory for persistence unit 'default'
2019-06-18 23:08:00.421  INFO 89571 --- [       Thread-4] o.s.b.f.support.DisposableBeanAdapter    : Invocation of destroy method failed on bean with name 'inMemoryDatabaseShutdownExecutor': org.h2.jdbc.JdbcSQLNonTransientConnectionException: Database is already closed (to disable automatic closing at VM shutdown, add ";DB_CLOSE_ON_EXIT=FALSE" to the db URL) [90121-199]
2019-06-18 23:08:00.421  INFO 89571 --- [       Thread-4] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown initiated...
2019-06-18 23:08:00.424  INFO 89571 --- [       Thread-4] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown completed.

When I use the code above, the log shows the output above. How can I prevent the application from shutting down after job completion? I also tried putting the code inside a different thread, but the same issue persists.
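Following the hint in the exception message, I also looked at the H2 URL options. A datasource configuration along these lines is what I understand the log to be suggesting (the database name `appdb` is a placeholder; `DB_CLOSE_DELAY=-1` and `DB_CLOSE_ON_EXIT=FALSE` are standard H2 URL options):

```properties
# Keep the in-memory H2 database open after the last connection closes
# (DB_CLOSE_DELAY=-1) and disable the automatic close on VM shutdown
# (DB_CLOSE_ON_EXIT=FALSE), as the exception message suggests.
spring.datasource.url=jdbc:h2:mem:appdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
spring.datasource.driver-class-name=org.h2.Driver
```

This only affects when H2 discards the database, though; it does not explain why the whole application context is being shut down after the job completes.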
