Question
I tried spring.batch.job.enabled=false in application.properties and -Dspring.batch.job.enabled=false when running the jar file.
However, @EnableBatchProcessing still starts the batch jobs automatically on application start. How can I debug such a scenario?
TestConfiguration.class
@Configuration
@EnableBatchProcessing
public class TestConfiguration {...}
MainApplication
@ComponentScan("com.demo")
@EnableAutoConfiguration
public class MainApplication {
    public static void main(String[] args) throws BeansException, JobExecutionAlreadyRunningException, JobInstanceAlreadyCompleteException, JobParametersInvalidException, InterruptedException, JobRestartException {
        ConfigurableApplicationContext ctx = SpringApplication.run(TestConfiguration.class, args);
        ...
    }
}
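One way to rule out property-loading order while debugging is to register the property programmatically before the context starts, instead of relying on application.properties. A minimal sketch, assuming the same TestConfiguration class; note that defaults set this way are still overridden by -D system properties and command-line arguments:

```java
import java.util.Properties;

import org.springframework.boot.SpringApplication;
import org.springframework.context.ConfigurableApplicationContext;

public class MainApplication {
    public static void main(String[] args) {
        // Sketch: set spring.batch.job.enabled=false as a default property
        // before the application context starts.
        Properties defaults = new Properties();
        defaults.setProperty("spring.batch.job.enabled", "false");

        SpringApplication app = new SpringApplication(TestConfiguration.class);
        app.setDefaultProperties(defaults);
        ConfigurableApplicationContext ctx = app.run(args);
    }
}
```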
pom.xml — I am using the Spring Boot dependency via dependencyManagement, not as the parent:
<dependencyManagement>
    <dependencies>
        <!-- Import dependency management for Spring Boot from here -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-parent</artifactId>
            <version>1.2.4.RELEASE</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
Answer 1:
I was able to figure out what was going on. I am using a custom reader/processor/writer. When the Spring Boot application starts, it tries to do dependency injection of these custom beans, in which I have written some application logic.
Example
TestConfiguration.class
@Configuration
@EnableBatchProcessing
public class TestConfiguration {

    @Bean
    @Conditional(EmployeeCondition.class) // must reference a Condition implementation, not the Employee domain class
    public ItemWriter<Employee> writer_employee(DataSource dataSource) throws IOException {
        FlatFileItemWriter<Employee> writer = new FlatFileItemWriter<Employee>();
        writer.setResource(new FileSystemResource(FinanceReportUtil.createFile("Employee.csv")));
        writer.setHeaderCallback(new FlatFileHeaderCallback() {
            @Override
            public void writeHeader(Writer writer) throws IOException {
                writer.write("id, name");
            }
        });
        DelimitedLineAggregator<Employee> delLineAgg = new DelimitedLineAggregator<Employee>();
        delLineAgg.setDelimiter(",");
        BeanWrapperFieldExtractor<Employee> fieldExtractor = new BeanWrapperFieldExtractor<Employee>();
        fieldExtractor.setNames(new String[]{"id", "name"});
        delLineAgg.setFieldExtractor(fieldExtractor);
        writer.setLineAggregator(delLineAgg);
        return writer;
    }

    @Bean
    @Conditional(ManagerCondition.class) // likewise, reference the Condition class, not Manager itself
    public ItemWriter<Person> writer_manager(DataSource dataSource) throws IOException {
        // Does the same logic as employee
    }

    // Also has job and step etc.
}
It will create the file even with spring.batch.job.enabled=false. To overcome this, I added custom logic to decide whether or not to inject the beans, as below.
application.properties
# all, manager, employee
person=manager
ManagerCondition.class
public class ManagerCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        String person = context.getEnvironment().getProperty("person");
        return "manager".equals(person); // null-safe: a missing property disables the bean instead of throwing a NPE
    }
}
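The employee writer needs its own Condition implementation in the same style. The class and method names below are hypothetical, mirroring ManagerCondition; the decision is pulled into a plain static method so it can also cover the "all" value from application.properties, which is assumed here to enable every writer:

```java
// Hypothetical helper mirroring ManagerCondition for the employee writer.
// An "EmployeeCondition implements Condition" would delegate its matches()
// call to shouldCreateEmployeeWriter(context.getEnvironment().getProperty("person")).
public class EmployeeConditionLogic {

    public static boolean shouldCreateEmployeeWriter(String person) {
        // "all" is assumed to enable both writers; null-safe so a missing
        // property simply disables the bean instead of throwing a NPE.
        return "employee".equals(person) || "all".equals(person);
    }
}
```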
Answer 2:
I also faced the same issue: the property spring.batch.job.enabled=false was not recognised at startup when set in the properties file. It could be because the properties might not have been loaded into the context before the batch initialised.
So I set spring.batch.job.enabled=false in standalone.xml as a system property, like below:
<system-properties>
    <property name="spring.batch.job.enabled" value="false"/>
</system-properties>
With this it worked successfully, and the Spring Batch jobs did not initialise on server startup.
Please note that the system-properties element must be placed right after the extensions tag in standalone.xml.
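This fix is consistent with Spring's property precedence: JVM system properties (set via -D or via standalone.xml system-properties) outrank values from application.properties. A plain-Java sketch of that lookup order, as a simplification of Spring's Environment; the names here are illustrative only:

```java
import java.util.Properties;

// Illustrates why a system property wins over a file-loaded property:
// System.getProperty is consulted first, the file value is only a fallback.
public class PropertyPrecedenceDemo {

    public static String resolve(Properties fileProps, String key) {
        String fromSystem = System.getProperty(key); // e.g. set via -D or standalone.xml
        return fromSystem != null ? fromSystem : fileProps.getProperty(key);
    }

    public static void main(String[] args) {
        Properties fileProps = new Properties();
        fileProps.setProperty("spring.batch.job.enabled", "true"); // value from application.properties
        System.setProperty("spring.batch.job.enabled", "false");   // simulates <system-properties>
        System.out.println(resolve(fileProps, "spring.batch.job.enabled")); // prints "false"
    }
}
```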
Source: https://stackoverflow.com/questions/31276011/spring-boot-spring-batch-job-enabled-false-not-able-to-recognize