spring-batch

Spring Batch: How to process multi-line log files

徘徊边缘 · Submitted on 2019-12-06 06:58:11
Question: I am trying to import the contents of a log file into a database using Spring Batch. I am currently using a FlatFileItemReader, but unfortunately there are many log entries it doesn't catch. The two main problems are: lines that contain multi-line JSON strings: 2012-03-22 11:47:35,307 DEBUG main someMethod(SomeClass.java:56): Do Something(18,true,null,null,null): my.json = '{ "Foo":"FooValue", "Bar":"BarValue", ... etc }' and lines that contain stack traces: 2012-03-22 11:47:50,596 ERROR main
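One common way to handle such entries (a sketch of the usual peek-ahead pattern, not code from the question; the class name and timestamp regex are assumptions) is to treat every line that does not start with a timestamp as a continuation of the previous record, by wrapping the raw line reader in a SingleItemPeekableItemReader:

    import org.springframework.batch.item.ExecutionContext;
    import org.springframework.batch.item.ItemStreamReader;
    import org.springframework.batch.item.support.SingleItemPeekableItemReader;

    public class MultiLineLogReader implements ItemStreamReader<String> {

        // A new log entry starts with "yyyy-MM-dd HH:mm:ss,SSS"
        private static final String TS = "^\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2},\\d{3}\\b.*";

        // Wraps a FlatFileItemReader<String> configured with a PassThroughLineMapper
        private SingleItemPeekableItemReader<String> delegate;

        public void setDelegate(SingleItemPeekableItemReader<String> delegate) {
            this.delegate = delegate;
        }

        @Override
        public String read() throws Exception {
            String line = delegate.read();
            if (line == null) {
                return null; // end of input
            }
            StringBuilder entry = new StringBuilder(line);
            // Append JSON continuations and stack-trace lines until the next timestamp.
            for (String next = delegate.peek(); next != null && !next.matches(TS); next = delegate.peek()) {
                entry.append('\n').append(delegate.read());
            }
            return entry.toString();
        }

        @Override
        public void open(ExecutionContext ctx) { delegate.open(ctx); }

        @Override
        public void update(ExecutionContext ctx) { delegate.update(ctx); }

        @Override
        public void close() { delegate.close(); }
    }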

Processing millions of database records in Java [closed]

雨燕双飞 · Submitted on 2019-12-06 06:46:31
Closed 7 years ago as not a good fit for the Q&A format. I have a requirement to write a batch job that fetches rows from a database table and, based on certain conditions, writes to other tables or updates the row with a certain value. We are using Spring and JDBC to fetch the result set and
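For volumes in the millions, the usual Spring Batch answer is a chunk-oriented step with a streaming cursor reader, so the rows are never all held in memory at once. A minimal sketch, with an illustrative table and columns:

    import java.util.Map;
    import javax.sql.DataSource;
    import org.springframework.batch.item.database.JdbcCursorItemReader;
    import org.springframework.jdbc.core.ColumnMapRowMapper;

    public class CursorReaderFactory {

        // Streams rows one at a time; pair this with a chunk size (commit interval)
        // of e.g. 1000 in the step definition so the writes are batched.
        public static JdbcCursorItemReader<Map<String, Object>> reader(DataSource dataSource) {
            JdbcCursorItemReader<Map<String, Object>> reader = new JdbcCursorItemReader<>();
            reader.setDataSource(dataSource);
            reader.setSql("SELECT id, status, amount FROM source_table"); // illustrative query
            reader.setRowMapper(new ColumnMapRowMapper());
            reader.setFetchSize(1000); // hint the JDBC driver to stream rather than buffer
            return reader;
        }
    }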

Spring Batch - create a new file each time instead of overriding it for transferring data from CSV to XML

狂风中的少年 · Submitted on 2019-12-06 06:12:05
I am new to Spring Batch. I was trying to move data from a CSV file to an XML file and was able to do so successfully. But each time I run the code, my XML output file gets overwritten, which I don't want; instead, I want a new output file created on each run (the old output files should remain, as they are needed for data-tracking purposes). How can I do that? Here is my code; what do I need to change in the file below? Let me know if you need more of my code. <beans xmlns="http://www.springframework.org/schema/beans" xmlns:batch="http://www.springframework.org/schema/batch" xmlns:xsi="http://www
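One common fix (a sketch, not the asker's code; the Person type, the marshaller bean, and the parameter name are assumptions) is to make the writer step-scoped and bake a job parameter such as a timestamp into the file name, so each run opens a fresh file:

    import org.springframework.batch.core.configuration.annotation.StepScope;
    import org.springframework.batch.item.xml.StaxEventItemWriter;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.core.io.FileSystemResource;
    import org.springframework.oxm.jaxb.Jaxb2Marshaller;

    @Bean
    @StepScope
    public StaxEventItemWriter<Person> xmlWriter(
            @Value("#{jobParameters['run.time']}") String runTime, // supplied at launch
            Jaxb2Marshaller marshaller) {
        StaxEventItemWriter<Person> writer = new StaxEventItemWriter<>();
        // A unique name per run, so earlier output files survive for tracking.
        writer.setResource(new FileSystemResource("output/records-" + runTime + ".xml"));
        writer.setRootTagName("records");
        writer.setMarshaller(marshaller);
        return writer;
    }

At launch, pass the parameter, e.g. new JobParametersBuilder().addString("run.time", String.valueOf(System.currentTimeMillis())).toJobParameters(). The same idea works in XML config by referencing #{jobParameters['run.time']} in the writer's resource under step scope.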

Spring batch 2.1.8 Run a single instance of a job, when a quartz cron trigger fires

做~自己de王妃 · Submitted on 2019-12-06 05:09:47
I have a Spring Batch plus Quartz setup with two jobs configured and running in parallel. The jobs read a file and then write some data to the DB. Here is the tricky part: the file names are calculated within the beforeJob() method of my execution listener, and after each job finishes, afterJob() calculates the next file name. File names follow the pattern xxxxxx.nnnn, where nn.. are numbers; sometimes the sequence has numbers missing, so I am attempting to "jump" over the missing ones and launch the job when I find a number that exists. I want to
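A sketch of one way to guarantee a single running instance per trigger firing (bean wiring and the parameter name are assumptions, not from the question): ask the JobExplorer whether an execution of the same job is still running, and skip the launch if so:

    import java.util.Set;
    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.batch.core.launch.JobLauncher;

    public class SingleInstanceLauncher {

        private final JobLauncher jobLauncher;
        private final JobExplorer jobExplorer;
        private final Job job;

        public SingleInstanceLauncher(JobLauncher jobLauncher, JobExplorer jobExplorer, Job job) {
            this.jobLauncher = jobLauncher;
            this.jobExplorer = jobExplorer;
            this.job = job;
        }

        public void launchIfIdle(String fileName) throws Exception {
            Set<JobExecution> running = jobExplorer.findRunningJobExecutions(job.getName());
            if (!running.isEmpty()) {
                return; // a previous firing is still working; let this trigger pass
            }
            jobLauncher.run(job, new JobParametersBuilder()
                    .addString("input.file", fileName)
                    .toJobParameters());
        }
    }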

Recommended approach for parallel spring batch jobs

守給你的承諾、 · Submitted on 2019-12-06 04:29:05
The Spring Batch Integration documentation explains how to use remote chunking and partitioning for steps; see http://docs.spring.io/spring-batch/trunk/reference/html/springBatchIntegration.html#externalizing-batch-process-execution . Our jobs do not consist of straightforward reader/processor/writer steps, so we simply want whole jobs running in parallel, with each job farmed out to a different partition. Is there already a pattern for this in Spring Batch, or would I need to implement my own JobLauncher to maintain a pool of slaves to launch jobs on? Cheers, Menno Spring Batch
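For parallelism on a single node, no custom JobLauncher is needed: the stock SimpleJobLauncher runs jobs asynchronously when given a task executor, and the pool size caps how many whole jobs run at once. A minimal sketch (pool size illustrative):

    import org.springframework.batch.core.launch.support.SimpleJobLauncher;
    import org.springframework.batch.core.repository.JobRepository;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

    @Configuration
    public class AsyncLauncherConfig {

        @Bean
        public SimpleJobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
            ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
            executor.setCorePoolSize(4); // number of jobs allowed to run concurrently
            executor.afterPropertiesSet();

            SimpleJobLauncher launcher = new SimpleJobLauncher();
            launcher.setJobRepository(jobRepository);
            launcher.setTaskExecutor(executor); // run() now returns immediately
            launcher.afterPropertiesSet();
            return launcher;
        }
    }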

Spring Batch Java Config transaction-attributes equivalent

杀马特。学长 韩版系。学妹 · Submitted on 2019-12-06 04:10:33
In Spring Batch you can set the transaction isolation and propagation like this: <job id="someJob" xmlns="http://www.springframework.org/schema/batch"> <step id="readWriteDate"> <tasklet transaction-manager="transactionManager"> <transaction-attributes isolation="DEFAULT" propagation="REQUIRED" timeout="30"/> <chunk reader="dbItemReader" processor="dbItemProcessor" writer="dbItemWriter" commit-interval="2" /> </tasklet> </step> </job> I can't find the Java config equivalent. Well, it is there: @Configuration public class StepWithTx { @Autowired private StepBuilderFactory steps; @Bean public Step
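The Java config equivalent (a sketch; the Item type is a placeholder, while the step and bean names come from the question's XML) builds a DefaultTransactionAttribute and passes it to the step builder:

    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.item.ItemProcessor;
    import org.springframework.batch.item.ItemReader;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.transaction.annotation.Isolation;
    import org.springframework.transaction.annotation.Propagation;
    import org.springframework.transaction.interceptor.DefaultTransactionAttribute;

    public Step readWriteDate(StepBuilderFactory steps,
                              ItemReader<Item> dbItemReader,
                              ItemProcessor<Item, Item> dbItemProcessor,
                              ItemWriter<Item> dbItemWriter) {
        // Mirrors <transaction-attributes isolation="DEFAULT" propagation="REQUIRED" timeout="30"/>
        DefaultTransactionAttribute attribute = new DefaultTransactionAttribute();
        attribute.setPropagationBehavior(Propagation.REQUIRED.value());
        attribute.setIsolationLevel(Isolation.DEFAULT.value());
        attribute.setTimeout(30);

        return steps.get("readWriteDate")
                .<Item, Item>chunk(2) // commit-interval="2"
                .reader(dbItemReader)
                .processor(dbItemProcessor)
                .writer(dbItemWriter)
                .transactionAttribute(attribute)
                .build();
    }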

Spring Batch Partitioning inject stepExecutionContext parameter in itemReader

…衆ロ難τιáo~ · Submitted on 2019-12-06 03:13:50
I am trying to learn Spring Batch with a Partitioner. The issue is that I need to set the file names dynamically from the Partitioner implementation and then pick them up in the itemReader, but the filename comes back null. My Spring Batch configuration: @Bean @StepScope public ItemReader<Transaction> itemReader(@Value("#{stepExecutionContext[filename]}") String filename) throws UnexpectedInputException, ParseException { FlatFileItemReader<Transaction> reader = new FlatFileItemReader<Transaction>(); DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer(); String[] tokens = { "username",
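For reference, a minimal Partitioner sketch (file paths illustrative) that puts the key the reader expects into each partition's ExecutionContext; if the key is never put there, #{stepExecutionContext[filename]} resolves to null, which matches the symptom described:

    import java.util.HashMap;
    import java.util.Map;
    import org.springframework.batch.core.partition.support.Partitioner;
    import org.springframework.batch.item.ExecutionContext;

    public class FilePartitioner implements Partitioner {

        private final String[] files = {"input/tx-1.csv", "input/tx-2.csv"}; // illustrative paths

        @Override
        public Map<String, ExecutionContext> partition(int gridSize) {
            Map<String, ExecutionContext> partitions = new HashMap<>();
            for (int i = 0; i < files.length; i++) {
                ExecutionContext ctx = new ExecutionContext();
                ctx.putString("filename", files[i]); // key must match stepExecutionContext[filename]
                partitions.put("partition" + i, ctx);
            }
            return partitions;
        }
    }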

Skip header, body and footer lines from file on Spring Batch

青春壹個敷衍的年華 · Submitted on 2019-12-06 03:07:41
I have this specific file:

H;COD;CREATION_DATE;TOT_POR;TYPE
H;001;2013-10-30;20;R
D;DETAIL_VALUE;PROP_VALUE
D;003;3030
D;002;3031
D;005;3032
T;NUM_FOL;TOT
T;1;503.45

As you can see, it has header/body/footer lines. I'm looking for an ItemReader that skips those lines. I've written the ItemReader below, which identifies them using PatternMatchingCompositeLineMapper. <bean id="fileReader" class="org.springframework.batch.item.file.FlatFileItemReader"> <property name="resource" ref="myFileReference" /> <property name="lineMapper"> <bean class="org.springframework.batch.item.file.mapping
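Since PatternMatchingCompositeLineMapper must map every matched line to something (a null from the line mapper would be taken as end of input), one workable pattern is to map H/T lines to a shared marker record and drop it in the processor, where returning null filters the item out. A sketch; the Detail type and field positions are assumptions:

    import java.util.HashMap;
    import java.util.Map;
    import org.springframework.batch.item.ItemProcessor;
    import org.springframework.batch.item.file.mapping.FieldSetMapper;
    import org.springframework.batch.item.file.mapping.PatternMatchingCompositeLineMapper;
    import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
    import org.springframework.batch.item.file.transform.LineTokenizer;

    public class HeaderFooterMapping {

        public static class Detail {
            final String value;
            final String prop;
            Detail(String value, String prop) { this.value = value; this.prop = prop; }
        }

        // Marker instance standing in for header/trailer lines.
        static final Detail SKIP = new Detail(null, null);

        public PatternMatchingCompositeLineMapper<Detail> lineMapper() {
            PatternMatchingCompositeLineMapper<Detail> mapper = new PatternMatchingCompositeLineMapper<>();

            Map<String, LineTokenizer> tokenizers = new HashMap<>();
            tokenizers.put("H*", new DelimitedLineTokenizer(";"));
            tokenizers.put("D*", new DelimitedLineTokenizer(";"));
            tokenizers.put("T*", new DelimitedLineTokenizer(";"));
            mapper.setTokenizers(tokenizers);

            Map<String, FieldSetMapper<Detail>> mappers = new HashMap<>();
            mappers.put("D*", fs -> new Detail(fs.readString(1), fs.readString(2)));
            mappers.put("H*", fs -> SKIP); // header carries no item data
            mappers.put("T*", fs -> SKIP); // trailer carries no item data
            mapper.setFieldSetMappers(mappers);
            return mapper;
        }

        // Returning null from the processor removes the item from the chunk.
        public ItemProcessor<Detail, Detail> dropHeaderAndTrailer() {
            return item -> item == SKIP ? null : item;
        }
    }

Note that the column-header row inside the detail section (D;DETAIL_VALUE;PROP_VALUE) also matches "D*", so it would need one more rule or a value check in the processor.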

update on duplicate key with JdbcBatchItemWriter

匆匆过客 · Submitted on 2019-12-06 02:10:17
I'm using Spring Batch for my project, and I'm simply trying to read from a CSV file and load the data into a database, using JdbcBatchItemWriter as the writer. I'm looking for a way to tell the writer to insert a new row but, on a duplicate key (or duplicate unique identifier), update the row instead of failing. I know I can do that directly in the SQL statement, but that would be specific to MySQL, and I want my code to be DBMS-independent. Here is my writer declaration in the Java config: @Bean @StepScope public ItemWriter<Person> writerHeadingCollectionsToDb(DataSource datasource) { String sqlStatement = "INSERT
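A portable alternative is a hand-rolled "update, else insert" writer, which avoids vendor syntax such as MySQL's ON DUPLICATE KEY UPDATE at the cost of two statements per miss. A sketch; the table, columns, and Person accessors are assumptions:

    import java.util.List;
    import javax.sql.DataSource;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.jdbc.core.JdbcTemplate;

    public class UpsertPersonWriter implements ItemWriter<Person> {

        private final JdbcTemplate jdbc;

        public UpsertPersonWriter(DataSource dataSource) {
            this.jdbc = new JdbcTemplate(dataSource);
        }

        @Override
        public void write(List<? extends Person> items) {
            for (Person p : items) {
                // Try the update first; if no row matched the key, insert instead.
                int updated = jdbc.update(
                        "UPDATE person SET first_name = ?, last_name = ? WHERE id = ?",
                        p.getFirstName(), p.getLastName(), p.getId());
                if (updated == 0) {
                    jdbc.update(
                            "INSERT INTO person (id, first_name, last_name) VALUES (?, ?, ?)",
                            p.getId(), p.getFirstName(), p.getLastName());
                }
            }
        }
    }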

How to read and process multiple files concurrently in spring?

徘徊边缘 · Submitted on 2019-12-06 02:05:43
I am new to the Spring framework, and while working on a simple Spring project I got stuck. In my project I read files from a directory using a Spring poller, then process each file through various channels and send it to a queue. The problem is that the file-inbound-channel-adapter (which I'm using) reads only one file at a time, so I need a solution that reads and processes multiple files at a time. Is there any way to implement multithreading in Spring Integration? Thank you. Answer: Add a task-executor to the poller; see the documentation. You can control the
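A minimal Java-config sketch of that answer (directory path, channel name, and pool size are illustrative): back the poller with a task executor so several polled files are handed off and processed concurrently:

    import java.io.File;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.annotation.InboundChannelAdapter;
    import org.springframework.integration.annotation.Poller;
    import org.springframework.integration.config.EnableIntegration;
    import org.springframework.integration.core.MessageSource;
    import org.springframework.integration.file.FileReadingMessageSource;
    import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

    @Configuration
    @EnableIntegration
    public class FilePollingConfig {

        @Bean
        public ThreadPoolTaskExecutor fileTaskExecutor() {
            ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
            executor.setCorePoolSize(4); // up to 4 files in flight at once
            return executor;
        }

        @Bean
        @InboundChannelAdapter(value = "filesIn",
                poller = @Poller(fixedDelay = "1000", maxMessagesPerPoll = "4",
                                 taskExecutor = "fileTaskExecutor"))
        public MessageSource<File> fileSource() {
            FileReadingMessageSource source = new FileReadingMessageSource();
            source.setDirectory(new File("/tmp/input")); // illustrative directory
            return source;
        }
    }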