spring-cloud-task

How can the Spring Cloud Data Flow server use new tables (with a custom prefix) created for Spring Batch and Spring Cloud Task?

Submitted by 雨燕双飞 on 2021-02-19 07:47:07
Question: I have created the Spring Cloud Task tables (i.e. TASK_EXECUTION, TASK_TASK_BATCH) with the prefix MYTASK_ and the Spring Batch tables with the prefix MYBATCH_ in an Oracle database. The default tables are also present in the same schema; they were created automatically or by another teammate. I have bound my Oracle database service to the SCDF server deployed on PCF. How can I tell my Spring Cloud Data Flow server to use the tables created with my prefix to render data on the Data Flow dashboard? Currently, SCDF…
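
Not part of the original question, but as a rough sketch of where such prefixes are usually configured on the task application side: Spring Cloud Task reads spring.cloud.task.tablePrefix and Spring Boot's batch support reads spring.batch.table-prefix (spring.batch.jdbc.table-prefix on newer Boot versions). Whether the SCDF server dashboard honours the same prefixes depends on the server version and its own configuration, so the snippet below only illustrates a task application being launched against the MYTASK_/MYBATCH_ tables; the class name is illustrative.

```java
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.cloud.task.configuration.EnableTask;

@SpringBootApplication
@EnableTask
public class PrefixedTaskApplication {

    public static void main(String[] args) {
        // Point Spring Cloud Task and Spring Batch at the custom-prefixed tables.
        // Property names are the standard ones; verify them against the versions in use.
        new SpringApplicationBuilder(PrefixedTaskApplication.class)
                .properties(
                        "spring.cloud.task.tablePrefix=MYTASK_",
                        "spring.batch.table-prefix=MYBATCH_")
                .run(args);
    }
}
```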

Registering Custom Spring Cloud Task with Spring Cloud Data Flow

Submitted by 痞子三分冷 on 2021-02-08 05:02:31
Question: I'm getting started with Spring Cloud Data Flow and want to implement a simple Spring Cloud Task to use with it. I created a hello world example from the documentation. When I run it in my IDE, it executes without any problems and prints 'hello world'. It uses the following JDBC connection: o.s.j.datasource.SimpleDriverDataSource : Creating new JDBC Driver Connection to [jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=false]. I use the dockerized local Spring Cloud Data Flow server…
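
For reference, a hello world task along the lines of the documentation example looks roughly like the sketch below (class and bean names are illustrative). By default it runs against the embedded H2 database shown in the log line above, which is why it works in the IDE without any external configuration.

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
public class HelloWorldTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(HelloWorldTaskApplication.class, args);
    }

    @Bean
    public CommandLineRunner helloWorld() {
        // Executed once per task run; the TASK_EXECUTION row records the outcome.
        return args -> System.out.println("hello world");
    }
}
```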

Dynamically change maxWorkers with DeployerPartitionHandler

Submitted by 我们两清 on 2021-01-29 13:32:25
Question: Based on the number of partitions returned, can maxWorkers be changed dynamically (at runtime) when using DeployerPartitionHandler? Regards, Balu. UPDATE: Here is my use case. The batch execution starts for a normal business day with maxWorkers set to 4 and 40 partitions returned, but suddenly the load increases and that particular run of the batch returns 4 times the normal number of partitions (160). How do I increase maxWorkers here (say to 16)? Also, if I always start the batch with a very high limit for…
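
A rough sketch of one way to derive the worker cap from the expected partition count when the PartitionHandler bean is built. The sizing policy, the partition.count property, and the bean wiring are all illustrative, and the DeployerPartitionHandler constructor arguments vary between Spring Cloud Task versions:

```java
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.partition.PartitionHandler;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.cloud.deployer.spi.task.TaskLauncher;
import org.springframework.cloud.task.batch.partition.DeployerPartitionHandler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.Resource;

@Configuration
public class PartitionHandlerConfig {

    @Bean
    public PartitionHandler partitionHandler(TaskLauncher taskLauncher,
                                             JobExplorer jobExplorer,
                                             Resource workerResource,
                                             @Value("${partition.count:40}") int partitionCount) {
        DeployerPartitionHandler handler =
                new DeployerPartitionHandler(taskLauncher, jobExplorer, workerResource, "workerStep");
        // Illustrative policy: roughly one worker per ten partitions,
        // never fewer than 4 and never more than 16.
        handler.setMaxWorkers(Math.min(16, Math.max(4, partitionCount / 10)));
        return handler;
    }
}
```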

spring-data-flow task example

Submitted by *爱你&永不变心* on 2021-01-28 01:43:07
Question: I'm using spring-cloud-dataflow with the task module, but I'm having some trouble launching a simple example in a container. I followed the small example in section 6.3 (writing the code) and deployed it, but when I try to execute it, it throws a java.lang.IllegalArgumentException: Invalid TaskExecution, ID 1 not found at org.springframework.util.Assert.notNull(Assert.java:134) at org.springframework.cloud.task.listener.TaskLifecycleListener.doTaskStart(TaskLifecycleListener.java:200). In my evaluation I've used the Spring Boot example…
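
For orientation (not part of the original post): this assertion in TaskLifecycleListener typically fires when the launcher passes --spring.cloud.task.executionid=1 but no matching TASK_EXECUTION row exists in the database the task application itself connects to, i.e. the task and the Data Flow server are looking at different databases. A hedged sketch of launching the task against the server's datasource, with placeholder URL and credentials:

```java
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.cloud.task.configuration.EnableTask;

@SpringBootApplication
@EnableTask
public class SampleTaskApplication {

    public static void main(String[] args) {
        // Placeholder datasource settings: the point is that the task must read the
        // same schema in which the Data Flow server created the TASK_EXECUTION row.
        new SpringApplicationBuilder(SampleTaskApplication.class)
                .properties(
                        "spring.datasource.url=jdbc:mysql://dataflow-db:3306/dataflow",
                        "spring.datasource.username=spring",
                        "spring.datasource.password=secret")
                .run(args);
    }
}
```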

Spring Batch flow job vs. Spring composed task

Submitted by 江枫思渺然 on 2021-01-21 08:57:25
Question: I want to execute my apps using spring-complex-task, and I have already built complex Spring Batch flow jobs which execute perfectly fine. Could you please explain the difference between a Spring Batch flow job and a Spring composed task, and which of them is best? Answer 1: A composed task within Spring Cloud Data Flow is actually built on Spring Batch, in that the transition from task to task is managed by a dynamically generated Spring Batch job. This model allows the decomposition of a batch…
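
To make the comparison concrete, the sketch below shows a plain Spring Batch flow job with a conditional transition; a composed task expresses a similar orchestration with a Data Flow definition such as "taskA && taskB", and, as the answer notes, the server generates a Spring Batch job behind the scenes to drive it. Bean and step names here are illustrative:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlowJobConfig {

    @Bean
    public Job flowJob(JobBuilderFactory jobs, Step stepA, Step stepB, Step recoveryStep) {
        // Divert to recoveryStep if stepA fails, otherwise continue to stepB.
        return jobs.get("flowJob")
                .start(stepA)
                    .on("FAILED").to(recoveryStep)
                .from(stepA)
                    .on("*").to(stepB)
                .end()
                .build();
    }
}
```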

Difference between Spring Cloud Task and Spring Batch?

Submitted by こ雲淡風輕ζ on 2020-08-06 06:33:04
Question: I went through "Introducing Spring Cloud Task", but things are not clear regarding the following questions. I'm using Spring Batch. 1) What is the use of Spring Cloud Task when we already have the metadata provided by Spring Batch? 2) We're planning to use Spring Cloud Data Flow to monitor Spring Batch. All the batch jobs can be imported into SCDF as tasks and scheduled there, but I don't see support for MongoDB. I hope MySQL works well. What is the difference between Spring Cloud Task…
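
In code terms, the part Spring Cloud Task adds on top of Spring Batch shows up in a sketch like the one below: annotating a batch application with @EnableTask makes each launch also be recorded as a task execution (start/end time, exit code) and linked to the batch job via the TASK_TASK_BATCH table, which is the metadata Spring Cloud Data Flow reads. That metadata lives in a relational database, which is consistent with the observation above that MySQL works while MongoDB support is missing. The class name is illustrative:

```java
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

@SpringBootApplication
@EnableTask              // records TASK_EXECUTION rows and links them to batch jobs
@EnableBatchProcessing   // standard Spring Batch metadata (BATCH_JOB_EXECUTION, ...)
public class BatchAsTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchAsTaskApplication.class, args);
    }
}
```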

A job instance already exists and is complete for parameters={-spring.cloud.task.executionid=2}. If you want to run this job again, change the parameters

Submitted by ≡放荡痞女 on 2020-05-28 08:31:45
Question: I am working on Spring Cloud Data Flow and Spring Batch, taking https://github.com/spring-cloud/spring-cloud-task/tree/master/spring-cloud-task-samples as a reference. I'm executing the batch-job sample; when I executed this example twice, the first run worked fine but I observed the error on the second run. I started the spring-cloud-dataflow-server-local server using the commands below, and it created all the metadata for me (highlighted in yellow). Error: exitCode=null, taskName='batch-job'…
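
For context (not part of the original excerpt): parameters prefixed with '-' such as -spring.cloud.task.executionid are non-identifying, so two launches with otherwise identical parameters map to the same completed JobInstance, and Spring Batch rejects the second run. One common workaround, sketched below with illustrative names, is to add a RunIdIncrementer (or otherwise vary an identifying parameter) so that every launch creates a new job instance:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchJobConfig {

    @Bean
    public Job batchJob(JobBuilderFactory jobs, Step step) {
        return jobs.get("batchJob")
                // run.id is incremented on every launch, so each run gets a new JobInstance
                .incrementer(new RunIdIncrementer())
                .start(step)
                .build();
    }
}
```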