spring-cloud-dataflow

How can Spring Cloud Data Flow Server use new tables (with a custom prefix) created for Spring Batch and Spring Cloud Task?

雨燕双飞 submitted on 2021-02-19 07:47:07
Question: I have created Spring Cloud Task tables, i.e. TASK_EXECUTION and TASK_TASK_BATCH, with the prefix MYTASK_, and Spring Batch tables with the prefix MYBATCH_, in an Oracle database. The default tables are also present in the same schema; they were created automatically or by another teammate. I have bound my Oracle database service to the SCDF server deployed on PCF. How can I tell my Spring Cloud Data Flow server to use the tables created with my prefix to render data on the dashboard? Currently, SCDF…
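A minimal sketch of pointing the server at the prefixed tables, assuming the standard Spring Cloud Task and Spring Batch prefix properties are honored when set as environment config on the SCDF server app; whether the server's dashboard queries respect them is something to verify against your SCDF version:

    # Sketch: set the prefix properties on the SCDF server app on PCF.
    # spring.cloud.task.tablePrefix and spring.batch.table-prefix are the
    # standard client-side properties; server-side support for rendering
    # from prefixed tables is an assumption to verify.
    cf set-env dataflow-server SPRING_CLOUD_TASK_TABLEPREFIX MYTASK_
    cf set-env dataflow-server SPRING_BATCH_TABLE_PREFIX MYBATCH_
    cf restage dataflow-server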

Spring Cloud Data Flow on Kubernetes: error on deploy

江枫思渺然 submitted on 2021-02-11 15:48:05
Question: I am working on a Spring Cloud Data Flow stream app. I am able to run the Spring Cloud Data Flow server with Skipper running in Cloud Foundry. Now I am trying to run the same setup with Skipper running in a Kubernetes cluster, and I get the error below on deployment, even though I am explicitly giving the username in the environment config in the deployment. Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: kubernetes_cluster_url:6443/api/v1/namespaces/pocdev…
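The fabric8 Kubernetes client used by the deployer typically authenticates with a service account token rather than a username/password pair, so a missing or under-privileged service account is a common cause of a GET failure like this. A sketch of wiring one up, where the scdf and pocdev names are placeholders:

    # Sketch: create a dedicated service account and grant it edit rights
    # in the target namespace; names are placeholders.
    kubectl create serviceaccount scdf -n pocdev
    kubectl create rolebinding scdf-edit \
        --clusterrole=edit \
        --serviceaccount=pocdev:scdf \
        -n pocdev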

Monitoring custom stream apps in Spring Cloud Data Flow

笑着哭i submitted on 2021-02-11 09:56:03
Question: I am trying out SCDF and its monitoring with Prometheus and Grafana. I followed the available documentation, was able to deploy the sample stream, and can see its metrics in Grafana. I have created a stream with a custom stream app (other than the supplied RabbitMQ starter apps). Stream: http | participant | log. But I am not able to see the "participant" application's metrics in Grafana, while the metrics of the http and log apps are visible. I added the properties below in application.properties…
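Custom apps do not inherit the Micrometer setup that the prebuilt starters ship with, so the app needs the Prometheus RSocket registry on its classpath and the metrics properties enabled at deploy time. A sketch of the deploy-time properties from the Data Flow shell; the proxy host name is an assumption taken from the standard SCDF monitoring setup:

    # Sketch (Data Flow shell): enable Micrometer export for the custom app.
    # Assumes micrometer-registry-prometheus and prometheus-rsocket-spring
    # are on the app's classpath; the proxy host is an assumption.
    stream deploy mystream --properties "app.participant.management.metrics.export.prometheus.enabled=true,app.participant.management.metrics.export.prometheus.rsocket.enabled=true,app.participant.management.metrics.export.prometheus.rsocket.host=prometheus-rsocket-proxy"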

Where to set environment variables for local Spring Cloud Data Flow?

杀马特。学长 韩版系。学妹 submitted on 2021-02-10 11:51:52
Question: For development, I'm using the local Spring Cloud Data Flow server on my Mac, though we plan to deploy to a Kubernetes cluster for integration testing and production. The SCDF docs say you can use environment variables to configure various things, like the database configuration. I'd like my registered app to use these env variables, but it doesn't seem to be able to see them. That is, I start the SCDF server by running its jar from a terminal window, which can see a set of environment variables…
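The local deployer launches each registered app in its own JVM and only forwards environment variables whose names match its envVarsToInherit patterns, which would explain why the apps cannot see what the server's terminal sees. A sketch, where the stream name and the MY_.* pattern are placeholders:

    # Sketch (Data Flow shell): forward matching env vars to launched apps.
    # envVarsToInherit takes regex patterns; mystream and MY_.* are
    # placeholders for your own names.
    stream deploy mystream --properties "deployer.*.local.envVarsToInherit=MY_.*,SPRING_.*"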

Spring Cloud Data Flow Grafana Prometheus not showing stream data

江枫思渺然 submitted on 2021-02-08 15:12:38
Question: I launched Spring Cloud Data Flow with docker-compose based on this page: https://dataflow.spring.io/docs/installation/local/docker/. I created 3 apps: Source, Processor & Sink. I ran export STREAM_APPS_URI=https://dataflow.spring.io/Einstein-BUILD-SNAPSHOT-stream-applications-kafka-maven. When I run docker-compose -f ./docker-compose.yml -f ./docker-compose-prometheus.yml up, all my containers start up as specified in docker-compose.yml and docker-compose-prometheus.yml. I proceed to…
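One thing worth checking is that the registered app starters actually ship Micrometer support; the Einstein-era snapshot apps may predate it. A sketch of re-running with a metrics-capable apps URI, where the version values are placeholders to adapt:

    # Sketch: point STREAM_APPS_URI at apps that export Micrometer metrics
    # before bringing compose up; version values are placeholders.
    export DATAFLOW_VERSION=2.7.1
    export SKIPPER_VERSION=2.6.1
    export STREAM_APPS_URI=https://dataflow.spring.io/kafka-maven-latest
    docker-compose -f ./docker-compose.yml -f ./docker-compose-prometheus.yml up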

Dockerfile used for building the spring-cloud-dataflow-server image

一世执手 submitted on 2021-02-08 06:32:51
Question: I have downloaded the Spring Cloud Data Flow server code from GitHub at https://github.com/spring-cloud/spring-cloud-dataflow. I am trying to understand how the Docker image for this server is built, but I am unable to find a Dockerfile in the codebase. The reference documentation section "Adding a Custom JDBC Driver" calls for modifying pom.xml and rebuilding, with very little detail. I need to use a custom jar and rebuild the image. I have already looked into this post: https://github.com/spring-cloud…
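Since no Dockerfile appears to be checked in (the image seems to be produced by the Maven build itself), one workaround is to rebuild the server jar with your driver added to the pom and wrap it in a hand-written Dockerfile. A sketch, where the base image, jar path, and image tag are all assumptions:

    # Sketch: rebuild the jar, then wrap it in your own Dockerfile.
    # Base image, jar path, and image tag are assumptions.
    ./mvnw clean package -DskipTests
    cat > Dockerfile <<'EOF'
    FROM eclipse-temurin:8-jre
    COPY spring-cloud-dataflow-server/target/spring-cloud-dataflow-server-*.jar /app.jar
    ENTRYPOINT ["java", "-jar", "/app.jar"]
    EOF
    docker build -t myorg/spring-cloud-dataflow-server .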

Registering Custom Spring Cloud Task with Spring Cloud Data Flow

痞子三分冷 submitted on 2021-02-08 05:02:31
Question: I'm getting started with Spring Cloud Data Flow and want to implement a simple Spring Cloud Task to use with it. I created a hello-world example from the documentation. When I run it in my IDE it executes without any problems and prints 'hello world'. It uses the following JDBC connection: o.s.j.datasource.SimpleDriverDataSource : Creating new JDBC Driver Connection to [jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=false]. I use the dockerized local Spring Cloud Data Flow server…
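A sketch of registering and launching the task from the Data Flow shell; the Maven coordinates and names are placeholders, and note that the H2 in-memory database the task uses standalone is not the database the dockerized server reads task executions from:

    # Sketch (Data Flow shell): register, define, and launch the task.
    # Maven coordinates and names are placeholders.
    app register --name hello-world --type task --uri maven://com.example:hello-world-task:0.0.1-SNAPSHOT
    task create hello1 --definition "hello-world"
    task launch hello1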

Problem synchronizing 'bucket' to local directory with Spring Cloud DataFlow Streams

不打扰是莪最后的温柔 submitted on 2021-01-29 09:28:41
Question: I'm following this case study, which is similar to mine, where I want to receive thousands of files in an S3 bucket and launch the batch task that will consume them. But I'm getting: Problem occurred while synchronizing 'bucket' to local directory; nested exception is org.springframework.messaging.MessagingException: Failed to execute on session; nested exception is com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied;…
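A 403 AccessDenied from the synchronizer usually means the credentials the s3 source resolves are missing or lack s3:ListBucket/s3:GetObject on the bucket. A sketch of passing explicit Spring Cloud AWS credentials at deploy time, where the key, secret, and region values are placeholders:

    # Sketch (Data Flow shell): give the s3 source explicit credentials.
    # Values are placeholders; the IAM user still needs s3:ListBucket and
    # s3:GetObject on the bucket.
    stream deploy s3stream --properties "app.s3.cloud.aws.credentials.access-key=<access-key>,app.s3.cloud.aws.credentials.secret-key=<secret-key>,app.s3.cloud.aws.region.static=us-east-1,app.s3.cloud.aws.stack.auto=false"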