spring-integration

Spring Integration with Jackson ObjectMapper and Java 8 Time (JSR-310)

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-20 03:17:15
Question: I am struggling with configuring a "custom" ObjectMapper to be used by the Spring Integration DSL transformers. I receive JSON representations containing java.time.Instant values that I would like to parse into object properties, e.g.: {"type": "TEST", "source":"TEST", "timestamp":{"epochSecond": 1454503381, "nano": 335000000}}. The message is a Kafka message, which raises a question: should I write a custom serializer implementing Kafka encoders/decoders in order to be able to transform the Kafka message to the …
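
A minimal sketch of one way to hand a JSR-310-aware ObjectMapper to a DSL JSON transformer (the channel name and the MyEvent payload type are assumptions, and whether JavaTimeModule alone parses the epochSecond/nano shape depends on the Jackson version; a custom Instant deserializer may still be needed):

    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.integration.json.JsonToObjectTransformer;
    import org.springframework.integration.support.json.Jackson2JsonObjectMapper;

    @Configuration
    public class EventFlowConfig {

        @Bean
        public IntegrationFlow eventFlow() {
            // Jackson mapper with the Java 8 date/time (JSR-310) module registered
            ObjectMapper mapper = new ObjectMapper().registerModule(new JavaTimeModule());

            return IntegrationFlows.from("eventsIn") // hypothetical input channel
                    // hand the customized mapper to the JSON-to-object transformer
                    .transform(new JsonToObjectTransformer(MyEvent.class, // hypothetical payload type
                            new Jackson2JsonObjectMapper(mapper)))
                    .handle(m -> System.out.println(m.getPayload()))
                    .get();
        }
    }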

Spring Cloud Dataflow Type conversion not working in processor component?

Submitted by 依然范特西╮ on 2019-12-20 02:55:06
Question: I have a processor which transforms byte[] payloads into MyClass payloads:

    @Slf4j
    @EnableBinding(Processor.class)
    public class MyDecoder {

        @ServiceActivator(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
        public MyClass decode(final byte[] payload) {
            MyClass decoded = doStuff(payload);
            if (decoded != null) {
                log.info("Successfully decoded!");
            }
            return decoded;
        }
    }

I tried creating the following stream DSL: some-source | my-decoder | some-sink, and some-sink reports errors because …
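
A hedged sketch of one frequently suggested direction (an assumption, not a confirmed fix for this stream): declare the processor's output content type so the MyClass payload is serialized as JSON before it reaches some-sink. The property normally lives in application.properties; it is set programmatically here only to keep the example in Java.

    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.boot.builder.SpringApplicationBuilder;

    @SpringBootApplication
    public class MyDecoderApplication {

        public static void main(String[] args) {
            new SpringApplicationBuilder(MyDecoderApplication.class)
                    // equivalent to putting
                    // spring.cloud.stream.bindings.output.contentType=application/json
                    // in application.properties
                    .properties("spring.cloud.stream.bindings.output.contentType=application/json")
                    .run(args);
        }
    }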

Spring Integration DSL ErrorHandling

Submitted by 这一生的挚爱 on 2019-12-20 02:43:08
Question: As the title says, I'm looking for a good example of error handling within a DSL flow. Specifically, I'm looking to handle errors from a service activator. Example:

    IntegrationFlows.from(Amqp.inboundAdapter(simpleMessageListenerContainer()))
            .transform(new JsonToObjectTransformer(AlbumDescriptor.class))
            .handle(AlbumDescriptor.class, (p, h) -> transformXml(p))
            .transform(new ObjectToJsonTransformer())
            .handle(Amqp.outboundAdapter(rabbitTemplate).routingKey("xml-transformed"))
            .get();

If my …
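
A minimal sketch of one common approach (not taken from the question; channel and bean names are assumptions, method names per recent Spring Integration versions): attach an ExpressionEvaluatingRequestHandlerAdvice to the failing .handle() step and send failures to a dedicated channel.

    import org.aopalliance.aop.Advice;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.integration.handler.advice.ExpressionEvaluatingRequestHandlerAdvice;

    @Configuration
    public class TransformErrorConfig {

        @Bean
        public Advice transformFailureAdvice() {
            ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
            advice.setOnFailureExpressionString("payload");          // what to send on failure
            advice.setFailureChannelName("transformErrorChannel");   // hypothetical error channel
            advice.setTrapException(true);                           // do not rethrow after routing
            return advice;
        }

        @Bean
        public IntegrationFlow transformErrorFlow() {
            return IntegrationFlows.from("transformErrorChannel")
                    .handle(m -> System.err.println("transformXml failed for: " + m.getPayload()))
                    .get();
        }
    }

    // ...and on the service-activator step in the main flow:
    // .handle(AlbumDescriptor.class, (p, h) -> transformXml(p),
    //         e -> e.advice(transformFailureAdvice()))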

Spring Cloud Stream and @Publisher annotation compatibility

Submitted by 我是研究僧i on 2019-12-20 02:40:09
Question: Since Spring Cloud Stream does not have an annotation for sending a new message to a stream (@SendTo only works when @StreamListener is declared), I tried to use a Spring Integration annotation for that purpose, namely @Publisher. Because @Publisher takes a channel, and @EnableBinding from Spring Cloud Stream can bind an output channel declared with the @Output annotation, I tried to mix them in the following way:

    @EnableBinding(MessageSource.class)
    @Service
    public class ExampleService {

        @Publisher …
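
A minimal sketch of the combination being attempted, assuming the standard Source binding rather than the poster's MessageSource interface; the service method is hypothetical, and @EnablePublisher (which can also sit on a @Configuration class) is needed so @Publisher methods are post-processed:

    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.messaging.Source;
    import org.springframework.integration.annotation.Publisher;
    import org.springframework.integration.config.EnablePublisher;
    import org.springframework.stereotype.Service;

    @EnableBinding(Source.class)
    @EnablePublisher
    @Service
    public class ExampleService {

        // the return value is also published to the channel bound as "output"
        @Publisher(channel = Source.OUTPUT)
        public String process(String input) {
            return input.toUpperCase();
        }
    }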

Error handling in Spring integration flow async

Submitted by 放肆的年华 on 2019-12-20 01:58:09
Question: I have the following Spring Integration configuration that allows me to call a gateway method from an MVC controller and let the controller return, while the integration flow continues on its own in a separate thread that does not block the controller. However, I cannot figure out how to get my error handler to work for this async flow. My gateway has an error channel defined, but my exceptions do not reach it for some reason. Instead, I see that LoggingHandler gets invoked. @Bean IntegrationFlow …
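
A minimal sketch of one way this is often handled (an assumption, not the poster's configuration; channel names and the service call are mine): once the flow hands off to another thread, an exception thrown there is sent to the channel named in the message's errorChannel header, falling back to the global errorChannel and its LoggingHandler, so setting that header before the hand-off routes failures to a custom error flow.

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.core.task.TaskExecutor;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.messaging.MessageHeaders;

    @Configuration
    public class AsyncFlowConfig {

        @Bean
        public IntegrationFlow asyncFlow(TaskExecutor taskExecutor) {
            return IntegrationFlows.from("asyncGatewayChannel")          // hypothetical gateway request channel
                    // route any downstream failure to our own error channel
                    .enrichHeaders(h -> h.header(MessageHeaders.ERROR_CHANNEL, "asyncErrorChannel", true))
                    .channel(c -> c.executor(taskExecutor))               // the async hand-off
                    .handle((payload, headers) -> riskyOperation(payload)) // hypothetical service call
                    .get();
        }

        @Bean
        public IntegrationFlow asyncErrorFlow() {
            return IntegrationFlows.from("asyncErrorChannel")
                    .handle(m -> System.err.println("Async flow failed: " + m.getPayload()))
                    .get();
        }

        private Object riskyOperation(Object payload) {  // stand-in for the real work
            return payload;
        }
    }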

How to create Spring Integration Flow from two MessageProducerSpec?

Submitted by  ̄綄美尐妖づ on 2019-12-20 01:53:05
Question: I am using Spring Integration Java DSL (release 1.1.3). I have my org.springframework.integration.dsl.IntegrationFlow defined as follows:

    return IntegrationFlows.from(messageProducerSpec)
            .handle(handler)
            .handle(aggregator)
            .handle(endpoint)
            .get();
    }

messageProducerSpec is an instance of org.springframework.integration.dsl.amqp.AmqpBaseInboundChannelAdapterSpec. I would like my integration flow to consume messages from TWO separate messageProducerSpecs (two separate …
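
A minimal sketch of one common arrangement (inside a @Configuration class; the spec bean methods and the shared channel name are assumptions, while handler, aggregator and endpoint are the beans from the question): give each MessageProducerSpec its own short flow that bridges into a shared channel, and hang the common processing off that channel.

    @Bean
    public IntegrationFlow firstAdapterFlow() {
        return IntegrationFlows.from(firstMessageProducerSpec())   // hypothetical spec bean
                .channel("mergedInput")
                .get();
    }

    @Bean
    public IntegrationFlow secondAdapterFlow() {
        return IntegrationFlows.from(secondMessageProducerSpec())  // hypothetical spec bean
                .channel("mergedInput")
                .get();
    }

    @Bean
    public IntegrationFlow commonFlow() {
        return IntegrationFlows.from("mergedInput")
                .handle(handler)
                .handle(aggregator)
                .handle(endpoint)
                .get();
    }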

Spring Batch - Reading multiple line log message

Submitted by 情到浓时终转凉″ on 2019-12-19 11:24:50
Question: I am facing a problem reading a multi-line log message as a single message in our Spring Batch application configured with Spring Integration. The application has to read a multi-line log message (for example an exception stack trace) as a single message, and later it has to process and classify the message for further indexing. Each line is identified by its timestamp (the pattern mentioned above, i.e. DATE_PATTERN) and a record may continue over multiple lines. I am trying to continue reading a message until I see …
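
A minimal sketch of one way to do this in Spring Batch (the timestamp regex and the delegate reader are assumptions, not the poster's DATE_PATTERN): wrap a line-oriented reader in a SingleItemPeekableItemReader and keep appending lines until the next line starts a new record.

    import java.util.regex.Pattern;
    import org.springframework.batch.item.ItemReader;
    import org.springframework.batch.item.support.SingleItemPeekableItemReader;

    public class MultiLineLogItemReader implements ItemReader<String> {

        // assumed timestamp format at the start of each new record, e.g. "2019-12-19 11:24:50 ..."
        private static final Pattern DATE_PATTERN =
                Pattern.compile("^\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}");

        private final SingleItemPeekableItemReader<String> delegate;

        public MultiLineLogItemReader(SingleItemPeekableItemReader<String> delegate) {
            this.delegate = delegate;   // wraps e.g. a FlatFileItemReader<String>
        }

        @Override
        public String read() throws Exception {
            String line = delegate.read();
            if (line == null) {
                return null;            // end of input
            }
            StringBuilder record = new StringBuilder(line);
            // keep appending continuation lines (e.g. a stack trace) until the next
            // line begins with a timestamp, which starts a new log record
            String next = delegate.peek();
            while (next != null && !DATE_PATTERN.matcher(next).find()) {
                record.append(System.lineSeparator()).append(delegate.read());
                next = delegate.peek();
            }
            return record.toString();
        }
    }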

How polling works for FTP inbound channel adapter considering max-messages-per-poll and cron

Submitted by 六月ゝ 毕业季﹏ on 2019-12-19 05:08:43
Question: I have a use case where I need to pick up files from an FTP location and place them in a server location. I am using ftp-inbound-channel-adapter (Spring Integration 2.0.4) to achieve this. Below is the configuration in my XML:

    <bean id="ftpAASessionFactory"
          class="org.springframework.integration.ftp.session.DefaultFtpSessionFactory">
        <property name="host" value="${ftp.session.host}" />
        <property name="port" value="${ftp.session.port}" />
        <property name="username" value="${ftp.session.username}" /> …
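
For reference, a minimal sketch of the same adapter in the newer Java DSL (an illustration under assumptions; the question itself uses the 2.0.4 XML configuration, and the directories and cron expression here are made up): each cron trigger starts one poll, and each poll emits at most maxMessagesPerPoll files as individual messages.

    import java.io.File;
    import org.springframework.context.annotation.Bean;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.integration.dsl.Pollers;
    import org.springframework.integration.ftp.dsl.Ftp;
    import org.springframework.integration.ftp.session.DefaultFtpSessionFactory;

    @Bean
    public IntegrationFlow ftpInboundFlow(DefaultFtpSessionFactory ftpSessionFactory) {
        return IntegrationFlows
                .from(Ftp.inboundAdapter(ftpSessionFactory)
                                .remoteDirectory("/remote/in")            // hypothetical
                                .localDirectory(new File("/tmp/local")),  // hypothetical
                        e -> e.poller(Pollers.cron("0 0/5 * * * ?")       // one poll every 5 minutes
                                .maxMessagesPerPoll(10)))                 // up to 10 files per poll
                .handle(m -> System.out.println("Downloaded: " + m.getPayload()))
                .get();
    }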

Spring Integration - Externalizing JDBC Queries

Submitted by 孤者浪人 on 2019-12-18 09:12:39
Question: Is there a simple way to externalize big SQL queries from JDBC outbound gateways instead of inlining them? The reason is that we have too many big queries, and we'd like to keep them in their own files, or at least externalize them in beans. Some caveats: I don't have control over the database, so I can't create anything there (e.g. stored procedures); and I don't want to create classes just for this, I just want to organize/refactor it a bit and not make it more complex.
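
A minimal sketch of one low-ceremony option (the file and bean names are assumptions): keep each query in its own file on the classpath and expose it as a String bean, which a gateway definition can then reference, for example with a SpEL expression such as query="#{findOrdersSql}" in the XML.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.core.io.Resource;
    import org.springframework.util.StreamUtils;

    @Configuration
    public class SqlQueryConfig {

        // load src/main/resources/sql/find-orders.sql (hypothetical file) into a String bean
        @Bean
        public String findOrdersSql(@Value("classpath:sql/find-orders.sql") Resource resource)
                throws IOException {
            return StreamUtils.copyToString(resource.getInputStream(), StandardCharsets.UTF_8);
        }
    }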