aggregator

Spring Integration Java DSL — Configuration of aggregator

Submitted by 喜欢而已 on 2019-12-12 11:09:09
Question: I have a very simple integration flow, where a RESTful request is forwarded to two providers using a publish-subscribe channel. The results from both RESTful services are then aggregated into a single array. A sketch of the integration flow is shown below: @Bean IntegrationFlow flow() throws Exception { return IntegrationFlows.from("inputChannel") .publishSubscribeChannel(s -> s.applySequence(true) .subscribe(f -> f .handle(Http.outboundGateway("http://provider1.com/...") .httpMethod …
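
The fan-out/fan-in behavior this flow is after — publish the request to both providers, then gather the two replies into one list — can be sketched in plain Java, with simple suppliers standing in for the two Http.outboundGateway calls (the supplier names and reply strings below are made up for illustration; this is not Spring Integration's API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.function.Supplier;

public class ScatterGather {
    // Fan the request out to every provider concurrently, then gather the
    // replies into one list, preserving provider order (what applySequence(true)
    // plus an aggregator achieves in the real flow).
    @SafeVarargs
    public static List<String> scatterGather(Supplier<String>... providers) {
        List<CompletableFuture<String>> calls = Arrays.stream(providers)
                .map(CompletableFuture::supplyAsync)
                .toList();
        return calls.stream().map(CompletableFuture::join).toList();
    }

    public static void main(String[] args) {
        // The two suppliers stand in for the calls to provider1 and provider2.
        System.out.println(scatterGather(() -> "reply-from-provider1",
                                         () -> "reply-from-provider2"));
    }
}
```

In the DSL itself, the missing piece after the pub-sub fan-out is an aggregation step that correlates the two replies by the sequence headers applySequence(true) adds.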

Reading RSS Feeds: What Aggregators Do That I'm Not

Submitted by 痞子三分冷 on 2019-12-12 01:29:57
Question: I drop the following feed into Google Reader, and it updates normally: http://www.indeed.ca/rss?q=&l=Hamilton%2C+ON However, when I use any of a number of approaches suggested thither and yon on the 'net that simply involve reading from this source and parsing the XML, I receive the same 20 items. What is Google Reader doing that I should be doing in my code so that I receive new items? Thanks for your advice. Incidentally, I'm coding in Python. Answer 1: RSS aggregators "poll" the sources, i.e., they …
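
In essence, an aggregator fetches the feed on a schedule and remembers the GUIDs of items it has already seen, so each poll surfaces only new entries; the feed itself only ever exposes its latest 20 items. The bookkeeping is the same in any language — a minimal sketch (shown in Java here; the class name is made up):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class FeedPoller {
    private final Set<String> seen = new HashSet<>();

    // Given the item GUIDs fetched on this poll, return only the new ones;
    // everything else was already delivered by an earlier poll.
    public List<String> newItems(List<String> fetchedGuids) {
        List<String> fresh = new ArrayList<>();
        for (String guid : fetchedGuids) {
            if (seen.add(guid)) {   // Set.add() is false if we had it already
                fresh.add(guid);
            }
        }
        return fresh;
    }
}
```

A real aggregator would persist the seen set between runs and poll on a timer; polite clients also honor HTTP caching headers (ETag / Last-Modified) so unchanged feeds are not re-downloaded.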

Apache Camel: GroupedExchangeAggregationStrategy groups DefaultExchange instead of message body

Submitted by 守給你的承諾、 on 2019-12-11 14:25:42
Question: Continuing from the other thread, Apache Camel: File to BeanIO and merge beanIO objects based on id, I am trying to group the EmployeeDetails using GroupedExchangeAggregationStrategy as below: from("seda:aggregate").aggregate(simple("${body.id}"), new MergeAggregationStrategy()).completionSize(3).log("Details - ${header.details}").to("seda:formList"); from("seda:formList").aggregate(new GroupedExchangeAggregationStrategy()).constant(true).completionTimeout(10) .process …
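
As the title notes, GroupedExchangeAggregationStrategy collects whole Exchange objects by design; to end up with a list of message bodies you would typically supply a small custom strategy instead. Stripped of the Camel Exchange types, the merge such a body-collecting strategy performs is just the following fold (a plain-Java sketch, not Camel's API):

```java
import java.util.ArrayList;
import java.util.List;

public class BodyAggregation {
    // The merge step a body-collecting aggregation strategy performs:
    // fold each new body into a running list, instead of keeping exchanges.
    // oldBody is null for the first message of a group.
    @SuppressWarnings("unchecked")
    public static Object aggregate(Object oldBody, Object newBody) {
        if (oldBody == null) {              // first message in the group
            List<Object> list = new ArrayList<>();
            list.add(newBody);
            return list;
        }
        ((List<Object>) oldBody).add(newBody);
        return oldBody;
    }
}
```

In a real Camel AggregationStrategy this logic runs inside aggregate(Exchange oldExchange, Exchange newExchange), reading and writing the in-message bodies.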

WSO2 ESB filtering messages to an output file, Part 2

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-11 11:13:53
Question: OK, this is a continuation of my original question (WSO2 ESB filtering messages to an output file). After a couple more days of research into the iterate and aggregate mediators, I have hit a wall and would be very grateful for any advice on how to resolve my blocking issue. The task at hand is simple: read an XML file, filter on certain records, and produce an output XML file containing just those records that meet the criteria. After previous discussion and research, this task …

Group Timeout does not work as expected in Spring Aggregator

Submitted by 丶灬走出姿态 on 2019-12-11 09:58:11
Question: Sample aggregator: <int:aggregator input-channel="msgInput" output-channel="msgOutput" expire-groups-upon-completion="true" group-timeout="1000" expire-groups-upon-timeout="true" send-partial-result-on-expiry="false" ref="msgGroup" /> With a sequence size of 2, when I manually inspect the timestamps of the messages grouped with msgGroup, there are still some (not all) messages that are more than 1000 ms apart. Is there anything I have missed? Please note that the correlation ID and …
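
For context, group-timeout semantics can be modeled roughly as below: the timer is re-armed each time a message arrives, and expiry only takes effect when a scheduler task actually runs. If that task is delayed, a late second message can still complete the group, which would match seeing occasional gaps above 1000 ms. This is a plain-Java model of those semantics and an assumption about the cause, not Spring Integration's code:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal model of a timed message group: release when the expected sequence
// size arrives, discard when the timeout elapses first (the behavior of
// send-partial-result-on-expiry="false").
public class TimedGroup {
    private final int sequenceSize;
    private final long timeoutMs;
    private final List<String> buffer = new ArrayList<>();
    private long deadline = Long.MAX_VALUE;

    public TimedGroup(int sequenceSize, long timeoutMs) {
        this.sequenceSize = sequenceSize;
        this.timeoutMs = timeoutMs;
    }

    // Returns the completed group, or null while it is still accumulating.
    public List<String> add(String msg, long nowMs) {
        buffer.add(msg);
        deadline = nowMs + timeoutMs;       // the timer is re-armed per arrival
        if (buffer.size() >= sequenceSize) {
            List<String> out = List.copyOf(buffer);
            buffer.clear();
            deadline = Long.MAX_VALUE;
            return out;
        }
        return null;
    }

    // Scheduler tick: discard a timed-out partial group. If no tick runs
    // between two arrivals, a late message can still complete the group.
    public boolean expireIfDue(long nowMs) {
        if (nowMs >= deadline && !buffer.isEmpty()) {
            buffer.clear();
            deadline = Long.MAX_VALUE;
            return true;
        }
        return false;
    }
}
```

Note that in the model, a message at t=0 and one at t=1500 are still grouped together unless an expiry tick actually ran in between — the timeout alone does not prevent it.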

Apache Camel : File to BeanIO and merge beanIO objects based on id

Submitted by 隐身守侯 on 2019-12-11 09:47:44
Question: I have a use case to read employee, address and contact files in parallel, convert them to BeanIO objects, and merge the BeanIO objects to produce the complete employeeDetails object.
Emp file:
1 Foo Engineer
2 Bar AssistantEngineer
Emp contact file:
1 8912345678 foo@org.com
2 7812345678 bar@org.com
Emp address file:
1 city1 1234
2 city2 2345
Expected output in an EmployeeDetailsBeanIODataFormat object in the Exchange:
1 Foo Engineer foo@org.com city1 1234
2 Bar AssistantEngineer bar@org.com …
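
The merge itself, correlated on the leading id column, can be sketched in plain Java (the class and method names are made up; in the real flow this logic would live inside a Camel AggregationStrategy keyed on ${body.id}):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class EmployeeMerge {
    // Merge partial records from the employee, contact and address files,
    // correlated by the leading id column of each line ("<id> <rest>").
    public static Map<String, String> merge(List<String> lines) {
        Map<String, String> details = new LinkedHashMap<>();
        for (String line : lines) {
            String[] parts = line.split(" ", 2);
            // Append this file's fields to whatever we already have for the id.
            details.merge(parts[0], parts[1], (a, b) -> a + " " + b);
        }
        return details;
    }

    public static void main(String[] args) {
        List<String> all = List.of(
                "1 Foo Engineer",    // from the employee file
                "1 foo@org.com",     // from the contact file
                "1 city1 1234");     // from the address file
        System.out.println(merge(all)); // {1=Foo Engineer foo@org.com city1 1234}
    }
}
```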

Aggregate list based on size

Submitted by 守給你的承諾、 on 2019-12-11 04:00:00
Question: I have a list of, say, size 10, and I want to aggregate with a maximum group size of 6. In this case it should work like this: the first six messages are aggregated into one message, and then right away (without any timeout) the next 4 messages are aggregated into a second message. How can I achieve this in Spring Integration? I tried using a releaseStrategy, but I can only define the max size in it, and then the messages that are left (4 messages in my case) wait in the aggregator for more messages (so the …
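
The desired result — release every full group of 6 immediately, and the trailing partial group as well — amounts to chunking the list by index. A plain-Java sketch of the target behavior (in Spring Integration itself this typically takes a size-based release strategy plus some way to flush the remainder, such as group expiry):

```java
import java.util.ArrayList;
import java.util.List;

public class Chunker {
    // Split a list into groups of at most maxSize, in order: full groups are
    // "released" immediately, and the final partial group is released too.
    public static <T> List<List<T>> chunk(List<T> items, int maxSize) {
        List<List<T>> groups = new ArrayList<>();
        for (int i = 0; i < items.size(); i += maxSize) {
            groups.add(items.subList(i, Math.min(i + maxSize, items.size())));
        }
        return groups;
    }
}
```

With 10 items and maxSize 6 this yields two groups of sizes 6 and 4, with no waiting on a timeout.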

Spring integration deadlock using Aggregator + MessageStoreReaper + Redis?

Submitted by 旧城冷巷雨未停 on 2019-12-11 02:19:56
Question: This question is related to this post in the SI forum, but as the forum is closed, I post it here to continue the thread: http://forum.spring.io/forum/spring-projects/integration/748192-messages-not-flowing-when-using-jms-channels To sum up, I have an aggregator with a Redis message store and a reaper scheduled every 60 seconds. Messages are sent to the aggregator using a JMS channel. Here's the config: <bean id="jedisPoolConfigBean" class="redis.clients.jedis.JedisPoolConfig"> <property name= …

Deadlock using Aggregator + Redis

Submitted by 北战南征 on 2019-12-02 14:05:18
Question: This post is related to Spring integration deadlock using Aggregator + MessageStoreReaper + Redis?, but that message is too long to post, so I am continuing from the original post. I upgraded to the latest Java 7 build, 1.7.0_60-b19, but the problem is still there. I made another thread dump and found the same issue: all DefaultMessageListenerContainers (count 20) are locked by the taskScheduler (entityScheduler-3) in the AbstractCorrelatingMessageHandler lock invocation. Here's the scheduler …
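
The thread dump describes a classic contention pattern: the scheduler thread holds a group lock while every listener container queues up behind it. One general mitigation, shown here as a plain-Java sketch and not as Spring Integration's actual fix, is a reaper that takes the group lock with tryLock and simply skips a busy group until the next sweep instead of blocking:

```java
import java.util.concurrent.locks.ReentrantLock;

public class ReaperSketch {
    // Expire a group only if its lock is free; a held lock means a consumer
    // is working on the group, so skip it rather than block the scheduler
    // thread (and everything queued behind it).
    public static boolean expireGroup(ReentrantLock groupLock, Runnable expireAction) {
        if (!groupLock.tryLock()) {
            return false;               // busy: retry on the next sweep
        }
        try {
            expireAction.run();
            return true;
        } finally {
            groupLock.unlock();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        ReentrantLock lock = new ReentrantLock();
        // A "consumer" thread holds the group lock for a while.
        Thread consumer = new Thread(() -> {
            lock.lock();
            try { Thread.sleep(200); } catch (InterruptedException ignored) {}
            finally { lock.unlock(); }
        });
        consumer.start();
        Thread.sleep(50);   // let the consumer grab the lock first
        System.out.println("expired while locked: " + expireGroup(lock, () -> {}));
        consumer.join();
        System.out.println("expired after unlock: " + expireGroup(lock, () -> {}));
    }
}
```

Whether this applies to the reported deadlock depends on where the real lock is taken; the sketch only illustrates why a blocking reaper on a shared scheduler can stall all message-driven threads at once.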
