message-queue

Can I call TranslateMessage inside the message callback?

我的未来我决定 submitted on 2019-12-12 03:57:33
Question: I don't have the canonical message loop running, so is there a way to call TranslateMessage (or its equivalent) inside my message-proc handler? Basically I need WM_CHAR messages, and unless I can call TranslateMessage I'm not going to get them. Currently I have the message proc set up, but no message loop. // Static window function called by Windows for message processing. Implementation // delegates message processing to MemberWndProc. LRESULT CALLBACK FxPlayerTiny::WindowsMsgStatic(HWND

Multiple node.js message receivers for a redis queue

允我心安 submitted on 2019-12-12 02:28:03
Question: I am trying to implement queue processors in a node.js application using rsmq-worker (https://github.com/mpneuried/rsmq-worker). I have to run multiple workers to process messages in parallel. How do we set up multiple workers? I am exploring https://github.com/Automattic/kue#parallel-processing-with-cluster. Answer 1: Simply create multiple instances of RSMQWorker pointing to the same queue. Example: var workers = []; for (let i = 0; i < NUMBER_OF_WORKERS; i++) { workers[i] = createWorker(i); }
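The same competing-consumers pattern can be sketched with nothing but Python's standard library; here `threading` and `queue` stand in for the RSMQWorker instances and Redis, and `NUMBER_OF_WORKERS` and the message names are illustrative. The point is that each message on the shared queue is delivered to exactly one worker, which is what gives parallel processing without duplicate handling:

```python
import queue
import threading

NUMBER_OF_WORKERS = 4
jobs = queue.Queue()            # stands in for the shared RSMQ queue

def worker(worker_id, results):
    # Each worker competes for messages on the same queue; a given
    # message is handed to exactly one worker.
    while True:
        msg = jobs.get()
        if msg is None:         # sentinel: shut this worker down
            break
        results.append((worker_id, msg))
        jobs.task_done()

results = []
threads = [threading.Thread(target=worker, args=(i, results))
           for i in range(NUMBER_OF_WORKERS)]
for t in threads:
    t.start()

for n in range(10):
    jobs.put(f"message-{n}")
jobs.join()                     # wait until every message is processed

for _ in threads:               # one sentinel per worker
    jobs.put(None)
for t in threads:
    t.join()
```

The same shape applies to rsmq-worker: each `RSMQWorker` instance polls the one queue, and RSMQ's visibility timeout plays the role that `Queue.get` plays here.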

JMS-Websocket - delayed message delivery

人走茶凉 submitted on 2019-12-12 01:13:00
Question: This application receives and forwards messages from database events to client applications. Messages are delivered immediately when the client browser has a web socket session. However, when no web socket session exists and a message is sent by the JMSProducer to the destination "jms/notificationQueue" in QueueSenderSessionBean, the message is immediately consumed in NotificationEndpoint. This is not my intent. My intent is for the queue to retain the message until the user connects to
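A minimal in-memory sketch of the retain-until-connect behaviour the asker wants. This is not JMS: the class and method names are invented for illustration, and a plain list stands in for the web socket session. The idea is simply not to deliver while no session exists, and to flush the backlog the moment one attaches:

```python
from collections import deque

class NotificationQueue:
    """Buffer messages while no client session is attached;
    flush the backlog as soon as one connects."""

    def __init__(self):
        self.backlog = deque()
        self.session = None          # e.g. the open web socket session

    def publish(self, msg):
        if self.session is None:
            self.backlog.append(msg)         # no consumer yet: retain
        else:
            self.session.append(msg)         # deliver immediately

    def attach(self, session):
        self.session = session
        while self.backlog:                  # replay retained messages
            self.session.append(self.backlog.popleft())

delivered = []
q = NotificationQueue()
q.publish("event-1")                         # no session yet: buffered
q.publish("event-2")
q.attach(delivered)                          # client connects: backlog flushed
q.publish("event-3")                         # delivered immediately
```

In JMS terms the equivalent is to stop NotificationEndpoint from consuming unconditionally: only create the consumer (or start the connection) once the web socket session exists, so the queue itself does the retaining.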

Kafka consumer losing state of messages after shutdown

送分小仙女□ submitted on 2019-12-11 23:21:41
Question: Thanks for taking the time to answer the question. I am using Kafka with a Python consumer. Everything works great while the consumer is up and running: messages pushed to Kafka are read by the consumer. However, if the consumer goes down for whatever reason, when it comes back up it only reads the NEW messages posted to Kafka after it came back up. The messages sent between shutdown and power-on are lost; that is, the consumer does not read these messages after it comes
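With kafka-python the usual remedy is to give the consumer a `group_id`, let it commit offsets, and set `auto_offset_reset='earliest'` (both are real KafkaConsumer parameters) so that a restarted consumer resumes from its last committed offset instead of jumping to the end of the log. The broker-free sketch below (all names invented) shows why a committed offset makes the in-between messages replayable after a restart:

```python
class Log:
    """Stand-in for a Kafka partition plus its committed offsets."""
    def __init__(self):
        self.records = []
        self.committed = {}          # group_id -> next offset to read

class Consumer:
    """A consumer in a group commits the offset of the last record it
    processed; a restarted consumer in the same group resumes from the
    committed offset, so nothing produced during the outage is skipped."""
    def __init__(self, log, group_id):
        self.log, self.group_id = log, group_id

    def poll(self):
        start = self.log.committed.get(self.group_id, 0)
        batch = self.log.records[start:]
        # analogue of committing offsets after processing
        self.log.committed[self.group_id] = len(self.log.records)
        return batch

log = Log()
log.records += ["m1", "m2"]
first = Consumer(log, "workers").poll()    # reads m1, m2; commits offset 2

log.records += ["m3", "m4"]                # produced while consumer is down
second = Consumer(log, "workers").poll()   # restart: resumes at offset 2
```

A consumer with no group (or with a fresh group and the default `auto_offset_reset='latest'`) has no committed offset to resume from, which is exactly the "only reads NEW messages" behaviour described.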

POSIX message queues are supported on which Linux kernel?

谁说胖子不能爱 submitted on 2019-12-11 23:18:51
Question: I could successfully implement a POSIX message queue on Ubuntu 10.04 (kernel 2.6.38). But the code fails when built and run on the same version of Ubuntu 10.04 (kernel 2.6.37) on an ARM processor (thin-client devices like the HP T410). The failure happens when using any of the message-queue functions (e.g. mq_open, unlink_message_queue()): OSError: [Errno 38] Function not implemented. Online information shows that POSIX MQ has been supported since Linux kernel 2.6.6. This is very confusing
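One way to confirm the diagnosis at run time is to probe for the syscall and inspect errno. The sketch below uses ctypes against librt and an invented queue name; it assumes a glibc or musl Linux userland, and it only distinguishes ENOSYS (errno 38, which a kernel built without CONFIG_POSIX_MQUEUE returns) from every other failure mode:

```python
import ctypes
import ctypes.util
import errno

def posix_mq_supported():
    """Return True if the running kernel implements POSIX message queues.

    mq_open() lives in librt on glibc (in libc on musl). On a kernel
    built without CONFIG_POSIX_MQUEUE it fails with ENOSYS -- the
    exact "[Errno 38] Function not implemented" error above. Any other
    failure (ENOENT, EACCES, ...) proves the API is present.
    """
    path = ctypes.util.find_library("rt")
    try:
        lib = (ctypes.CDLL(path, use_errno=True) if path
               else ctypes.CDLL(None, use_errno=True))
        mq_open = lib.mq_open
    except (OSError, AttributeError):
        return False                       # no mq_open symbol at all

    O_RDONLY = 0
    mqd = mq_open(b"/mq-probe-hypothetical", O_RDONLY)  # invented name
    if mqd == -1:
        return ctypes.get_errno() != errno.ENOSYS
    lib.mq_close(mqd)                      # queue actually existed; tidy up
    return True
```

On the failing ARM device this probe should return False, which would point at the vendor kernel having been built without POSIX message queue support rather than at the application code.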

RabbitMQ - strange synchronization behavior

不打扰是莪最后的温柔 submitted on 2019-12-11 19:43:06
Question: I have a simple RabbitMQ cluster with two physically identical Linux nodes (CentOS, RabbitMQ 3.1.5, Erlang R15B, 2 GB RAM, 1 CPU core). Mirroring and synchronization of nodes are turned on. I have two problems that bother me: in a normal situation everything is fine, but after restarting one of the nodes (by stop_app and start_app on the command line), the whole cluster becomes unavailable to producers and consumers: I can't produce or receive messages from a queue during synchronization. Is this

How to push messages from ActiveMQ to consumers

此生再无相见时 submitted on 2019-12-11 19:25:22
Question: I am new to ActiveMQ and Java. I have read tutorials and understand them somewhat; can anyone help me with the following task? Imagine we have 10 messages in a queue/topic in ActiveMQ. We are getting the messages from a database; that part is already done. I want to write two Java applications (using JMS to receive messages from ActiveMQ) that will act as consumers. What I want to achieve is that whenever ActiveMQ gets messages from the database, ActiveMQ should check if any consumer is free or

Passing non-PODs [Plain Old Data types] over IPC

元气小坏坏 submitted on 2019-12-11 19:07:42
Question: I am writing an implementation for doing IPC. The user makes a call; I take all the parameters and pass them on to the other process. I have written an automatic code generator for such functions, based on logic that works something like this: take all the parameters and put them inside a structure; add other information required for IPC; pass the size and a pointer to this struct to a POSIX message queue. The data at this address, up to the size specified, is read and sent to the other process. De-construct the
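The flatten-then-reconstruct steps above map directly onto a length-prefixed wire format, which is the standard answer to "non-PODs can't be memcpy'd": every pointer-backed member (string, vector, ...) is replaced by a length prefix followed by its bytes, so the receiver can rebuild the object from the buffer alone. A Python sketch of that idea (the field layout and function names are invented for illustration):

```python
import struct

def pack_call(func_id, name, values):
    """Flatten a call carrying non-POD arguments (a string and a list)
    into one contiguous byte buffer.

    Layout: [u32 func_id][u32 name_len][name bytes][u32 count][i32 * count]
    The length prefixes replace the internal pointers that make the
    original objects non-POD.
    """
    name_b = name.encode("utf-8")
    buf = struct.pack("<II", func_id, len(name_b)) + name_b
    buf += struct.pack("<I", len(values))
    buf += struct.pack(f"<{len(values)}i", *values)
    return buf

def unpack_call(buf):
    """Rebuild (func_id, name, values) on the receiving side."""
    func_id, name_len = struct.unpack_from("<II", buf, 0)
    off = 8
    name = buf[off:off + name_len].decode("utf-8")
    off += name_len
    (count,) = struct.unpack_from("<I", buf, off)
    off += 4
    values = list(struct.unpack_from(f"<{count}i", buf, off))
    return func_id, name, values
```

The resulting `bytes` object is exactly what can be handed to `mq_send` as a size-plus-pointer pair; in the C++ generator the same shape means emitting per-member serialize/deserialize code instead of copying the struct raw.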

Pushing Router Data to Distributed Messaging System

别说谁变了你拦得住时间么 submitted on 2019-12-11 19:01:22
Question: Query: making an interface of a router the producer for a Kafka cluster. Issue: my router's interface is trying to push data to the port on which Kafka is running (by default 9092). Q1: Can the Kafka broker accept this data without a topic being created? Q2: Can a Kafka consumer pull data without specifying a topic? If yes, how? If not, what is the workaround and how can I achieve it? 1st edit: I just checked that the Kafka broker configs have an "auto.create.topics.enable" field.
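On Q1: auto.create.topics.enable is a real broker property (enabled by default), so a produce request for a topic that does not yet exist can create it on the fly. A server.properties fragment, assuming an otherwise stock broker config:

```properties
# server.properties (broker side)
# When enabled (the default), producing to a non-existent topic creates
# it automatically, using num.partitions and default.replication.factor
# for the new topic's settings.
auto.create.topics.enable=true
```

On Q2: a consumer cannot poll without naming anything, but it does not need a fixed topic name; Kafka consumers can subscribe to a regex pattern (for example, kafka-python's KafkaConsumer.subscribe() takes a `pattern` argument), so the consumer picks up every matching topic, including ones auto-created later.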

Sidekiq-like queue using Java tools?

こ雲淡風輕ζ submitted on 2019-12-11 17:39:13
Question: I want a work queue that behaves almost exactly like Ruby's Sidekiq (it doesn't need to use Redis, but it can; I just can't use Ruby, not even JRuby). Basically I want to be able to create jobs that run with some parameters, and have a worker pool execute the jobs. The workers are going to use Hibernate to do some work, so I think Spring Integration could make things easier. Answer 1: Spring Integration has Redis Queue inbound and outbound channel adapters. The inbound message-driven