zeromq

STORM ERROR java.lang.UnsatisfiedLinkError?

Submitted by 拟墨画扇 on 2019-12-04 20:18:12
It compiles without problems, but after running:

26183 [Thread-34] ERROR backtype.storm.util - Async loop died!
java.lang.UnsatisfiedLinkError: org.zeromq.ZMQ$Socket.finalize()V
    at org.zeromq.ZMQ$Socket.finalize(Native Method)
    at org.zeromq.ZMQ$Socket.close(ZMQ.java:339)
    at storm.starter.spout.RandomSentenceSpout.nextTuple(RandomSentenceSpout.java:56)
    at backtype.storm.daemon.executor$fn__3985$fn__3997$fn__4026.invoke(executor.clj:502)
    at backtype.storm.util$async_loop$fn__465.invoke(util.clj:377)
    at clojure.lang.AFn.run(AFn.java:24)
    at java.lang.Thread.run(Thread.java:724)
26185 [Thread-34] ERROR

Python 3.6 ZeroMQ (PyZMQ) asyncio pub sub Hello World

Submitted by 十年热恋 on 2019-12-04 19:22:25
I've just started with ZeroMQ and I'm trying to get a Hello World working with PyZMQ and asyncio in Python 3.6. I'm trying to decouple the module's functionality from the pub/sub code, hence the following class setup. Edit 1: minimized example. Edit 2: included the solution; see the answer below for how.

import asyncio
import zmq.asyncio
from zmq.asyncio import Context

# manages message flow between publishers and subscribers
class HelloWorldMessage:
    def __init__(self, url='127.0.0.1', port='5555'):
        self.url = "tcp://{}:{}".format(url, port)
        self.ctx = Context.instance()

        # activate publishers /
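For reference, a minimal runnable sketch of an asyncio PUB/SUB hello world on Python 3.6 (illustrative only: the endpoint, topic prefix, and message count are placeholders, and it assumes a reasonably recent pyzmq where zmq.asyncio sockets return awaitables):

import asyncio
import zmq
from zmq.asyncio import Context

URL = "tcp://127.0.0.1:5555"            # placeholder endpoint
ctx = Context.instance()

async def publisher():
    pub = ctx.socket(zmq.PUB)
    pub.bind(URL)
    await asyncio.sleep(0.5)            # give the subscriber time to connect
    for i in range(5):
        await pub.send(b"greeting Hello World %d" % i)
        await asyncio.sleep(0.1)

async def subscriber():
    sub = ctx.socket(zmq.SUB)
    sub.connect(URL)
    sub.setsockopt(zmq.SUBSCRIBE, b"greeting")   # prefix filter
    for _ in range(5):
        msg = await sub.recv()
        print("received:", msg)

async def main():
    await asyncio.gather(publisher(), subscriber())

if __name__ == "__main__":
    # Python 3.6 has no asyncio.run(), so drive the loop directly
    asyncio.get_event_loop().run_until_complete(main())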

How can I use send_json with pyzmq PUB SUB

Submitted by 风格不统一 on 2019-12-04 18:34:39
I need to send a dictionary as the message from a publisher to subscribers. With the REQ/REP pattern send_json and recv_json work nicely, but I can't seem to find an incantation that works for PUB/SUB. I hope it's not the case that PUB/SUB can only work with send() and recv(). Here's the listing for the experiment I put together:

"""
Experiments with 0MQ PUB/SUB pattern
"""
import os
import sys
import time
import zmq
from multiprocessing import Process
from random import sample, choice
import signal

def handler(signum, frame):
    """ Handler for SIGTERM """
    # kill the processes we've launched
    try:
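As a point of comparison, a minimal sketch showing send_json/recv_json over PUB/SUB (endpoint and payloads are placeholders). The only catch is that the SUB socket's prefix filter is applied to the serialized JSON bytes, so subscribing to everything with "" is the simplest route, and per-topic filtering is easier with a separate topic frame:

import json
import time
import zmq

url = "tcp://127.0.0.1:5556"              # placeholder endpoint
ctx = zmq.Context.instance()

pub = ctx.socket(zmq.PUB)
pub.bind(url)

sub = ctx.socket(zmq.SUB)
sub.connect(url)
sub.setsockopt_string(zmq.SUBSCRIBE, "")  # no filter: receive every message
time.sleep(0.5)                           # let the subscription propagate

pub.send_json({"sensor": "temp", "value": 21.5})
print(sub.recv_json())                    # {'sensor': 'temp', 'value': 21.5}

# for topic-based filtering, send the topic as its own frame next to the JSON:
pub.send_multipart([b"temp", json.dumps({"value": 22.0}).encode("utf-8")])
topic, payload = sub.recv_multipart()
print(topic, json.loads(payload))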

How to identify the physical address of incoming connections in NetMQ?

Submitted by 混江龙づ霸主 on 2019-12-04 18:18:48
In the router-dealer example for NetMQ, we see that clients can set their own identity. This can be useful for logging purposes. However, what if I only control code for the server (router) and not the code for the clients (dealers)? What if some clients don't bother to set the identity in a way that is meaningful for my server? How do I include the physical address in my logs, when the client doesn't specifically give it to me in the message or identity? If you only know the answer based on some other implementation of ZeroMQ, I will be interested to hear it, but ultimately I want something
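The question is about NetMQ, but since answers based on other implementations are welcomed: with pyzmq on top of a recent libzmq, the remote TCP address can sometimes be read from the message metadata of a non-copying receive. This is only a sketch, assuming the installed libzmq build exposes the "Peer-Address" metadata property; the port and names are placeholders:

import zmq

ctx = zmq.Context.instance()
router = ctx.socket(zmq.ROUTER)
router.bind("tcp://*:5570")                       # placeholder port

while True:
    frames = router.recv_multipart(copy=False)    # Frame objects keep metadata
    identity = frames[0].bytes
    payload = frames[-1].bytes
    try:
        peer = frames[0].get("Peer-Address")      # remote IP for TCP peers, if supported
    except zmq.ZMQError:
        peer = "unknown"                          # property not available in this libzmq
    print("id=%r peer=%s payload=%r" % (identity, peer, payload))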

How to have limited ZMQ (ZeroMQ - PyZMQ) queue buffer size in python?

Submitted by 拟墨画扇 on 2019-12-04 17:01:16
I use the pyzmq library with PUB/SUB in Python. I have a fast ZMQ publisher script (using the .connect() method) and a slower ZMQ subscriber script (using the .bind() method). After a few minutes my subscriber starts receiving old published data from the publishers (due to the ZMQ buffer). My question: is there an approach to manage the ZMQ queue buffer size (set a limited buffer)? Note: I don't want to use ZMQ PUSH/PULL. Note: I've read this post, but that approach only clears the buffer: clear ZMQ buffer. Note: I tried the high water mark options too, but it didn't work: socket.setsockopt(zmq.RCVHWM, 10) # not working socket
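One detail worth checking, sketched below with placeholder ports and values: HWM options only take effect when set before bind()/connect(), and with PUB/SUB messages queue on both the publisher's SNDHWM side and the subscriber's RCVHWM side (with OS TCP buffers in between), so both ends usually need limiting:

import zmq

ctx = zmq.Context.instance()

pub = ctx.socket(zmq.PUB)
pub.setsockopt(zmq.SNDHWM, 10)        # publisher-side queue limit (messages), set before connect
pub.connect("tcp://127.0.0.1:5557")

sub = ctx.socket(zmq.SUB)
sub.setsockopt(zmq.RCVHWM, 10)        # subscriber-side queue limit (messages), set before bind
# sub.setsockopt(zmq.CONFLATE, 1)     # alternative: keep only the newest message (single-part only)
sub.setsockopt_string(zmq.SUBSCRIBE, "")
sub.bind("tcp://127.0.0.1:5557")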

Can't get ZeroMQ python bindings to receive messages over IPC

Submitted by 荒凉一梦 on 2019-12-04 16:36:49
Question: I'm trying to achieve PUB/SUB over IPC. If I change the code below so that the subscriber binds to "tcp://*:5000" and the publisher connects to "tcp://localhost:5000" it works, but I can't get it to work over IPC. What am I doing wrong?

subscriber.py

import zmq, json

def main():
    context = zmq.Context()
    subscriber = context.socket(zmq.SUB)
    subscriber.bind("ipc://test")
    subscriber.setsockopt(zmq.SUBSCRIBE, '')
    while True:
        print subscriber.recv()

if __name__ == "__main__":
    main()

publisher.py
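The publisher listing is cut off above, so the following is only a guess at one common cause: with the ipc:// transport the address is a filesystem path, so "ipc://test" refers to a file named test relative to each process's working directory, and the two processes can silently end up on different paths. A sketch using one absolute path on both sides (the path and sleep values are placeholders; written in Python 3 syntax rather than the question's Python 2):

# subscriber.py sketch
import zmq

context = zmq.Context()
subscriber = context.socket(zmq.SUB)
subscriber.bind("ipc:///tmp/pubsub-test.ipc")   # absolute path, identical on both sides
subscriber.setsockopt(zmq.SUBSCRIBE, b"")       # subscribe to everything
while True:
    print(subscriber.recv())

# publisher.py sketch
import time
import zmq

context = zmq.Context()
publisher = context.socket(zmq.PUB)
publisher.connect("ipc:///tmp/pubsub-test.ipc")
time.sleep(0.5)                                 # give the subscriber time to connect
while True:
    publisher.send_string("hello over ipc")
    time.sleep(1)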

How to synchronize the publishers and subscribers in extended PUB-SUB pattern with Intermediary in ZeroMQ in c++?

Submitted by ℡╲_俬逩灬. on 2019-12-04 16:26:41
Extended PUB/SUB topology: I have multiple publishers and multiple subscribers in a use case with one intermediary. In the ZeroMQ guide, I learnt about synchronizing 1 publisher and 1 subscriber using additional REQ/REP sockets. I tried to write synchronization code for my use case, but it gets messy when I follow the logic given for the 1-1 PUB/SUB case. The publisher code when we have only 1 publisher is:

// Socket to receive sync request
zmq::socket_t syncservice (context, ZMQ_REP);
syncservice.bind("tcp://*:5562");

// Get synchronization from subscribers
int subscribers = 0
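For orientation, here is the guide's single-publisher handshake sketched in Python (the question's code is C++, but the pattern is the same): the publisher blocks on a REP socket until the expected number of subscribers have checked in, then starts publishing. Ports, counts, and message contents are placeholders:

import zmq

EXPECTED_SUBSCRIBERS = 3                  # assumption: subscriber count known up front

ctx = zmq.Context.instance()

publisher = ctx.socket(zmq.PUB)
publisher.bind("tcp://*:5561")

syncservice = ctx.socket(zmq.REP)         # socket to receive sync requests
syncservice.bind("tcp://*:5562")

subscribers = 0
while subscribers < EXPECTED_SUBSCRIBERS:
    syncservice.recv()                    # wait for a subscriber's "ready" request
    syncservice.send(b"")                 # acknowledge it
    subscribers += 1

for i in range(10):
    publisher.send_string("update %d" % i)
publisher.send_string("END")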

zeromq php extension for windows

Submitted by 社会主义新天地 on 2019-12-04 15:59:34
I am using Zend Server configured with IIS 7.5. I searched for a zeromq PHP extension and found these: http://valokuva.org/builds/ and http://snapshot.zero.mq/. I tried to add the extension in php.ini, and when I enable it from the Zend admin it shows the error "The system could not load this extension", and the logs show "PHP Startup: Unable to load dynamic library 'C:\Program Files\Zend\ZendServer\lib\phpext\php_zmq.dll' - The specified module could not be found. in Unknown on line 0". I have tried to build the extension (.dll) from source but that didn't work either. On http://www.zeromq.org/bindings:php its

ImportError: cannot import name constants

Submitted by 非 Y 不嫁゛ on 2019-12-04 14:36:37
I'm trying to run a simple piece of code using pyzmq. I am using Python 2.7 and pyzmq 14.5.

$ python --version
Python 2.7.6
$ sudo find /usr -name "*pyzmq*"
/usr/local/lib/python2.7/dist-packages/pyzmq-14.5.0.egg-info
/usr/lib/python2.7/dist-packages/pyzmq-14.0.1.egg-info

Following is the code I'm trying to run:

import zhelpers

context = zmq.Context.instance()
server = context.socket(zmq.ROUTER)
server.bind("tcp://*:5678")
while (1):
    address, empty, data = server.recv_multipart()
    print("address = %s, data = %d" % (address, int(data)))
    data_i = int(data) + 10
    server.send_multipart([ address, b''
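Worth noting: the find output above shows two pyzmq installs (14.5.0 under /usr/local and 14.0.1 under /usr), and a stale copy shadowing a newer one is a common source of "ImportError: cannot import name constants". A small diagnostic sketch to see which copy Python actually loads:

import zmq
print(zmq.__file__)        # path of the zmq package actually being imported
print(zmq.__version__)     # pyzmq version of that copy
print(zmq.zmq_version())   # underlying libzmq version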

Python Multi-Processing Question?

Submitted by 限于喜欢 on 2019-12-04 13:53:41
Question: I have a folder with 500 input files (the total size of all files is ~500 MB). I'd like to write a Python script that does the following: (1) load all of the input files into memory; (2) initialize an empty Python list that will later be used ... see bullet (4); (3) start 15 different (independent) processes: each of these uses the same input data [from (1)] yet uses a different algorithm to process it, thus generating different results; (4) I'd like all the independent processes [from step
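The question is cut off at step (4), so the collection step below is a guess; still, a minimal sketch of steps (1)-(4) with multiprocessing.Pool, where the folder name, worker count, and the per-worker "algorithm" are placeholders:

import os
from multiprocessing import Pool

INPUT_DIR = "inputs"                              # placeholder folder name

def load_inputs(folder):
    # (1) read every file in the folder into memory
    data = {}
    for name in os.listdir(folder):
        with open(os.path.join(folder, name), "rb") as f:
            data[name] = f.read()
    return data

def algorithm(task):
    # placeholder "algorithm": each worker would do something different with the data
    algo_id, data = task
    return algo_id, sum(len(v) for v in data.values())

if __name__ == "__main__":
    inputs = load_inputs(INPUT_DIR)               # (1) load everything once
    results = []                                  # (2) list that will hold the results
    with Pool(processes=15) as pool:              # (3) 15 independent worker processes
        tasks = [(i, inputs) for i in range(15)]
        for item in pool.imap_unordered(algorithm, tasks):
            results.append(item)                  # (4) collect each worker's result
    print(results)

# note: each task re-pickles `inputs` to its worker; on Linux, keeping the data in a
# module-level global inherited via fork avoids copying ~500 MB fifteen times.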