Question
I am using Python to get data from Redis and then parse it into Kafka. It works well in most situations.
But when I use Python to simulate pushing data into Redis, or when data is put into the queue very quickly, I can't get all of it.
Here is my Redis producer code, which simulates publishing 20000 messages to Redis:
import redis

rc = redis.Redis(host='127.0.0.1', port=6379)
rc.ping()
ps = rc.pubsub()          # note: the producer does not actually need this subscription
ps.subscribe('bdwaf')
r_str = "--8198b507-A--\n[22/Jun/2017:14:13:19 +0800]ucTcxcMcicAcAcAcicAcAcAm 192.168.1.189 50054 127.0.0.1 80\n"
for i in range(0, 20000):
    rc.publish('bdwaf', r_str)
and the Redis consumer (which is also the Kafka producer) is:
import redis

rc = redis.Redis(host='localhost', port=6379)
rc.ping()
ps = rc.pubsub()
ps.subscribe('bdwaf')
num = 0
for item in ps.listen():
    if item['type'] == 'message':
        num += 1
        a.parser(item['data'])  # 'a' is my parser / Kafka-producer object
        print(num)
It only prints out about 4000 messages.
If I comment out the a.parser(item['data']) call, it prints the full count.
Or if I write sleep(0.001) in the Redis producer, it prints the full count too (see the sketch below).
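The throttled producer is a small change to the loop above; a sketch, using the sleep value from my test:

import time

for i in range(0, 20000):
    rc.publish('bdwaf', r_str)
    time.sleep(0.001)  # give the subscriber time to drain the messages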
What is wrong with my code?
Answer 1:
I assume you are using redis-py.
The documentation describes listen as coming from an older version of the library, so maybe you should use another method for reading messages, for example registering a callback:
p = rc.pubsub()

num = 0
def my_handler(message):
    global num
    num += 1
    a.parser(message['data'])
    print(num)

# keyword arguments map a channel name to its handler;
# positional arguments would just be plain channel names
p.subscribe(**{'bdwaf': my_handler})
# read the subscribe confirmation message
p.get_message()
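A handler registered this way only fires when the pubsub object is polled, so something still has to drive the polling. A minimal sketch of such a loop (the loop is an addition for illustration, not part of the original answer; the 20000 total comes from the question):

import time

# keep polling until every published message has been handled
while num < 20000:
    p.get_message()      # invokes my_handler for each 'message' item
    time.sleep(0.001)

redis-py can also do this polling for you in a background thread with thread = p.run_in_thread(sleep_time=0.001), stopped later with thread.stop().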
Edit:
It is possible that your Redis server is running out of memory when you publish 20000 messages at once. Try increasing the Redis memory limit in the redis.conf file:
maxmemory 500mb  # or greater if needed
It is a memory problem; check out this question for more information on how to handle it.
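If editing redis.conf is not convenient, the same limit can be changed at runtime; a sketch using redis-py's config_set (assuming the connection is permitted to run CONFIG commands):

rc.config_set('maxmemory', '500mb')    # equivalent to the redis.conf line above
print(rc.config_get('maxmemory'))      # verify the new limit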
Source: https://stackoverflow.com/questions/44777455/python-redis-subscribe-can-not-get-all-datas