redis-cluster

Redis filling up memory fast, running --bigkeys frees it up

不问归期 submitted on 2021-02-10 17:01:28
Question: A month ago, out of the blue, Redis started to fill up the server memory fast. In order to debug the problem we ran redis-cli --bigkeys and, to our surprise, all the used memory was freed. We have a cluster of 6 nodes, 3 masters and 3 slaves; each master's database is around 15 GB. Each node runs on a dedicated box with 64 GB of RAM. Redis fills the entire 64 GB twice a day. We have a cron running redis-cli --bigkeys twice a day to free up the used memory.
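No answer is included in this excerpt, but one plausible explanation consistent with the symptom is Redis's lazy expiration: keys whose TTL has passed are only reclaimed when they are next touched (or by the background expiry cycle), and redis-cli --bigkeys touches every key while it scans. The following is a minimal sketch of an equivalent full-keyspace sweep using the Lettuce Java client; the seed address and class name are placeholders, not part of the original question.

import io.lettuce.core.KeyScanCursor;
import io.lettuce.core.ScanArgs;
import io.lettuce.core.cluster.RedisClusterClient;
import io.lettuce.core.cluster.api.StatefulRedisClusterConnection;
import io.lettuce.core.cluster.api.sync.RedisAdvancedClusterCommands;

public class KeyspaceSweep {
    public static void main(String[] args) {
        // Seed node only; Lettuce discovers the rest of the cluster topology itself.
        RedisClusterClient client = RedisClusterClient.create("redis://127.0.0.1:7000");
        StatefulRedisClusterConnection<String, String> connection = client.connect();
        RedisAdvancedClusterCommands<String, String> sync = connection.sync();

        long touched = 0;
        KeyScanCursor<String> cursor = sync.scan(ScanArgs.Builder.limit(1000));
        while (true) {
            // Reading each key's type forces Redis to check its TTL and lazily
            // evict it if it has already expired, much like --bigkeys does.
            for (String key : cursor.getKeys()) {
                sync.type(key);
                touched++;
            }
            if (cursor.isFinished()) {
                break;
            }
            cursor = sync.scan(cursor, ScanArgs.Builder.limit(1000));
        }
        System.out.println("Touched " + touched + " keys");

        connection.close();
        client.shutdown();
    }
}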

Redis command to get all available keys on Redis Cluster?

ぐ巨炮叔叔 submitted on 2021-02-07 04:56:23
Question: I am using redisManager.redisClient.keys('*example*', function (err, keys) { }), but it only returns keys from one node of the Redis cluster. How can I get the keys from all nodes? Answer 1: You can't get the keys for all nodes using a single command. You have to get the keys from each node and merge them. Reference: https://github.com/antirez/redis/issues/1962 You can do something like: var redis = require('redis'); redisConfig = new Array( {"port": 1234, "host": "192.168.1.2"}, {"port": 5678, "host":
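The snippet above uses node_redis and lists each node explicitly. As a hedged alternative sketch of the same merge idea, the Lettuce Java client's advanced cluster API fans KEYS out to every master and merges the per-node results; the seed address, pattern, and class name below are placeholders.

import io.lettuce.core.cluster.RedisClusterClient;
import io.lettuce.core.cluster.api.StatefulRedisClusterConnection;
import io.lettuce.core.cluster.api.sync.RedisAdvancedClusterCommands;
import java.util.List;

public class ClusterKeys {
    public static void main(String[] args) {
        // Seed node; the client discovers the rest of the cluster topology itself.
        RedisClusterClient client = RedisClusterClient.create("redis://192.168.1.2:1234");
        StatefulRedisClusterConnection<String, String> connection = client.connect();
        RedisAdvancedClusterCommands<String, String> sync = connection.sync();

        // KEYS on the advanced cluster API is executed on every master node
        // and the per-node results are merged into a single list.
        List<String> keys = sync.keys("*example*");
        keys.forEach(System.out::println);

        connection.close();
        client.shutdown();
    }
}

On large databases SCAN (also fanned out by Lettuce) is preferable to KEYS, since KEYS blocks each node while it walks the whole keyspace.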

redis-cluster redeploy fails via Kubernetes

被刻印的时光 ゝ submitted on 2021-01-29 07:50:52
Question: I used a Kubernetes StatefulSet object to create a redis-cluster. The cluster was fine at first; then I deleted the StatefulSet object and redeployed, but typing the command "cluster nodes" to check the cluster shows the cluster has failed. redis-cluster.yaml: https://github.com/JayChanggithub/k8s-redis-cluster.git On the initial deployment the cluster nodes are workable: # the first time deploy redis-cluster $ kubectl apply -f redis-cluster.yaml # sts objects $ kubectl get sts -n kube-ops redis-app 6/6 6m58s # create the cluster

Is there a way to auto discover new cluster node IP in Redis Cluster with Lettuce

好久不见. submitted on 2021-01-29 06:58:30
Question: I have a Redis Cluster (3 masters and 3 slaves) running inside a Kubernetes cluster. The cluster is exposed via a Kubernetes Service (Kube-Service). My application server is connected to the Redis Cluster (using the Kube-Service as the URI) via the Lettuce Java client for Redis. I also have the following client options set on the Lettuce connection object: ClusterTopologyRefreshOptions topologyRefreshOptions = ClusterTopologyRefreshOptions.builder() .enablePeriodicRefresh(Duration
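The options snippet above is cut off; for context, a typical Lettuce configuration that enables both periodic and adaptive topology refresh looks roughly like the sketch below. It assumes Lettuce 5+; the service URI, refresh interval, and class name are illustrative placeholders, not the poster's actual values.

import io.lettuce.core.RedisURI;
import io.lettuce.core.cluster.ClusterClientOptions;
import io.lettuce.core.cluster.ClusterTopologyRefreshOptions;
import io.lettuce.core.cluster.RedisClusterClient;
import io.lettuce.core.cluster.api.StatefulRedisClusterConnection;
import java.time.Duration;

public class LettuceTopologyRefresh {
    public static void main(String[] args) {
        // Re-fetch the cluster topology every 30 seconds and also react to
        // MOVED/ASK redirects and connection failures (adaptive triggers).
        ClusterTopologyRefreshOptions topologyRefreshOptions = ClusterTopologyRefreshOptions.builder()
                .enablePeriodicRefresh(Duration.ofSeconds(30))
                .enableAllAdaptiveRefreshTriggers()
                .build();

        RedisClusterClient clusterClient = RedisClusterClient.create(
                RedisURI.create("redis://redis-cluster-kube-service:6379")); // placeholder URI
        clusterClient.setOptions(ClusterClientOptions.builder()
                .topologyRefreshOptions(topologyRefreshOptions)
                .build());

        StatefulRedisClusterConnection<String, String> connection = clusterClient.connect();
        System.out.println(connection.sync().clusterInfo());

        connection.close();
        clusterClient.shutdown();
    }
}

Adaptive refresh triggers (MOVED/ASK redirects, persistent reconnects) are what let the client pick up new pod IPs after a node restarts, which periodic refresh alone may only catch at the next interval.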

How to do Redis data encryption?

匆匆过客 submitted on 2020-08-05 02:30:09
Question: We can secure the data while it is travelling by using spiped or stunnel. But how do we do that while the data is at rest? What if someone took the whole database? How can we encrypt the persistent data storage? Do we need to do this in the application layer? Answer 1: Looking at the documentation at https://redis.io/topics/security, it is clear that encryption of data at rest isn't supported: Redis is designed to be accessed by trusted clients inside trusted environments. This means that usually it is
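The question itself asks whether encryption has to happen in the application layer; the sketch below illustrates that approach with the JDK's built-in AES-GCM support. The class, key handling, and sample value are placeholders: in practice the key would come from a KMS or secret store, and the Base64 string returned by encrypt() is what you would pass to your Redis client's SET (and decrypt after GET).

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;
import java.util.Base64;

public class ValueCrypto {
    private static final int IV_BYTES = 12;   // recommended IV size for GCM
    private static final int TAG_BITS = 128;  // authentication tag length

    // Encrypts a value before it is written to Redis (e.g. via SET).
    static String encrypt(SecretKey key, String plaintext) throws Exception {
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        byte[] ciphertext = cipher.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
        byte[] out = new byte[iv.length + ciphertext.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ciphertext, 0, out, iv.length, ciphertext.length);
        return Base64.getEncoder().encodeToString(out); // store this string in Redis
    }

    // Decrypts a value after it is read back from Redis (e.g. via GET).
    static String decrypt(SecretKey key, String encoded) throws Exception {
        byte[] in = Base64.getDecoder().decode(encoded);
        byte[] iv = Arrays.copyOfRange(in, 0, IV_BYTES);
        byte[] ciphertext = Arrays.copyOfRange(in, IV_BYTES, in.length);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        return new String(cipher.doFinal(ciphertext), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        // Ad hoc key for the sketch only; use a managed key in real deployments.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        String stored = encrypt(key, "some sensitive value");
        System.out.println("value stored in Redis: " + stored);
        System.out.println("value after decrypt:  " + decrypt(key, stored));
    }
}

This protects values in Redis memory and in RDB/AOF dumps, but key names remain in cleartext and the application must manage the encryption key securely.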

Is there a way to flushall on a cluster so all keys from masters and slaves are deleted from the db?

左心房为你撑大大i submitted on 2020-06-29 06:44:12
Question: From the documentation this seems to be how flushall should work, but in practice it does not work that way. When I use the flushall command it only flushes the keys from the db instance the CLI is connected to. Redis flushall documentation: Delete all the keys of all the existing databases, not just the currently selected one. This command never fails. The time-complexity for this operation is O(N), N being the number of keys in all existing databases. For example, if my cluster redis-cli has started
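As the excerpt notes, FLUSHALL issued from redis-cli only clears the node the CLI is connected to; with redis-cli alone the usual workaround is to run flushall against each master in turn. From application code, a hedged sketch with the Lettuce Java client is shown below; it assumes a Lettuce version whose advanced cluster API fans FLUSHALL out to all masters and which exposes masters() in the node-selection API. The seed address and class name are placeholders.

import io.lettuce.core.cluster.RedisClusterClient;
import io.lettuce.core.cluster.api.StatefulRedisClusterConnection;
import io.lettuce.core.cluster.api.sync.RedisAdvancedClusterCommands;

public class ClusterFlush {
    public static void main(String[] args) {
        RedisClusterClient client = RedisClusterClient.create("redis://127.0.0.1:7000");
        StatefulRedisClusterConnection<String, String> connection = client.connect();
        RedisAdvancedClusterCommands<String, String> sync = connection.sync();

        // The advanced cluster API routes FLUSHALL to every master node, so keys
        // are deleted cluster-wide; replicas drop them as the flush replicates.
        sync.flushall();

        // Equivalent, but explicit about the targeted nodes via the node-selection API:
        sync.masters().commands().flushall();

        connection.close();
        client.shutdown();
    }
}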

Redis pub sub max subscribers and publishers

余生颓废 submitted on 2020-06-17 01:56:39
Question: Could anyone tell me what the maximum number of concurrent channels is that Redis pub-sub can support? Is there any cap on the number of subscribers and publishers? Answer 1: Redis uses a dict, the same struct as for keys, to store channel subscriptions, both per client and for all clients (it keeps a per-subscription hash with a list of subscribed clients), so there can be up to 2^32 channel subscriptions in total. It uses a list to store pattern subscriptions per client, so it is theoretically limited only by the
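In other words, channels are created implicitly when the first client subscribes and removed when the last one unsubscribes; there is no configured cap, only the size limits of those internal structures and available memory. A small hedged sketch with the Lettuce Java client is below; the address, channel names, and class name are placeholders.

import io.lettuce.core.RedisClient;
import io.lettuce.core.pubsub.RedisPubSubAdapter;
import io.lettuce.core.pubsub.StatefulRedisPubSubConnection;
import io.lettuce.core.pubsub.api.sync.RedisPubSubCommands;

public class PubSubSketch {
    public static void main(String[] args) throws InterruptedException {
        RedisClient client = RedisClient.create("redis://127.0.0.1:6379");

        // Subscriber connection: each SUBSCRIBE simply adds an entry to the
        // server-side dict of channel subscriptions described above.
        StatefulRedisPubSubConnection<String, String> subConnection = client.connectPubSub();
        subConnection.addListener(new RedisPubSubAdapter<String, String>() {
            @Override
            public void message(String channel, String message) {
                System.out.println(channel + " -> " + message);
            }
        });
        RedisPubSubCommands<String, String> sub = subConnection.sync();
        sub.subscribe("orders", "payments", "audit");

        // Publisher on a separate, regular connection.
        client.connect().sync().publish("orders", "order-created");

        Thread.sleep(500); // give the message a moment to arrive in this sketch
        subConnection.close();
        client.shutdown();
    }
}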