Redis on Spark: Task not serializable

Asked by 既然无缘, 2020-12-30 14:52

We use Redis on Spark to cache our key-value pairs. This is the code:

import com.redis.RedisClient
val r = new RedisClient("192.168.1.101", 6379)
val perhit
2 Answers
  •  感情败类
    2020-12-30 15:42

    You're trying to serialize the client. You have one RedisClient, r, which you are using inside the map closure that will run on different cluster nodes, so Spark tries to serialize it and fails. Either fetch the data you need from Redis separately, before the cluster task runs, or create the client inside the task itself (preferably with mapPartitions rather than map, since creating a new Redis client for every individual row would be wasteful).
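    A minimal sketch of the mapPartitions approach described above. It assumes the scala-redis client from the question and a hypothetical RDD[String] named lines holding space-separated key/value pairs; the field layout is an assumption, since the original snippet is truncated.

    ```scala
    import org.apache.spark.rdd.RDD
    import com.redis.RedisClient

    def pushToRedis(lines: RDD[String]): RDD[Option[Long]] =
      lines.mapPartitions { partition =>
        // The client is constructed here, on the executor, once per
        // partition -- it is never captured by the driver-side closure,
        // so nothing unserializable is shipped with the task.
        val r = new RedisClient("192.168.1.101", 6379)

        // Materialize the results before disconnecting: mapPartitions
        // returns a lazy iterator, and the connection must still be
        // open when each element is actually computed.
        val out = partition.map { line =>
          val arr = line.split(" ")
          r.rpush(arr(0), arr(1)) // returns Option[Long] (new list length)
        }.toList

        r.disconnect
        out.iterator
      }
    ```

    One client per partition amortizes the connection cost over many rows while keeping the client entirely on the executor side, which is exactly why mapPartitions is preferred over map here.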
