Write data to Redis from PySpark

刺人心 2020-12-16 06:56

In Scala, we would write an RDD to Redis like this:

datardd.foreachPartition(iter => {
      // one client per partition; the write call is illustrative
      val r = new RedisClient("hosturl", 6379)
      iter.foreach(pair => r.set(pair._1, pair._2))
})
1 Answer
  • 2020-12-16 07:35

    PySpark's SparkContext has an addPyFile method for exactly this. Package the redis module as a zip file and pass it to that method so the executors can import it:

    from pyspark import SparkContext

    sc = SparkContext(appName="analyze")
    sc.addPyFile("/path/to/redis.zip")
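    With the module shipped to the executors, the write itself can mirror the Scala pattern through foreachPartition. A minimal sketch, assuming an RDD of (key, value) string pairs and a Redis server at hosturl:6379 (both placeholders); the client_factory hook is only there so the function can be exercised without a live server:

    ```python
    def write_partition(records, client_factory=None):
        """Write one partition of (key, value) pairs to Redis.

        Pass this to rdd.foreachPartition(write_partition); one client
        is created per partition, mirroring the Scala RedisClient usage.
        """
        if client_factory is None:
            # Imported on the executor: the zip added via addPyFile is on sys.path
            from redis import Redis  # assumes redis-py inside redis.zip
            client_factory = lambda: Redis(host="hosturl", port=6379)
        client = client_factory()
        for key, value in records:
            client.set(key, value)
    ```

    On the driver side this would be invoked as `datardd.foreachPartition(write_partition)`.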
    