Is there a possibility to parallelize function with pyspark?

Frontend · unanswered · 0 replies · 1993 views

梦谈多话 2020-12-08 16:22

I want to use PySpark to parallelize Python methods, for example with the map function. Is it possible for each RDD to get exactly one map function?

For
