How to start multiple Spark workers on one machine in Spark 2.4?


I am trying to set up a small Spark cluster on my local Mac machine, with one master and two or more workers. In the Spark 2.0.0 documentation there is a property SPARK_WORKER_INSTANCES.
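For context, this is the standalone-mode setting that controls how many worker daemons a single machine runs. A minimal sketch of how it is typically used is shown below; the core/memory values and ports are illustrative assumptions, not values from the question.

```bash
# conf/spark-env.sh -- minimal sketch; the numbers are illustrative assumptions.

# Run two worker daemons on this machine.
export SPARK_WORKER_INSTANCES=2

# Resources *per worker instance*; size them so that
# instances x cores/memory still fits on the machine.
export SPARK_WORKER_CORES=2
export SPARK_WORKER_MEMORY=2g
```

```bash
# Start the master, then the workers described by spark-env.sh.
$SPARK_HOME/sbin/start-master.sh
$SPARK_HOME/sbin/start-slaves.sh
# or start a worker explicitly against the master URL (assumed to be localhost:7077 here):
# $SPARK_HOME/sbin/start-slave.sh spark://localhost:7077
```

With SPARK_WORKER_INSTANCES greater than 1, each worker instance gets its own ports and should appear as a separate worker in the master web UI (by default at http://localhost:8080).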
