Spark : check your cluster UI to ensure that workers are registered

广开言路 2021-01-07 23:55

I have a simple program in Spark:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf


        
5 Answers
  •  情书的邮戳 2021-01-08 00:00

    Check the cores available on your cluster's worker nodes: an application cannot use more cores than the workers have registered. For example, suppose you have two worker nodes with 4 cores each, and two applications to run. Then you can give each application 4 cores to run its job.

    You can set the cap like this in the code:

    SparkConf sparkConf = new SparkConf().setAppName("JianSheJieDuan")
                              .set("spark.cores.max", "4");
    

    It works for me.
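    The snippet above is Java; since the question's `SimpleApp` is Scala, here is a minimal Scala sketch of the same idea. The app name, master URL, and the sample computation are placeholders, not from the original post; the key line is the `spark.cores.max` setting, which caps the total cores the application requests from the standalone cluster.

    ```scala
    // Minimal sketch: cap this application's total core usage so that
    // two applications can share a cluster with 8 cores (2 workers x 4 cores).
    import org.apache.spark.{SparkConf, SparkContext}

    object SimpleApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("SimpleApp")              // placeholder app name
          .set("spark.cores.max", "4")          // request at most 4 cores in total
        val sc = new SparkContext(conf)
        // Placeholder job: tasks are only scheduled once workers are
        // registered and the requested cores have been granted.
        val data = sc.parallelize(1 to 100)
        println(data.sum())
        sc.stop()
      }
    }
    ```

    If the UI shows registered workers but the application still hangs at "check your cluster UI", a `spark.cores.max` (or executor memory) request larger than what the workers offer is a common cause, since the scheduler waits for resources that can never be granted.
    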
