Spark : check your cluster UI to ensure that workers are registered

广开言路 2021-01-07 23:55

I have a simple program in Spark:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
5 Answers
  •  天命终不由人
    2021-01-08 00:09

    I have done configuration and performance tuning for many Spark clusters, and this is a very common message to see when you are first provisioning a cluster to handle your workloads.

    This is unequivocally due to insufficient resources to launch the job. The job is requesting one of:

    • more memory per worker than is allocated to it (1 GB)
    • more CPU cores than are available on the cluster
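
    To resolve either case, the job's resource request can be capped in the `SparkConf` so it fits what the workers advertise in the master UI. This is a minimal sketch, not the asker's actual program; the memory and core values are assumptions to adjust for your cluster:

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    object ResourceConfigSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("Simple Application")
          // Ask for no more memory per executor than a worker offers
          // (standalone workers default to offering 1g).
          .set("spark.executor.memory", "512m")
          // Cap the total cores the job may claim across the cluster.
          .set("spark.cores.max", "2")
        val sc = new SparkContext(conf)
        // ... job logic ...
        sc.stop()
      }
    }
    ```

    The same limits can be passed at launch time via `spark-submit --executor-memory 512m --total-executor-cores 2` without touching the code.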
