Why does vcore always equal the number of nodes in Spark on YARN?

醉话见心 2020-12-28 15:58

I have a Hadoop cluster with 5 nodes, each with 12 cores and 32 GB of memory. I use YARN as the MapReduce framework, so I have the following YARN settings:

    <
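
Whatever the exact settings above, the usual explanation for this symptom is that the default CapacityScheduler sizes containers with DefaultResourceCalculator, which accounts for memory only: every container is recorded as 1 vcore no matter how many cores spark-submit requested, so "VCores Used" ends up at one per executor plus one for the ApplicationMaster. A minimal sketch of the corresponding fix, assuming a stock capacity-scheduler.xml (none of this is from the asker's configuration):

    <!-- capacity-scheduler.xml: switch the CapacityScheduler from the
         memory-only DefaultResourceCalculator to DominantResourceCalculator,
         which accounts for CPU as well, so requested executor cores are
         both enforced and shown correctly in the ResourceManager UI. -->
    <property>
      <name>yarn.scheduler.capacity.resource-calculator</name>
      <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
    </property>

The ResourceManager has to be restarted for a scheduler-level change like this to take effect.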
3 Answers
  •  半阙折子戏
    2020-12-28 16:49

    I saw the same thing before setting the YARN scheduler to FairScheduler. The Spark UI showed the correct number of tasks, which suggested nothing was actually wrong, and the cluster ran at close to 100% CPU usage, confirming that the cores were in use even though YARN reported only one vcore per container.

    After switching to FairScheduler, the resource numbers shown by YARN looked correct.
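
    The scheduler switch described here is a single property in yarn-site.xml; a minimal sketch, assuming a stock Hadoop setup (whether the answerer also set a DRF policy is not stated, so that part is an assumption):

        <!-- yarn-site.xml: replace the default scheduler with the FairScheduler. -->
        <property>
          <name>yarn.resourcemanager.scheduler.class</name>
          <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler</value>
        </property>

        <!-- fair-scheduler.xml (the FairScheduler allocation file): the default
             "fair" policy schedules by memory alone; Dominant Resource Fairness
             makes CPU count as well. That this is why the vcore numbers became
             correct is an assumption; the answer itself does not say. -->
        <allocations>
          <defaultQueueSchedulingPolicy>drf</defaultQueueSchedulingPolicy>
        </allocations>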
