What is the relationship between workers, worker instances, and executors?

余生分开走 · 2020-12-07 08:44

In Spark Standalone mode, there are master and worker nodes.

Here are a few questions:

  1. Do 2 worker instances mean one worker node running 2 worker processes?
4 answers
  •  孤街浪徒
    2020-12-07 09:22

    I suggest reading the Spark cluster docs first, but even more so this Cloudera blog post explaining these modes.

    Your first question depends on what you mean by 'instances'. A node is a machine, and there's not a good reason to run more than one worker process per machine. So two worker nodes typically mean two machines, each running one Spark worker.
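
    As an aside on the 'instances' wording: standalone mode does let you start more than one worker process per machine through conf/spark-env.sh. A minimal sketch, assuming the standard standalone launch scripts are used; the values are illustrative only:

        # conf/spark-env.sh on each worker machine
        SPARK_WORKER_INSTANCES=2   # number of worker processes to start on this machine (default: 1)
        SPARK_WORKER_CORES=4       # cores each worker process can offer to executors
        SPARK_WORKER_MEMORY=8g     # memory each worker process can offer to executors

    In most clusters, though, the default of one worker per machine with all of its cores and memory is the right setup, which is what the paragraph above recommends.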

    Workers hold many executors, for many applications. One application has executors on many workers.
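
    To make that concrete, here is a minimal Scala sketch of how one application requests executor resources from a standalone cluster. The master URL and the sizes are assumptions, and the spark.executor.* / spark.cores.max properties are the standard Spark configuration keys, not anything specific to this question:

        // A sketch only: "spark://master:7077" is an assumed standalone master URL.
        import org.apache.spark.sql.SparkSession

        object ExecutorSizingSketch {
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder()
              .master("spark://master:7077")
              .appName("executor-sizing-sketch")
              .config("spark.executor.memory", "2g") // memory per executor
              .config("spark.executor.cores", "2")   // cores per executor
              .config("spark.cores.max", "8")        // cap on total cores this app may use across the cluster
              .getOrCreate()

            // With 2 cores per executor and an 8-core cap, the master can launch up to
            // 4 executors for this application, placed on whichever workers have free resources.
            println(spark.range(1000000L).count())

            spark.stop()
          }
        }

    A second application submitted to the same master gets its own, separate executors on those same workers.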

    Your third question is not clear.
