Can SPARK use multicore properly?

Submitted by 左心房为你撑大大i on 2019-12-05 20:11:41

Question


I've read about Spark and found out that it is written in Scala. Since Scala is a functional language, like Erlang, it should be able to use multiple cores properly. Is that correct?

I'm wondering whether I can use Spark in a distributed system whose machines have multicore processors. Can a single task use all the cores at the same time? I've read that YARN assigns different cores to different tasks, but in this case it is a single task.

Also, is it enough to use multithreaded programming in Java (Hadoop) to use all the cores on each machine, since the Linux scheduler is thread-based?


Answer 1:


Yes, it can, since that is its stated purpose: to split and parallelize whatever is parallelizable. You can even specify the amount of memory to be used by each executor.
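
For example, a minimal configuration sketch in Scala (the keys are standard Spark configuration properties, but the app name and the values such as 4 cores and 2g of memory are placeholders you would tune yourself):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative values only: tune cores, memory, and parallelism for your cluster.
val conf = new SparkConf()
  .setAppName("multicore-example")        // hypothetical app name
  .set("spark.executor.cores", "4")       // CPU cores each executor may use
  .set("spark.executor.memory", "2g")     // heap memory per executor
  .set("spark.default.parallelism", "8")  // default number of partitions/tasks

val sc = new SparkContext(conf)
```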

However, some tasks cannot be parallelized, which is why sometimes Spark only occupies one core.

If you use the Spark shell, make sure you set the number of cores to use, as described in the answer to this question: Why is Spark not using all cores on local machine
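
That linked answer comes down to setting the master URL when you run Spark locally. A hedged sketch, assuming a local run (the app name and partition count are made up):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// "local[*]" tells Spark to use every core on this machine; "local[4]" would pin it to four.
val conf = new SparkConf()
  .setAppName("local-cores-example")  // hypothetical app name
  .setMaster("local[*]")

val sc = new SparkContext(conf)

// A parallelizable job: each of the 8 partitions can be processed on its own core.
val doubledSum = sc.parallelize(1 to 1000000, 8).map(_ * 2L).sum()
println(doubledSum)

sc.stop()
```

The equivalent from the command line is launching the shell with `spark-shell --master "local[*]"`.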

Source: official Spark docs https://spark.apache.org/docs/latest/configuration.html




Answer 2:


No, a single thread can only run on one core at a time. You will have to use multiple threads or processes to use more than one core simultaneously. Keep in mind that not every task can be run asynchronously across multiple threads.
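
As an illustration of that point, here is a small Scala sketch using plain JVM threads (nothing Spark-specific; the per-task work is arbitrary CPU-bound computation):

```scala
import java.util.concurrent.{Callable, Executors, TimeUnit}

object MultiCoreSketch {
  def main(args: Array[String]): Unit = {
    val cores = Runtime.getRuntime.availableProcessors()
    val pool  = Executors.newFixedThreadPool(cores)

    // One task per core; the OS scheduler is free to place each thread on its own core.
    val futures = (1 to cores).map { _ =>
      pool.submit(new Callable[Long] {
        override def call(): Long = (1L to 10000000L).sum  // arbitrary CPU-bound work
      })
    }

    // A single thread doing all of this work would occupy only one core.
    println(futures.map(_.get()).sum)

    pool.shutdown()
    pool.awaitTermination(1, TimeUnit.MINUTES)
  }
}
```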



Source: https://stackoverflow.com/questions/29716949/can-spark-use-multicore-properly
