Apache Spark and Remote Method Invocation

Posted by 只谈情不闲聊 on 2019-12-22 13:58:33

Question


I am trying to understand how Apache Spark works behind the scenes. After coding a little in Spark, I am pretty sure that it implements RDDs as RMI Remote objects, doesn't it?

In this way, it could modify them inside transformations such as map, flatMap, and so on. Objects that are not part of an RDD are simply serialized and sent to a worker during execution.

In the example below, lines and tokens would be treated as remote objects, while the string toFind would simply be serialized and copied to the workers.

val lines: RDD[String] = sc.textFile("large_file.txt")
val toFind = "Some cool string"
val tokens = 
  lines.flatMap(_ split " ")
       .filter(_.contains(toFind))

Am I wrong? I googled a little but I haven't found any reference to how Spark RDDs are implemented internally.


Answer 1:


You are correct. Spark serializes closures to perform remote method invocation.
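For concreteness, here is a minimal, self-contained sketch of the same pipeline, assuming a hypothetical local[*] master, the input file large_file.txt, and a count() action to trigger execution. It illustrates the point in the answer: the RDD stays on the driver as a lazy lineage description, while the closure passed to filter, including the captured toFind string, is serialized and shipped to the executors.

import org.apache.spark.{SparkConf, SparkContext}

object ClosureShippingSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical local setup; on a real cluster the master URL would differ.
    val conf = new SparkConf().setAppName("closure-shipping").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val lines = sc.textFile("large_file.txt") // lazy lineage description held by the driver
    val toFind = "Some cool string"           // plain local value, captured by the closure below

    val tokens = lines
      .flatMap(_.split(" "))
      .filter(_.contains(toFind))             // this closure (with toFind inside) is serialized and sent to executors

    println(tokens.count())                   // an action triggers the actual distributed job

    sc.stop()
  }
}

One way to observe this: if the closure captured a non-serializable object, the job would fail with a "Task not serializable" exception at submission time, because it is the closure, not a remote RDD proxy, that travels over the wire.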



Source: https://stackoverflow.com/questions/36461299/apache-spark-and-remote-method-invocation
