Why is persist() lazily evaluated in Spark?

谎友^ 2020-12-14 22:07

I understand that in Spark there are two types of operations:

  1. Transformations
  2. Actions

Transformations like map() and filter() are lazily evaluated, and actions like collect() trigger the actual computation. Given that, why is persist() also lazy instead of caching the data as soon as it is called?
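To make the transformation/action distinction concrete without needing a Spark cluster, here is a sketch in plain Scala using LazyList, which follows the same model: chained "transformations" only describe the work, and nothing runs until an "action" forces the result. The object and counter names here are illustrative, not Spark API.

```scala
object LazyDemo {
  var evaluations = 0 // counts how many elements were actually computed

  // Like a chain of Spark transformations: this only *describes* the work.
  val pipeline: LazyList[Int] =
    LazyList.from(1).take(5)
      .map { n => evaluations += 1; n * 2 }
      .filter(_ > 4)

  // Like an action: converting to a strict List forces the computation.
  def force(): List[Int] = pipeline.toList

  def main(args: Array[String]): Unit = {
    println(s"before action: $evaluations elements computed") // 0
    val result = force()
    println(s"after action: $evaluations elements computed, result = $result")
  }
}
```

One difference worth noting: LazyList memoizes its elements, whereas an unpersisted RDD is recomputed from scratch on every action. Adding that memoization is exactly what persist() is for.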

3 Answers
  •  春和景丽
    2020-12-14 22:42

    If you have data that you might or might not use, making persist() eager would be inefficient: you would pay the cost of caching even when the data is never needed. A normal Spark transformation corresponds to a def in Scala: it is re-evaluated every time it is used. persist() turns it into a lazy val: still evaluated on demand, but only once, with the result cached for reuse.
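The def vs. lazy val analogy above can be demonstrated directly in plain Scala. This is an illustrative sketch (the object and counter names are made up, and no Spark API is involved): the def is recomputed on every use, while the lazy val is computed at most once, on first use.

```scala
object PersistAnalogy {
  var computations = 0 // counts how many times the "work" actually ran

  // A plain transformation chain is like a `def`: re-evaluated on every use.
  def transformed: List[Int] = {
    computations += 1
    List(1, 2, 3).map(_ * 2)
  }

  // persist() is like turning it into a `lazy val`: still not evaluated
  // eagerly, but computed at most once and cached afterwards.
  lazy val persisted: List[Int] = {
    computations += 1
    List(1, 2, 3).map(_ * 2)
  }

  def main(args: Array[String]): Unit = {
    transformed; transformed
    println(s"def: $computations computations")      // 2
    computations = 0
    persisted; persisted
    println(s"lazy val: $computations computations") // 1
  }
}
```

Note that neither the def nor the lazy val does any work at definition time, which mirrors why calling persist() by itself triggers no job: it only marks the RDD for caching, and the data is materialized the first time an action computes it.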
