How to force DataFrame evaluation in Spark

感情败类 2020-11-28 15:27

Sometimes (e.g. for testing and benchmarking) I want to force the execution of the transformations defined on a DataFrame. AFAIK calling an action like count does not ensure that all Columns are actually computed; show may only compute a subset of all Rows.
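
For illustration, a minimal sketch of the behaviour I mean (assuming a spark-shell session, so spark and the $ syntax are in scope; the UDF is made up):

    import org.apache.spark.sql.functions.udf

    // hypothetical UDF that fails loudly whenever it is actually executed
    val failingUdf = udf((id: Long) => {
      if (id >= 0) throw new RuntimeException("UDF was evaluated")
      id
    })

    val df = spark.range(10).toDF("id").withColumn("test", failingUdf($"id"))

    // count only needs the number of rows, so the optimizer may prune the
    // unused "test" column and never run the UDF -- this can finish without throwing
    df.count()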

4 Answers
  •  抹茶落季
    2020-11-28 15:46

    I guess simply getting the underlying RDD from the DataFrame and triggering an action on it should achieve what you're looking for.

    df.withColumn("test", myUDF($"id")).rdd.count // this gives proper exceptions
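
    As a self-contained sketch (the UDF, column names and failure condition are illustrative, not from the original post), the difference looks roughly like this:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.udf

    val spark = SparkSession.builder().master("local[*]").appName("force-eval").getOrCreate()
    import spark.implicits._

    // illustrative UDF that throws for one input value, to prove it actually ran
    val myUDF = udf((id: Long) => {
      if (id == 5L) throw new RuntimeException("boom")
      id * 2
    })

    val df = spark.range(10).toDF("id")

    // df.withColumn("test", myUDF($"id")).count  // may succeed: "test" is not needed for the count
    df.withColumn("test", myUDF($"id")).rdd.count // .rdd materializes every Row with all columns,
                                                  // so the UDF runs and the exception surfaces (wrapped by Spark)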
    
