Is there a better way to display an entire Spark SQL DataFrame?
Question: I would like to display an entire Apache Spark SQL DataFrame using the Scala API. I can use the show() method:

myDataFrame.show(Int.MaxValue)

Is there a better way to display the entire DataFrame than passing Int.MaxValue?

Answer 1: It is generally not advisable to display an entire DataFrame to stdout, because that means you need to pull the entire DataFrame (all of its values) to the driver (unless the DataFrame is already local, which you can check with df.isLocal). Unless you know ahead of time
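As a minimal sketch of the options discussed above (untested here; assumes a local SparkSession and a small DataFrame, since collecting a large one to the driver can exhaust memory):

```scala
import org.apache.spark.sql.SparkSession

object ShowWholeDataFrame {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("show-whole-df")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical small example DataFrame for illustration.
    val df = Seq(("a", 1), ("b", 2), ("c", 3)).toDF("key", "value")

    // Option 1: show all rows; the second argument disables column truncation.
    // df.count can exceed Int.MaxValue in theory, hence the toInt cast is only
    // safe when the DataFrame is known to be small.
    df.show(df.count.toInt, truncate = false)

    // Option 2: collect to the driver and print each Row yourself.
    // This materializes the whole DataFrame in driver memory.
    df.collect().foreach(println)

    spark.stop()
  }
}
```

Both approaches pull all rows to the driver, so they only make sense when you already know the data fits in driver memory, which is exactly the caveat the answer raises.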