Is there better way to display entire Spark SQL DataFrame?

Posted by 泄露秘密 on 2020-12-27 07:57:05

Question


I would like to display the entire Apache Spark SQL DataFrame with the Scala API. I can use the show() method:

myDataFrame.show(Int.MaxValue)

Is there a better way to display an entire DataFrame than using Int.MaxValue?


Answer 1:


It is generally not advisable to display an entire DataFrame to stdout, because that means pulling the entire DataFrame (all of its values) to the driver (unless the DataFrame is already local, which you can check with df.isLocal).

Unless you know ahead of time that your dataset is small enough for the driver JVM process to hold all of its values in memory, this is not safe to do. That's why the DataFrame API's show() displays only the first 20 rows by default.
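As a sketch of those defaults (the local SparkSession setup here is illustrative, not part of the original answer; in spark-shell a `spark` session already exists):

```scala
import org.apache.spark.sql.SparkSession

// Illustrative local session, assumed for this example.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("show-defaults")
  .getOrCreate()
import spark.implicits._

val df = (1 to 50).map(i => (i, "value-" * 10 + i)).toDF("id", "text")

df.show()          // first 20 rows, cell values truncated to 20 characters
df.show(50)        // up to 50 rows, still truncated
df.show(50, false) // up to 50 rows, full cell values
```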

You could use df.collect, which returns Array[Row], and then iterate over each row and print it:

df.collect.foreach(println)

but you lose all the formatting implemented in df.showString(numRows: Int), which show() uses internally.

So no, I guess there is no better way.
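That said, if the worry is driver memory rather than formatting, df.toLocalIterator() streams rows to the driver one partition at a time, so only a single partition has to fit in driver memory. A minimal sketch (the session setup is an assumption for illustration; like collect.foreach(println), this loses show()'s tabular formatting):

```scala
import org.apache.spark.sql.SparkSession

// Illustrative local session, assumed for this example.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("local-iterator")
  .getOrCreate()
import spark.implicits._

val df = (1 to 100).toDF("n")

// Unlike collect(), this pulls one partition at a time to the driver,
// so the whole DataFrame never has to fit in driver memory at once.
val it = df.toLocalIterator()
while (it.hasNext) println(it.next())
```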




Answer 2:


One way is to use count() to get the total number of records and pass it to show(). Note that count() returns a Long while show() takes an Int, so a cast is needed: df.show(df.count().toInt).
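A sketch of that suggestion with the required Long-to-Int cast (the session setup is an assumption for illustration):

```scala
import org.apache.spark.sql.SparkSession

// Illustrative local session, assumed for this example.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("show-all")
  .getOrCreate()
import spark.implicits._

val df = (1 to 25).toDF("n")

// count() returns Long, but show() expects Int, so cast explicitly.
// Caveat: this runs a full count job before the show job.
df.show(df.count().toInt, false)
```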




Answer 3:


Try:

df.show(35, false)

It will display the first 35 rows with full column values (no truncation).




Answer 4:


As others suggested, printing out an entire DataFrame is a bad idea. However, you can use df.rdd.foreachPartition(f) to print it out partition by partition, without flooding the driver JVM (by avoiding collect).
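A minimal sketch of that idea (the session setup is an assumption for illustration). Keep in mind that the partition function runs on the executors, so on a real cluster the println output lands in the executor logs, not on the driver console:

```scala
import org.apache.spark.sql.SparkSession

// Illustrative local session, assumed for this example.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("foreach-partition")
  .getOrCreate()
import spark.implicits._

val df = (1 to 10).toDF("n")

// Each partition is printed where it lives; nothing is collected
// to the driver, so driver memory is never a bottleneck.
df.rdd.foreachPartition(rows => rows.foreach(println))
```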




Answer 5:


Nothing is more succinct than that, but if you want to avoid Int.MaxValue, you could collect the result and process it, or use foreach. For a tabular format without much manual code, though, show is the best you can do.




Answer 6:


In Java I have tried two ways, and both work for me:

1.

data.show(SomeNo);

2.

data.foreach(new ForeachFunction<Row>() {
    public void call(Row row) throws Exception {
        System.out.println(row);
    }
});



Answer 7:


I've tried show() and it sometimes seemed not to work. Note that show() returns Unit, so wrapping it in println only adds a stray () to the output; calling it directly is enough:

df.show()


Source: https://stackoverflow.com/questions/30264373/is-there-better-way-to-display-entire-spark-sql-dataframe
