Is there any way to get the current number of partitions of a DataFrame? I checked the DataFrame Javadoc (Spark 1.6) and didn't find a method for that, or did I just miss it?
One more interesting way to get the number of partitions is to use the mapPartitions transformation. Sample code:
    import sqlContext.implicits._  // sqlContext is predefined in spark-shell (1.6); needed for toDF()

    val x = (1 to 10).toList
    val numberDF = x.toDF()

    // Emit a single 1 per partition and sum them: the total equals the number of partitions
    numberDF.rdd.mapPartitions(x => Iterator[Int](1)).sum()
Spark experts are welcome to comment on its performance.
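For comparison, here is a minimal sketch of the direct lookup, assuming a Spark 1.6 spark-shell session and the numberDF defined above. DataFrame itself exposes no partition accessor, but its underlying RDD does, and this is a driver-side metadata read rather than a distributed job:

    numberDF.rdd.partitions.length  // size of the partition array; launches no job
    numberDF.rdd.getNumPartitions   // equivalent shorthand, available on RDD since Spark 1.6.0

Since the mapPartitions/sum version actually runs one task per partition, the direct lookup should be noticeably cheaper when all you need is the count.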