Is there any way to get the current number of partitions of a DataFrame?
I checked the DataFrame javadoc (Spark 1.6) and didn't find a method for that, or did I just miss it?
You need to call getNumPartitions() on the DataFrame's underlying RDD, e.g. df.rdd.getNumPartitions(). In Scala, getNumPartitions is a parameterless method, so it is written without parentheses: df.rdd.getNumPartitions.
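A minimal Scala sketch, assuming a Spark 1.6 shell session where a `sqlContext` is already in scope (the example DataFrame built with `sqlContext.range` is just for illustration):

```scala
// In spark-shell (Spark 1.6), sqlContext is provided automatically.
val df = sqlContext.range(0, 100)       // small example DataFrame

// Number of partitions of the DataFrame's underlying RDD.
println(df.rdd.getNumPartitions)

// After repartitioning, the count reflects the new layout.
val df2 = df.repartition(4)
println(df2.rdd.getNumPartitions)       // 4
```

Note that `RDD.getNumPartitions` was only added in Spark 1.6.0; on older versions you can use `df.rdd.partitions.length` instead.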