How to get the number of elements in partition?

Submitted by 北战南征 on 2020-01-09 07:15:08

Question


Is there any way to get the number of elements in a Spark RDD partition, given the partition ID, without scanning the entire partition?

Something like this:

Rdd.partitions().get(index).size()

Except I don't see such an API for Spark. Any ideas? Workarounds?

Thanks


Answer 1:


The following gives you a new RDD with elements that are the sizes of each partition:

rdd.mapPartitions(iter => Array(iter.size).iterator, true) 



Answer 2:


PySpark:

num_partitions = 20000
a = sc.parallelize(range(int(1e6)), num_partitions)
l = a.glom().map(len).collect()  # get length of each partition
print(min(l), max(l), sum(l)/len(l), len(l))  # check if skewed

Spark/Scala:

val numPartitions = 20000
val a = sc.parallelize(0 until 1e6.toInt, numPartitions)
val l = a.glom().map(_.length).collect()  // get length of each partition
println(l.min, l.max, l.sum/l.length, l.length)  // check if skewed

The same works for a DataFrame, not just an RDD: just substitute DF.rdd.glom... into the code above.

Note that glom() converts the elements of each partition into a list, so it is memory-intensive. A less memory-intensive version (PySpark only):

import statistics

def get_table_partition_distribution(table_name: str):

    def get_partition_len(iterator):
        yield sum(1 for _ in iterator)

    l = spark.table(table_name).rdd.mapPartitions(get_partition_len, True).collect()  # get length of each partition
    num_partitions = len(l)
    min_count = min(l)
    max_count = max(l)
    avg_count = sum(l) / num_partitions
    stddev = statistics.stdev(l)
    print(f"{table_name} each of {num_partitions} partition's counts: "
          f"min={min_count:,} avg±stddev={avg_count:,.1f} ±{stddev:,.1f} max={max_count:,}")


get_table_partition_distribution('someTable')

outputs something like

someTable each of 1445 partition's counts: min=1,201,201 avg±stddev=1,202,811.6 ±21,783.4 max=2,030,137
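The memory difference comes down to how each partition is consumed: glom() materializes every partition as a list before len() is applied, while the generator above counts elements one at a time and yields a single integer. A plain-Python sketch of the two strategies (the helpers and the `partitions` list here are illustrative stand-ins, not Spark API):

```python
# Plain-Python sketch of the two counting strategies; `partitions` stands in
# for an RDD's partitions. These helpers are illustrations, not Spark API.

def count_by_materializing(partitions):
    # glom()-style: build a full list per partition, then take its length.
    # Peak memory grows with the largest partition.
    return [len(list(part)) for part in partitions]

def count_by_streaming(partitions):
    # mapPartitions-style: consume each iterator lazily, keeping only a counter.
    return [sum(1 for _ in part) for part in partitions]

partitions = [range(5), range(3), range(7)]
print(count_by_materializing(partitions))  # [5, 3, 7]
print(count_by_streaming(partitions))      # [5, 3, 7]
```

Both return the same per-partition counts; only their peak memory differs.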




Answer 3:


pzecevic's answer works, but conceptually there's no need to construct an array and then convert it to an iterator. I would just construct the iterator directly and then get the counts with a collect call.

rdd.mapPartitions(iter => Iterator(iter.size), true).collect()

P.S. I'm not sure his answer actually does more work, since Iterator.apply will likely convert its arguments into an array.
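In plain-Python terms, the pattern shared by both Scala answers is: map each partition's iterator to a one-element iterator holding its size, then collect. This is a toy model (the `map_partitions` helper and `partitions` list are hypothetical, not Spark API), just to make the dataflow concrete:

```python
# Toy model of rdd.mapPartitions(iter => Iterator(iter.size)).collect().
# `partitions` stands in for an RDD's partitions; this is not Spark API.

def map_partitions(partitions, f):
    # Apply f to each partition's iterator; f must return an iterable.
    for part in partitions:
        yield from f(iter(part))

partitions = [[1, 2, 3], [4, 5], [6]]
# Each partition's iterator is mapped to a one-element iterator of its count.
sizes = list(map_partitions(partitions, lambda it: iter([sum(1 for _ in it)])))
print(sizes)  # [3, 2, 1]
```

The collect step (the outer `list(...)`) then flattens those one-element iterators into a list with one count per partition.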



Source: https://stackoverflow.com/questions/28687149/how-to-get-the-number-of-elements-in-partition
