Count number of rows in an RDD

走了就别回头了 2020-12-08 11:05

I'm using Spark with Java, and I have an RDD of 5 million rows. Is there a solution that allows me to calculate the number of rows of my RDD? I've tried RDD.count(), but it takes a lot of time.

2 Answers
  • 2020-12-08 11:24

    You had the right idea: use rdd.count() to count the number of rows. There is no faster way.

    I think the question you should have asked is why is rdd.count() so slow?

    The answer is that rdd.count() is an "action" — it is an eager operation, because it has to return an actual number. The RDD operations you've performed before count() were "transformations" — they transformed an RDD into another lazily. In effect the transformations were not actually performed, just queued up. When you call count(), you force all the previous lazy operations to be performed. The input files need to be loaded now, map()s and filter()s executed, shuffles performed, etc, until finally we have the data and can say how many rows it has.

    Note that if you call count() twice, all this will happen twice. After the count is returned, all the data is discarded! If you want to avoid this, call cache() on the RDD. Then the second call to count() will be fast, and derived RDDs will also be faster to compute. The trade-off is that the RDD then has to be stored in memory (or on disk); see the sketch below.
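
    For illustration, a minimal Java sketch of this behaviour. The app name, input path, and filter are assumptions, not from the question; only the lazy-transformation / action / cache() pattern is the point:

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaRDD;
        import org.apache.spark.api.java.JavaSparkContext;

        public class CountExample {
            public static void main(String[] args) {
                SparkConf conf = new SparkConf().setAppName("count-example");
                JavaSparkContext sc = new JavaSparkContext(conf);

                // Transformations: lazy, nothing is read or computed yet.
                JavaRDD<String> rows = sc.textFile("hdfs:///path/to/input")  // hypothetical path
                                         .filter(line -> !line.isEmpty());

                rows.cache();               // ask Spark to keep the computed RDD in memory

                long first = rows.count();  // action: runs the whole lineage, then populates the cache
                long second = rows.count(); // served from the cache, much faster

                System.out.println("rows = " + first + " (second count: " + second + ")");
                sc.stop();
            }
        }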

  • 2020-12-08 11:31

    Daniel's explanation of count is right on the money. If you are willing to accept an approximation, though, you could try the countApprox(timeout: Long, confidence: Double = 0.95): PartialResult[BoundedDouble] RDD method. (Note, though, that this is tagged as "Experimental").
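
    A hedged Java sketch of how this could be called; the timeout and confidence values are arbitrary, and `rows` stands for any JavaRDD, e.g. the one from the previous sketch:

        import org.apache.spark.api.java.JavaRDD;
        import org.apache.spark.partial.BoundedDouble;
        import org.apache.spark.partial.PartialResult;

        public class ApproxCountExample {
            static void printApproxCount(JavaRDD<String> rows) {
                // Wait at most 1 second for an estimate at 95% confidence.
                PartialResult<BoundedDouble> approx = rows.countApprox(1000L, 0.95);
                BoundedDouble estimate = approx.initialValue();  // best estimate available at the timeout
                System.out.println("~" + estimate.mean()
                        + " rows, in [" + estimate.low() + ", " + estimate.high() + "]");
            }
        }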
