reduceByKey method not being found in Scala Spark

Backend · Open · 3 answers · 1476 views
抹茶落季 2020-12-18 18:34

Attempting to run http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala from source.

The line that calls reduceByKey fails to compile: the compiler cannot find the method.
3 Answers
  • 2020-12-18 18:54

    If you use Maven in Scala IDE: I just solved this problem by updating the spark-streaming dependency from version 1.2 to version 1.3.
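    In a Maven POM that change would look something like the sketch below (the exact 1.3.x patch version and the `_2.10` Scala-version suffix are assumptions; match the suffix to your project's Scala version):

```xml
<!-- spark-streaming transitively pulls in spark-core, where RDD lives -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.3.1</version>
</dependency>
```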

  • 2020-12-18 18:54

    Actually, you can find it in the PairRDDFunctions class. PairRDDFunctions contains extra functions that become available on RDDs of (key, value) pairs through an implicit conversion.

    https://spark.apache.org/docs/2.1.0/api/scala/index.html#org.apache.spark.rdd.PairRDDFunctions
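    To see the mechanism, here is a minimal plain-Scala sketch (MiniRDD and PairFunctions are illustrative names, not Spark's own classes): an implicit class defined in a type's companion object is in implicit scope automatically, so its extra methods appear on that type with no import at all — which is roughly how reduceByKey shows up on pair RDDs in newer Spark versions without any extra import.

```scala
// Illustrative stand-in for RDD; MiniRDD and PairFunctions are assumed names.
case class MiniRDD[T](data: Seq[T])

object MiniRDD {
  // Analogous to PairRDDFunctions: these methods exist only on MiniRDD[(K, V)].
  // Living in the companion object puts the conversion in implicit scope,
  // so callers need no import.
  implicit class PairFunctions[K, V](val self: MiniRDD[(K, V)]) {
    def reduceByKey(f: (V, V) => V): Map[K, V] =
      self.data.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).reduce(f) }
  }
}

// reduceByKey resolves with no import, because the implicit class is
// found in MiniRDD's companion object.
val wordCounts = MiniRDD(Seq(("spark", 1), ("scala", 1), ("spark", 1))).reduceByKey(_ + _)
```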

  • 2020-12-18 19:00

    You should import the implicit conversions from SparkContext:

    import org.apache.spark.SparkContext._
    

    They use the 'pimp my library' pattern to add methods to RDDs of specific element types. If curious, see SparkContext:1296
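    A plain-Scala sketch of that pattern (the names here are illustrative, not Spark's): the extension method lives in a separate object, and it only becomes available after an explicit import, mirroring what `import org.apache.spark.SparkContext._` does for pair RDDs in Spark 1.x.

```scala
object PairImplicits {
  // An implicit class wrapping any Seq of pairs with a toy reduceByKey.
  implicit class PairOps[K, V](val self: Seq[(K, V)]) {
    // Group by key, then merge each key's values with f.
    def reduceByKey(f: (V, V) => V): Map[K, V] =
      self.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).reduce(f) }
  }
}

// Without this import the compiler reports that reduceByKey
// "is not a member of Seq[(String, Int)]" -- the same symptom as
// the missing SparkContext._ import in the question.
import PairImplicits._

val counts = Seq(("a", 1), ("b", 2), ("a", 3)).reduceByKey(_ + _)
```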
