Spark - Error “A master URL must be set in your configuration” when submitting an app

Backend · unresolved · 16 answers · 1987 views
爱一瞬间的悲伤 2020-12-02 07:31

I have a Spark app which runs with no problem in local mode, but has some problems when submitted to the Spark cluster.

The error message is as follows:

    A master URL must be set in your configuration
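The error means that `SparkConf` never received a master URL. When an app is launched through `spark-submit`, the master normally comes from the `--master` flag rather than from code, so hard-coding it can conflict with cluster submission. A minimal sketch of that precedence in plain Scala (the helper name and the `local[2]` fallback are my own, not from the post):

```scala
// Hypothetical helper: prefer an externally supplied master URL
// (e.g. injected by spark-submit --master) and fall back to local
// mode only for development runs.
object MasterUrl {
  def resolve(external: Option[String]): String =
    external.getOrElse("local[2]")
}
```

With this pattern the same jar runs locally (`resolve(None)`) and on a cluster (`resolve(Some("spark://host:7077"))`) without code changes.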
16 answers
  •  被撕碎了的回忆
    2020-12-02 08:14

    Tried this option while learning Spark processing, with the Spark context set up on the local machine. Requisites: 1) keep the Spark session running locally; 2) add the Spark Maven dependency; 3) keep the input file in the root\input folder; 4) output will be placed in the \output folder. The job computes the max share value per year. Download any CSV from Yahoo Finance: https://in.finance.yahoo.com/quote/CAPPL.BO/history/ — Maven dependency and Scala code below:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.3</version>
        <scope>provided</scope>
    </dependency>
    import org.apache.spark.{SparkConf, SparkContext}

    object MaxEquityPriceForYear {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf()
          .setAppName("ShareMaxPrice")
          .setMaster("local[2]")
          .set("spark.executor.memory", "1g")
        val sc = new SparkContext(sparkConf)
        val input = "./input/CAPPL.BO.csv"
        val output = "./output"
        // Key each row by year (first field of the yyyy-MM-dd date column)
        // and keep the maximum price per year.
        sc.textFile(input)
          .map(_.split(","))
          .map(rec => (rec(0).split("-")(0).toInt, rec(1).toFloat))
          .reduceByKey((a, b) => Math.max(a, b))
          .saveAsTextFile(output)
        sc.stop()
      }
    }
