Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)

Backend · Unresolved · 3 answers · 967 views
说谎 2021-01-07 01:58

I am getting an error when I try to run a Spark application with Cassandra:

    Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)
3 Answers
  •  余生分开走
    2021-01-07 02:44

    You can only have one SparkContext at a time, and since a StreamingContext contains a SparkContext, you cannot create a separate StreamingContext and SparkContext in the same application. What you can do is build a StreamingContext on top of your existing SparkContext, so you have access to both if you really need that.

    Use this constructor: JavaStreamingContext(sparkContext: JavaSparkContext, batchDuration: Duration)
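    A minimal sketch of what that looks like in Java, assuming a local master and a hypothetical app name; the point is that the JavaStreamingContext is built from the already-existing JavaSparkContext rather than from a second SparkConf, which would trigger the "Only one SparkContext" exception:

    ```java
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class SingleContextExample {
        public static void main(String[] args) {
            // Only one SparkContext may exist per JVM (SPARK-2243),
            // so create it exactly once.
            SparkConf conf = new SparkConf()
                    .setAppName("CassandraStreamingApp") // hypothetical name
                    .setMaster("local[2]");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Build the StreamingContext on top of the existing
            // SparkContext instead of constructing a second context.
            JavaStreamingContext ssc =
                    new JavaStreamingContext(sc, new Duration(1000));

            // ... define your DStreams here, then:
            // ssc.start();
            // ssc.awaitTermination();
        }
    }
    ```

    With this setup both `sc` (for batch/Cassandra work) and `ssc` (for streaming) are usable in the same application, sharing the single underlying SparkContext.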
