I am getting an error when trying to run a Spark application with Cassandra:

    Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243).
You can only have one SparkContext at a time. Since a StreamingContext contains a SparkContext internally, you can't create a separate StreamingContext and SparkContext in the same application. What you can do instead is build the StreamingContext on top of your existing SparkContext, so you have access to both if you really need that.
Use this constructor:

    JavaStreamingContext(sparkContext: JavaSparkContext, batchDuration: Duration)
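Putting it together, a minimal sketch might look like this (the app name, master URL, and batch interval are placeholders for illustration):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SingleContextExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("StreamingWithCassandra") // placeholder name
                .setMaster("local[2]");               // placeholder master

        // Create the one and only SparkContext for this JVM
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Build the StreamingContext on top of the existing SparkContext,
        // rather than constructing a second context from the conf
        JavaStreamingContext ssc =
                new JavaStreamingContext(sc, Durations.seconds(1));

        // ... use sc for batch/Cassandra work and ssc for streaming ...

        // Stopping the StreamingContext with stopSparkContext = true
        // also shuts down the underlying SparkContext
        ssc.stop(true);
    }
}
```

This way both the batch and streaming parts of your code share the single SparkContext, which avoids the "only one SparkContext" exception.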