I want to create more than one SparkContext in a console. According to a post on the mailing list, I need to do SparkConf.set('spark.driver.allowMultipleContexts', true), but it does not work. I was hoping the previous SparkContext would be stopped and closed by calling close() and stop() so that a new one could be created, but I still get the same error.
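For reference, a minimal sketch of how that setting would be applied (note that, as the explanation at the end of this thread points out, it concerns multiple contexts inside a single JVM and does not lift PySpark's restriction):

from pyspark import SparkConf, SparkContext

# Build a configuration carrying the flag from the mailing-list post.
# The flag governs multiple SparkContext objects within one JVM;
# it does not make PySpark accept a second active context.
conf = (SparkConf()
        .setMaster('local')
        .setAppName('pyspark')
        .set('spark.driver.allowMultipleContexts', 'true'))
sc = SparkContext(conf=conf)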
Run the below function before creating a new context:
from pyspark import SparkContext

def kill_current_spark_context():
    # Stop whichever SparkContext is currently active
    # (getOrCreate returns the running one if it exists).
    SparkContext.getOrCreate().stop()
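For example, in a console session:

kill_current_spark_context()            # tear down the old context
sc = SparkContext('local', 'pyspark')   # now this succeeds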
My way:

from pyspark import SparkContext

# Stop any context left over from a previous run; ignore the error
# raised when no context (or no sc variable) exists yet.
try:
    sc.stop()
except Exception:
    pass

sc = SparkContext('local', 'pyspark')

# your code

sc.stop()
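The stop-and-ignore pattern works because the only failure mode on a first run is that sc is not defined yet; once a context is active, stop() shuts it down cleanly, so the constructor on the following line does not hit the multiple-contexts error.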
This is a PySpark-specific limitation that existed before the spark.driver.allowMultipleContexts configuration was added (that option relates to multiple SparkContext objects within a single JVM). PySpark disallows multiple active SparkContexts because various parts of its implementation assume that certain components have global shared state.
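Given that, the idiomatic workaround is to reuse the active context rather than construct a second one. A minimal sketch using SparkContext.getOrCreate() (a local master is assumed here purely for illustration):

from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster('local').setAppName('pyspark')

# Returns the already-running SparkContext if one exists,
# otherwise creates a new one from the given conf.
sc = SparkContext.getOrCreate(conf=conf)

# Calling getOrCreate() again hands back the same object
# instead of raising the multiple-contexts error.
assert sc is SparkContext.getOrCreate()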