I want to create more than one SparkContext in a console. According to a post on the mailing list, I need to call SparkConf.set('spark.driver.allowMultipleContexts', 'true'), but it does not work.
This is a PySpark-specific limitation that predates the spark.driver.allowMultipleContexts
configuration (which applies to multiple SparkContext objects within a single JVM). PySpark disallows multiple active SparkContexts because various parts of its implementation assume the existence of certain global shared state.