How to create multiple SparkContexts in a console


I want to create more than one SparkContext in a console. According to a post on the mailing list, I need to do SparkConf.set('spark.driver.allowMultipleContexts', true), it …
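
For reference, a minimal sketch of what that mailing-list suggestion looks like in PySpark (the app name is illustrative; note that SparkConf.set() takes the value as a string):

    from pyspark import SparkConf, SparkContext

    # The mailing-list suggestion: relax the JVM-side single-context check
    conf = (SparkConf()
            .setMaster('local')
            .setAppName('multi-context-test')
            .set('spark.driver.allowMultipleContexts', 'true'))
    sc = SparkContext(conf=conf)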

4 Answers
  • 2020-12-10 07:36

I was hoping the previous SparkContext would be stopped and closed by calling close() and stop(), so that a new one could be created, but I still get the same error.

  • 2020-12-10 07:37

Run the function below before creating a new context:

from pyspark import SparkContext

    def kill_current_spark_context():
        # getOrCreate() returns the currently active SparkContext (if any); stop it
        SparkContext.getOrCreate().stop()
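
    A usage sketch (the app name 'retry' is only an illustration):

    kill_current_spark_context()         # stop whatever context is active
    sc = SparkContext('local', 'retry')  # a fresh context can now be created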
    
  • 2020-12-10 07:59

My way:

    from pyspark import SparkContext

    # Stop any context left over from a previous run; the bare name lookup
    # raises NameError when no context exists yet, which we simply ignore.
    try:
        sc.stop()
    except Exception:
        pass

    sc = SparkContext('local', 'pyspark')
    # ... your code ...
    sc.stop()
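
    As a variant of the same idea: recent PySpark versions let SparkContext be used as a context manager, so the stop() call becomes automatic (a sketch, not part of the original answer):

    from pyspark import SparkContext

    # stop() is called on exit from the with-block, even if the body raises
    with SparkContext('local', 'pyspark') as sc:
        print(sc.parallelize(range(10)).sum())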
    
  • 2020-12-10 08:00

This is a PySpark-specific limitation that existed before the spark.driver.allowMultipleContexts configuration was added (that setting only concerns multiple SparkContext objects within the JVM). PySpark disallows multiple active SparkContexts because various parts of its implementation assume that certain components have global shared state.
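
    To see the check fire, a minimal sketch (the exact wording varies by version, but PySpark raises a ValueError along the lines of "Cannot run multiple SparkContexts at once"):

    from pyspark import SparkContext

    sc1 = SparkContext('local', 'first')
    try:
        sc2 = SparkContext('local', 'second')   # rejected by PySpark's own check
    except ValueError as e:
        print(e)
    finally:
        sc1.stop()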
