spark 2.1.0 session config settings (pyspark)

情深已故 2020-12-12 16:27

I am trying to override the Spark session / Spark context default configs, but it keeps picking up the entire node's/cluster's resources.

 spark = SparkSession.builder
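
The snippet above is truncated in the post. As a rough, hypothetical sketch of the approach being described (the master URL and resource values below are placeholders, not taken from the question), per-application limits would be set on the builder before getOrCreate():

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .master("yarn")                          # placeholder master URL
             .config("spark.executor.memory", "4g")   # placeholder resource caps
             .config("spark.executor.cores", "4")
             .config("spark.cores.max", "4")
             .getOrCreate())
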
5 answers
  •  南方客 (OP)
     2020-12-12 16:48

    Updating the configuration in Spark 2.3.1

    To change the default Spark configurations, you can follow these steps:

    Import the required classes

    from pyspark.conf import SparkConf
    from pyspark.sql import SparkSession
    

    Get the default configurations (this assumes an existing SparkSession named spark)

    spark.sparkContext._conf.getAll()
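    getAll() returns the current settings as a list of (key, value) tuples; for example, they can be printed with a small loop (illustrative, not part of the original answer):

    for key, value in spark.sparkContext._conf.getAll():
        print(key, '=', value)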
    

    Update the default configurations

    conf = spark.sparkContext._conf.setAll([
        ('spark.executor.memory', '4g'),
        ('spark.app.name', 'Spark Updated Conf'),
        ('spark.executor.cores', '4'),
        ('spark.cores.max', '4'),
        ('spark.driver.memory', '4g'),
    ])
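    Note that setAll returns the same SparkConf object, so conf now holds the complete, updated configuration and can be passed to the new session in the last step.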
    

    Stop the current Spark Session

    spark.sparkContext.stop()
    

    Create a new Spark session with the updated configuration

    spark = SparkSession.builder.config(conf=conf).getOrCreate()
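
    To verify that the recreated session picked up the new values, they can be read back from the new context's SparkConf (an illustrative check, not part of the original answer):

    new_conf = spark.sparkContext.getConf()
    print(new_conf.get('spark.executor.memory'))   # expected: '4g'
    print(new_conf.get('spark.executor.cores'))    # expected: '4'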
    
