spark 2.1.0 session config settings (pyspark)

Frontend · unresolved · 5 answers · 1123 views
情深已故 2020-12-12 16:27

I am trying to override the default configs of the Spark session/Spark context, but it picks up the entire node/cluster resources.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
5 Answers
  •  隐瞒了意图╮
    2020-12-12 16:48

    Setting 'spark.driver.host' to 'localhost' in the config works for me:

    from pyspark.sql import SparkSession

    spark = SparkSession \
        .builder \
        .appName("MyApp") \
        .config("spark.driver.host", "localhost") \
        .getOrCreate()
    
