How can I set the default Spark logging level?

Happy的楠姐 2020-12-15 23:56

I launch PySpark applications from PyCharm on my own workstation against an 8-node cluster. The cluster also has settings encoded in spark-defaults.conf and spark-env.sh.

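For reference, spark-defaults.conf holds plain key-value pairs along these lines (the values below are illustrative, not my actual cluster settings):

    # spark-defaults.conf -- illustrative values only
    spark.master              spark://master:7077
    spark.executor.memory     4g
    spark.eventLog.enabled    true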
3 Answers
  •  感情败类
    2020-12-16 00:09

    You can also update the log level programmatically: set it on the SparkContext and, through the spark object's JVM gateway, get hold of a Log4j logger for your own messages, like below.

        def update_spark_log_level(spark, log_level='info'):
            # Set the log level on the running SparkContext
            spark.sparkContext.setLogLevel(log_level)
            # Grab a Log4j logger from the driver JVM for our own messages
            log4j = spark._jvm.org.apache.log4j
            logger = log4j.LogManager.getLogger("my custom Log Level")
            return logger
    
    
    use:

        logger = update_spark_log_level(spark, 'debug')
        logger.info('your log message')


    Feel free to comment if you need more details.
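
    Note that setLogLevel only takes effect once the SparkContext exists. To change the default level for everything the driver and executors print from startup, the usual route is Spark's conf/log4j.properties (copied from log4j.properties.template). A minimal sketch, assuming a Spark build that still uses Log4j 1.x:

        # conf/log4j.properties -- lower the default level from INFO to WARN
        log4j.rootCategory=WARN, console
        log4j.appender.console=org.apache.log4j.ConsoleAppender
        log4j.appender.console.target=System.err
        log4j.appender.console.layout=org.apache.log4j.PatternLayout
        log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n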
