I launch PySpark applications from PyCharm on my own workstation, against an 8-node cluster. The cluster also has settings encoded in spark-defaults.conf and spark-env.sh.
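For reference, launching from PyCharm usually comes down to building a SparkSession that points at the cluster's master; a minimal sketch (the master URL and app name below are placeholders, not from the original setup; spark-defaults.conf is still picked up from SPARK_HOME on top of this):

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("spark://master-host:7077")  # placeholder cluster address
    .appName("pycharm-driver")           # hypothetical app name
    .getOrCreate()
)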
You can also update the log level programmatically, as below: get hold of the JVM-side log4j package through the Spark session and create a logger from it:
def update_spark_log_level(spark, log_level='info'):
    # set the log level on the SparkContext; Spark accepts
    # ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN (case-insensitive)
    spark.sparkContext.setLogLevel(log_level)
    # reach the JVM-side log4j package through py4j and create a named logger
    log4j = spark._jvm.org.apache.log4j
    logger = log4j.LogManager.getLogger("my custom Log Level")
    return logger
use:

logger = update_spark_log_level(spark, 'debug')
logger.info('your log message')
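Note that setLogLevel() changes the root log level on the driver, which the named logger above inherits. If you want to pin that logger's own level independently of the root, a minimal sketch, assuming the log4j 1.x API that Spark exposes through py4j (Spark 3.3+ routes these classes through a log4j2 compatibility bridge):

# pin this logger's level regardless of the root level
logger.setLevel(spark._jvm.org.apache.log4j.Level.DEBUG)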
Feel free to comment if you need more details.