Spark 2.0: Redefining SparkSession params through GetOrCreate and NOT seeing changes in WebUI

Posted by 断了今生、忘了曾经 on 2019-11-27 22:20:23

Question


I'm using Spark 2.0 with PySpark.

I am redefining SparkSession parameters through the getOrCreate method that was introduced in 2.0:

This method first checks whether there is a valid global default SparkSession, and if yes, return that one. If no valid global default SparkSession exists, the method creates a new SparkSession and assigns the newly created SparkSession as the global default.

In case an existing SparkSession is returned, the config options specified in this builder will be applied to the existing SparkSession.

https://spark.apache.org/docs/2.0.1/api/python/pyspark.sql.html#pyspark.sql.SparkSession.Builder.getOrCreate
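To make the reuse semantics concrete, here is a minimal sketch (assuming a plain local PySpark 2.x shell; spark.foo is a made-up option key used purely for illustration): repeated calls to getOrCreate return the very same session object, and the builder options are applied to that existing session.

from pyspark.sql import SparkSession

s1 = SparkSession.builder.master("local[2]").getOrCreate()
s2 = SparkSession.builder.config("spark.foo", "bar").getOrCreate()

s1 is s2                  # True: the existing session is reused
s2.conf.get("spark.foo")  # 'bar': builder options applied to the existing session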

So far so good:

from pyspark import SparkConf

SparkConf().toDebugString()
'spark.app.name=pyspark-shell\nspark.master=local[2]\nspark.submit.deployMode=client'

spark.conf.get("spark.app.name")
'pyspark-shell'

Then I redefine the SparkSession config, expecting, per the docs, to see the changes in the WebUI:

appName(name)
Sets a name for the application, which will be shown in the Spark web UI.

https://spark.apache.org/docs/2.0.1/api/python/pyspark.sql.html#pyspark.sql.SparkSession.Builder.appName

c = SparkConf()
(c
 .setAppName("MyApp")
 .setMaster("local")
 .set("spark.driver.memory","1g")
 )

from pyspark.sql import SparkSession
(SparkSession
.builder
.enableHiveSupport() # metastore, serdes, Hive udf
.config(conf=c)
.getOrCreate())

spark.conf.get("spark.app.name")
'MyApp'

Now, when I go to localhost:4040, I would expect to see MyApp as the app name.

However, I still see the pyspark-shell application UI.

Where am I going wrong?

Thanks in advance!


Answer 1:


I believe the documentation is a bit misleading here; when you work with Scala you actually see a warning like this:

... WARN SparkSession$Builder: Use an existing SparkSession, some configuration may not take effect.

It was more obvious prior to Spark 2.0, with a clear separation between the contexts:

  • SparkContext configuration cannot be modified at runtime. You have to stop the existing context first.
  • SQLContext configuration can be modified at runtime.

spark.app.name, like many other options, is bound to the SparkContext and cannot be modified without stopping the context.
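The same split is easy to see from PySpark (a sketch against the shell's default spark session; the values mirror the Scala demo below):

spark.conf.set("spark.sql.shuffle.partitions", "2001")  # SQL option: honoured at runtime
spark.conf.get("spark.sql.shuffle.partitions")          # '2001'
spark.sparkContext.appName                              # still 'pyspark-shell': bound to the context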

Reusing existing SparkContext / SparkSession

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

spark.conf.get("spark.sql.shuffle.partitions")
String = 200
val conf = new SparkConf()
  .setAppName("foo")
  .set("spark.sql.shuffle.partitions", "2001")

val spark = SparkSession.builder.config(conf).getOrCreate()
... WARN SparkSession$Builder: Use an existing SparkSession ...
spark: org.apache.spark.sql.SparkSession =  ...
spark.conf.get("spark.sql.shuffle.partitions")
String = 2001

While the spark.app.name config is updated:

spark.conf.get("spark.app.name")
String = foo

it doesn't affect the SparkContext:

spark.sparkContext.appName
String = Spark shell

Stopping existing SparkContext / SparkSession

Now let's stop the session and repeat the process:

spark.stop
val spark = SparkSession.builder.config(conf).getOrCreate()
...  WARN SparkContext: Use an existing SparkContext ...
spark: org.apache.spark.sql.SparkSession = ...
spark.sparkContext.appName
String = foo

Interestingly, when we stop the session we still get a warning about using an existing SparkContext, but you can check that it is actually stopped.
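Back in PySpark, the fix for the original question is therefore to stop the running session (and with it its SparkContext) before rebuilding it. A minimal sketch, reusing the SparkConf c defined in the question:

spark.stop()  # stops the session and the underlying SparkContext

from pyspark.sql import SparkSession
spark = (SparkSession
         .builder
         .enableHiveSupport()
         .config(conf=c)   # c carries appName "MyApp" from the question
         .getOrCreate())

spark.sparkContext.appName  # 'MyApp'; the WebUI at localhost:4040 should now show it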



Source: https://stackoverflow.com/questions/40701518/spark-2-0-redefining-sparksession-params-through-getorcreate-and-not-seeing-cha
