PySpark SparkSession Builder with Kubernetes Master
Question

I recently saw a pull request merged into the apache/spark repository that apparently adds initial Python bindings for PySpark on K8s. I posted a comment on the PR asking how to use spark-on-k8s from a Python Jupyter notebook, and was told to ask my question here.

My question is: is there a way to create a SparkContext using PySpark's SparkSession.Builder with the master set to k8s://<...>:<...>, so that the resulting jobs run on spark-on-k8s instead of on local? E.g.:
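A minimal sketch of the kind of builder call I have in mind (the API server address, app name, and container image below are hypothetical placeholders; the k8s:// master URL scheme and the spark.kubernetes.container.image setting come from Spark's Kubernetes documentation):

```python
from pyspark.sql import SparkSession

# Hypothetical values: substitute your own Kubernetes API server URL and image.
spark = (
    SparkSession.builder
    .master("k8s://https://kubernetes.example.com:6443")  # assumed API server address
    .appName("jupyter-on-k8s")                            # hypothetical app name
    .config("spark.kubernetes.container.image",
            "example/spark-py:latest")                    # assumed executor image
    .getOrCreate()
)

# The resulting SparkContext, which I would like to be backed by spark-on-k8s
# rather than a local master.
sc = spark.sparkContext
```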