Question
I am running a PySpark application through a Jupyter notebook. I can kill a job using the Spark Web UI, but I want to kill it programmatically.
How can I do that?
Answer 1:
Suppose that you wrote this code:
from pyspark import SparkContext
sc = SparkContext("local", "Simple App")
# This will stop your app
sc.stop()
As described in the docs: http://spark.apache.org/docs/latest/api/python/pyspark.html?highlight=stop#pyspark.SparkContext.stop
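If the goal is to kill a running job rather than shut down the whole application, here is a minimal sketch of how it might look in a notebook session. It uses sc.cancelAllJobs(), the SparkContext method for cancelling scheduled or running jobs, alongside sc.stop(); the local master and app name are illustrative.

from pyspark import SparkContext

sc = SparkContext("local", "Simple App")

# In a notebook the long-running job typically lives in another cell;
# from any other cell you can then kill it programmatically.

# Cancel every currently scheduled or running job but keep the
# application alive, so the same SparkContext can submit new jobs.
sc.cancelAllJobs()

# Or stop the application entirely. A stopped SparkContext cannot be
# reused; create a new one if you need to run more jobs later.
sc.stop()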
Source: https://stackoverflow.com/questions/43236330/kill-spark-job-programmatically