Kill Spark Job programmatically

Submitted by 笑着哭i on 2019-12-14 04:26:36

Question


I am running a PySpark application through a Jupyter notebook. I can kill a job using the Spark Web UI, but I want to kill it programmatically.

How can I do that?


Answer 1:


Suppose that you wrote this code:

from pyspark import SparkContext

# Create a local SparkContext for the application
sc = SparkContext("local", "Simple App")

# This stops the SparkContext and shuts down the whole application,
# killing any running jobs along with it
sc.stop()

As described in the docs: http://spark.apache.org/docs/latest/api/python/pyspark.html?highlight=stop#pyspark.SparkContext.stop
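Note that sc.stop() shuts down the entire application. If you only want to cancel a specific running job while keeping the SparkContext alive, a minimal sketch using the SparkContext job-group API (setJobGroup / cancelJobGroup / cancelAllJobs) could look like the following; the group name "my_job_group" is just an illustrative choice, not something from the original answer:

from pyspark import SparkContext

sc = SparkContext("local", "Simple App")

# Tag work submitted from this thread with a job group id
# ("my_job_group" is an arbitrary, illustrative name)
sc.setJobGroup("my_job_group", "long running job", interruptOnCancel=True)

# ... trigger an action here, e.g. rdd.count(), typically from another thread ...

# From another cell or thread, cancel only the jobs in that group
sc.cancelJobGroup("my_job_group")

# Or cancel every active job without stopping the SparkContext
sc.cancelAllJobs()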



Source: https://stackoverflow.com/questions/43236330/kill-spark-job-programmatically
