Spark Kill Running Application

Backend · Unresolved · 5 replies · 555 views
北恋
北恋 2020-12-22 16:45

I have a running Spark application where it occupies all the cores, so my other applications won't be allocated any resources.

I did some quick research, and people suggested using a YARN kill command to stop the application. How do I do that?

5 answers
  •  予麋鹿 (OP)
     2020-12-22 17:27

    https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html#Cluster_Application_State_API

    PUT http://{rm http address:port}/ws/v1/cluster/apps/{appid}/state

    {
      "state":"KILLED"
    }
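
    A minimal sketch of issuing that PUT request from Python using only the standard library. The ResourceManager address and application id below are placeholders — substitute your own (the app id is the one shown by `yarn application -list`):

    ```python
    import json
    import urllib.request

    # Hypothetical values: replace with your ResourceManager host:port
    # and the id of the application you want to kill.
    RM_ADDRESS = "rm-host:8088"
    APP_ID = "application_1608600000000_0001"

    def build_kill_request(rm_address: str, app_id: str):
        """Build the URL and JSON body for YARN's Cluster Application State API."""
        url = f"http://{rm_address}/ws/v1/cluster/apps/{app_id}/state"
        body = json.dumps({"state": "KILLED"})
        return url, body

    url, body = build_kill_request(RM_ADDRESS, APP_ID)
    print(url)   # http://rm-host:8088/ws/v1/cluster/apps/application_1608600000000_0001/state
    print(body)  # {"state": "KILLED"}

    # To actually send it (requires network access to the RM; on a secured
    # cluster you would also need authentication):
    #   req = urllib.request.Request(
    #       url,
    #       data=body.encode(),
    #       headers={"Content-Type": "application/json"},
    #       method="PUT",
    #   )
    #   urllib.request.urlopen(req)
    ```

    The same request can be made with `curl -X PUT -H "Content-Type: application/json" -d '{"state":"KILLED"}' <url>`.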
    
