Spark Kill Running Application

北恋 2020-12-22 16:45

I have a running Spark application that occupies all the cores, so my other applications won't be allocated any resources.

I did some quick research and people …
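
A minimal sketch of the manual route, assuming the application runs on YARN (as the answers below do): list the running applications, note the ID of the one hogging the cores, and kill it by ID. The application ID below is a placeholder.

    # List running YARN applications to find the offending application ID
    yarn application -list -appStates RUNNING

    # Kill it by ID (placeholder application ID)
    yarn application -kill application_1608612345678_0001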

5 Answers
再見小時候 2020-12-22 17:27

    It can be time-consuming to fetch all the application IDs from YARN and kill them one by one. A Bash for loop handles this repetitive task quickly and efficiently, as shown below.

    Kill all applications on YARN which are in ACCEPTED state:

    for x in $(yarn application -list -appStates ACCEPTED | awk 'NR > 2 { print $1 }'); do yarn application -kill $x; done

    Kill all applications on YARN which are in RUNNING state:

    for x in $(yarn application -list -appStates RUNNING | awk 'NR > 2 { print $1 }'); do yarn application -kill $x; done
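
    If the cluster also runs non-Spark jobs, the same loop can be narrowed with the YARN CLI's application-type filter so that only Spark applications are killed (a sketch, assuming the -appTypes option is available in your Hadoop version):

    # Kill only RUNNING applications of type SPARK, leaving other YARN jobs untouched
    for x in $(yarn application -list -appTypes SPARK -appStates RUNNING | awk 'NR > 2 { print $1 }'); do yarn application -kill $x; done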
