Using Apache Oozie SSH actions to execute spark-submit, why is the Spark application stuck in the ACCEPTED state?

Submitted by 最后都变了 on 2019-12-11 10:46:17

Question


I am trying to run several Spark applications one after another, scheduling them with Oozie. I used an SSH action that should run spark-submit for my Spark application. When I run spark-submit directly on the server, the application starts running; however, when I use the Oozie SSH action to run spark-submit, I can see a new Spark application appear, but it stays stuck in the "ACCEPTED" state and never actually starts running.
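For comparison, the manual invocation on the server would look something like the sketch below, using the same arguments as the SSH action (this is a dry run via `echo`; removing the `echo` would actually submit the job):

```shell
#!/bin/sh
# Dry-run sketch of the manual spark-submit invocation; the arguments
# mirror the Oozie SSH action. Remove `echo` to submit for real.
echo spark-submit \
  --driver-memory 6G \
  --deploy-mode cluster \
  --class main.Application \
  --executor-memory 6G \
  --master yarn-cluster \
  duplicate.jar
```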

My SSH action looks like this:

<host>${user}@${host}</host>
<command>spark-submit</command>
<args>--driver-memory</args>
<args>6G</args>
<args>--deploy-mode</args>
<args>cluster</args>
<args>--class</args>
<args>main.Application</args>
<args>--executor-memory</args>
<args>6G</args>
<args>--master</args>
<args>yarn-cluster</args>
<args>duplicate.jar</args>
... [the jar's arguments]
<capture-output />
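For reference, a complete Oozie SSH action normally wraps these elements in an `<action>`/`<ssh>` pair with `ok`/`error` transitions; a minimal sketch is below (the action name and the `end`/`fail` transition targets are placeholders, not from the original workflow):

```xml
<action name="spark-submit-ssh">
    <ssh xmlns="uri:oozie:ssh-action:0.1">
        <host>${user}@${host}</host>
        <command>spark-submit</command>
        <args>--driver-memory</args>
        <args>6G</args>
        <args>--deploy-mode</args>
        <args>cluster</args>
        <args>--class</args>
        <args>main.Application</args>
        <args>--executor-memory</args>
        <args>6G</args>
        <args>--master</args>
        <args>yarn-cluster</args>
        <args>duplicate.jar</args>
        <capture-output/>
    </ssh>
    <ok to="end"/>
    <error to="fail"/>
</action>
```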

Am I using the oozie ssh action correctly?

Has anyone tried doing what I'm trying to do?

Does anybody have an idea why the application never starts running?

Source: https://stackoverflow.com/questions/29098841/using-apache-oozie-ssh-actions-to-execute-spark-submit-why-does-the-spark-appli
