Launching a Spark program using an Oozie workflow

Submitted on 2019-11-27 16:47:57

Question


I am working with a Scala program that uses Spark packages. Currently I run the program with a bash command from the gateway: /homes/spark/bin/spark-submit --master yarn-cluster --class "com.xxx.yyy.zzz" --driver-java-options "-Dyyy.num=5" a.jar arg1 arg2

I would like to start using Oozie to run this job, but I have a few questions:

Where should I put the spark-submit executable? On HDFS? How do I define the Spark action? Where should the --driver-java-options appear? What should the Oozie action look like? Is it similar to the one appearing here?


Answer 1:


If you have a new enough version of Oozie, you can use Oozie's native Spark action:

https://github.com/apache/oozie/blob/master/client/src/main/resources/spark-action-0.1.xsd
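With the native action, each piece of the spark-submit command line maps onto an XML element, and --driver-java-options goes inside spark-opts. A hedged sketch following the spark-action-0.1 schema linked above (the action name, jar path, and ${...} properties are placeholders you would define yourself in job.properties):

   <action name="spark-node">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${job_tracker}</job-tracker>
            <name-node>${name_node}</name-node>
            <master>yarn-cluster</master>
            <name>my-spark-job</name>
            <!-- the driver class, e.g. com.xxx.yyy.zzz -->
            <class>${spark_main_class}</class>
            <jar>${spark_app_file}</jar>
            <!-- this is where --driver-java-options belongs -->
            <spark-opts>--driver-java-options "-Dyyy.num=5"</spark-opts>
            <arg>${input}</arg>
            <arg>${output}</arg>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>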

Otherwise you need to execute a Java action that calls SparkSubmit directly. Something like:

   <java>
        <!-- job-tracker and name-node are required first children of a java action -->
        <job-tracker>${job_tracker}</job-tracker>
        <name-node>${name_node}</name-node>

        <main-class>org.apache.spark.deploy.SparkSubmit</main-class>

        <arg>--class</arg>
        <arg>${spark_main_class}</arg>            <!-- this is the class com.xxx.yyy.zzz -->

        <arg>--deploy-mode</arg>
        <arg>cluster</arg>

        <arg>--master</arg>
        <arg>yarn</arg>

        <arg>--queue</arg>
        <arg>${queue_name}</arg>                  <!-- depends on your Oozie config -->

        <arg>--num-executors</arg>
        <arg>${spark_num_executors}</arg>

        <arg>--executor-cores</arg>
        <arg>${spark_executor_cores}</arg>

        <arg>${spark_app_file}</arg>              <!-- jar that contains your Spark job, written in Scala -->

        <arg>${input}</arg>                       <!-- some arg -->
        <arg>${output}</arg>                      <!-- some other arg -->

        <file>${spark_app_file}</file>

        <file>${name_node}/user/spark/share/lib/spark-assembly.jar</file>
    </java>
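As for where things live: spark-submit itself does not need to be on HDFS, because Oozie launches the SparkSubmit class from the jars shipped with the action; your application jar and the spark-assembly jar, however, must be on HDFS at the paths the <file> elements reference. A hedged sketch of the workflow that would wrap the action above (the workflow name and node names are made up for illustration):

   <workflow-app name="spark-wf" xmlns="uri:oozie:workflow:0.4">
        <start to="spark-node"/>
        <action name="spark-node">
            <!-- the java action shown above goes here -->
            <ok to="end"/>
            <error to="fail"/>
        </action>
        <fail name="fail">
            <message>Spark job failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
        </fail>
        <end name="end"/>
    </workflow-app>

You would upload this workflow.xml to an HDFS application directory and point oozie.wf.application.path at it in job.properties, along with values for the ${...} properties used above.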


Source: https://stackoverflow.com/questions/29233487/launching-a-spark-program-using-oozie-workflow
