Unable to execute spark job using SparkSubmitOperator

Answered by 不想你离开 · Submitted 2019-12-03 07:33:53

You can either create a new connection using the Airflow Web UI or modify the existing spark_default connection.

The master can be local, yarn, spark://HOST:PORT, mesos://HOST:PORT, or k8s://https://<HOST>:<PORT>.

You can also supply the following options in the connection's Extra field:

{"queue": "root.default", "deploy_mode": "cluster", "spark_home": "", "spark_binary": "spark-submit", "namespace": "default"}
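As a quick sketch, the Extra payload above can be built and validated with plain json before pasting it into the connection form (the keys and values mirror the payload shown; spark_home is left empty as in the original):

```python
import json

# Extra options for an Airflow Spark connection, mirroring the payload above.
extra = {
    "queue": "root.default",         # scheduler queue to submit to
    "deploy_mode": "cluster",        # "client" or "cluster"
    "spark_home": "",                # empty: rely on PATH to find spark-submit
    "spark_binary": "spark-submit",  # binary name to invoke
    "namespace": "default",          # Kubernetes namespace (k8s master only)
}

# Serialize to the JSON string expected by the connection's Extra field.
extra_json = json.dumps(extra)
print(extra_json)
```

Round-tripping through json.dumps catches quoting mistakes (e.g. single quotes or trailing commas) that would make the Extra field unparseable.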

Either the "spark-submit" binary must be on the PATH, or spark_home must be set in the Extra field of the connection.
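Once the connection is configured, a task points at it through its conn_id. Below is a minimal DAG sketch (the dag_id, task_id, application path, and schedule are illustrative assumptions; the import path is the one used by the apache-spark provider package, while on Airflow 1.10 the operator lived under airflow.contrib.operators.spark_submit_operator):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="spark_submit_example",   # illustrative DAG id
    start_date=datetime(2019, 12, 1),
    schedule_interval=None,          # trigger manually
) as dag:
    submit_job = SparkSubmitOperator(
        task_id="submit_job",
        conn_id="spark_default",     # the connection configured above
        application="/path/to/app.py",  # hypothetical application path
    )
```

The operator reads the master, queue, deploy mode, and binary location from the connection, so the DAG file itself stays free of cluster-specific details.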
