How to deploy a Spark application jar file to a Kubernetes cluster?


Question


I am currently trying to deploy a spark example jar on a Kubernetes cluster running on IBM Cloud.

If I try to follow these instructions to deploy Spark on a Kubernetes cluster, I am not able to launch Spark Pi, because I always get the error message:

The system cannot find the file specified

after entering the code

bin/spark-submit \
    --master k8s://<url of my kubernetes cluster> \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=<spark-image> \
    local:///examples/jars/spark-examples_2.11-2.3.0.jar

I am in the right directory, and the spark-examples_2.11-2.3.0.jar file is in the examples/jars directory.


Answer 1:


Ensure your .jar file is present inside the container image.

The instructions tell you that it should be there:

Finally, notice that in the above example we specify a jar with a specific URI with a scheme of local://. This URI is the location of the example jar that is already in the Docker image.

In other words, the local:// scheme is stripped from local:///examples/jars/spark-examples_2.11-2.3.0.jar, and the path /examples/jars/spark-examples_2.11-2.3.0.jar is expected to exist inside the container image.
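One quick way to verify this, assuming Docker is available on your machine and <spark-image> is the same image you pass to spark.kubernetes.container.image, is to inspect that path inside the image:

    # local:// paths are resolved inside the container, not on the machine
    # running spark-submit, so check the image itself. --entrypoint bypasses
    # the image's Spark entrypoint script, if it has one.
    docker run --rm --entrypoint /bin/sh <spark-image> -c 'ls -l /examples/jars/'

    # If that directory does not exist, search the image for the examples
    # jar to find its actual in-container location:
    docker run --rm --entrypoint /bin/sh <spark-image> -c "find / -name 'spark-examples*.jar' 2>/dev/null"

If the second command reports the jar somewhere else, use that path in the local:// URI instead.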




Answer 2:


Please make sure the absolute path /examples/jars/spark-examples_2.11-2.3.0.jar exists.

Or, if you are trying to load a jar file from the current directory, it would have to be a relative path like local://./examples/jars/spark-examples_2.11-2.3.0.jar.

I'm not sure whether spark-submit accepts relative paths or not.
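For what it's worth, in images built with the docker-image-tool.sh script that ships with Spark, the examples jar typically ends up under /opt/spark/examples/jars/ rather than /examples/jars/. Assuming such an image, the submission would look like this sketch (the cluster URL and image name are placeholders, and the path should be adjusted to wherever the jar actually lives in your image):

    bin/spark-submit \
        --master k8s://<url of my kubernetes cluster> \
        --deploy-mode cluster \
        --name spark-pi \
        --class org.apache.spark.examples.SparkPi \
        --conf spark.executor.instances=5 \
        --conf spark.kubernetes.container.image=<spark-image> \
        local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar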



Source: https://stackoverflow.com/questions/50337296/how-to-deploy-spark-application-jar-file-to-kubernetes-cluster
