spark-submit to Cloudera cluster cannot find any dependent jars

Submitted by 北慕城南 on 2019-12-12 02:04:09

Question


I am able to do a spark-submit to my Cloudera cluster. The job dies after a few minutes with exceptions complaining that it cannot find various classes, all of which are on the Spark dependency path. I keep adding the jars one at a time with the --jars command-line argument, and the YARN log keeps reporting the next jar it can't find.
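For context, an invocation of the kind described above looks roughly like the sketch below; the class name, jar paths, and application jar are placeholders, and --jars takes a comma-separated list:

spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  --jars /opt/libs/dep1.jar,/opt/libs/dep2.jar \
  myapp.jar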

What setting allows the Spark/YARN job to find all the dependent jars?

I have already set the "spark.home" property to the correct path: /opt/cloudera/parcels/CDH/lib/spark
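For reference, the driver-side configuration described so far would look roughly like this in Scala (the app name and master URL are illustrative; SparkConf.setSparkHome is the programmatic counterpart of spark.home):

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative driver setup; "yarn-client" is the Spark 1.x-era
// master URL for submitting to YARN from a client process.
val conf = new SparkConf()
  .setAppName("MyApp")
  .setMaster("yarn-client")
  .setSparkHome("/opt/cloudera/parcels/CDH/lib/spark")
val sc = new SparkContext(conf)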


Answer 1:


I found it!

Remove

.set("spark.driver.host", "driver computer ip address")

from your driver code.
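In other words, if the driver configuration resembles the sketch below (all values illustrative), delete the spark.driver.host line. A hard-coded driver address can leave the YARN executors unable to reach the driver to fetch the jars shipped via --jars, which then surfaces as the missing-class errors described in the question:

import org.apache.spark.SparkConf

// Problematic: hard-coding the driver's address.
val broken = new SparkConf()
  .setAppName("MyApp")
  .set("spark.driver.host", "10.1.2.3") // executors may fail to reach this

// Fixed: omit spark.driver.host and let Spark resolve it.
val fixed = new SparkConf()
  .setAppName("MyApp")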



Source: https://stackoverflow.com/questions/25495661/spark-submit-to-cloudera-cluster-can-not-find-any-dependent-jars
