Unable to connect to Spark master

Nayan Hajratwala

It turns out that the problem was the Spark library version AND the Scala version. DataStax was running Spark 1.4.1 and Scala 2.10.5, while my Eclipse project was using 1.5.2 and 2.11.7 respectively.

Note that BOTH the Spark library version and the Scala version appear to have to match: I tried other combinations, but it only worked when both did.
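If you build with sbt, for example, a minimal build.sbt pinning both versions to what DataStax ships might look like this (1.4.1 and 2.10.5 are the versions from my cluster; substitute whatever yours reports):

// Scala version must match the one the cluster's Spark was built with.
scalaVersion := "2.10.5"

// Pin spark-core to exactly the version the master runs; %% appends the
// Scala binary version, yielding spark-core_2.10. "provided" keeps your
// jar from bundling a conflicting Spark copy at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1" % "provided"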

I am getting pretty familiar with this part of your posted error:

WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://...

It can have numerous causes, pretty much all of them related to misconfigured IPs. First I would do whatever zero323 says; then here are my two cents: I have recently solved my own problems by using IP addresses rather than hostnames, and the only config I use in a simple standalone cluster is SPARK_MASTER_IP.

Setting SPARK_MASTER_IP in $SPARK_HOME/conf/spark-env.sh on your master should then lead the master web UI to show the IP address you set:

spark://your.ip.address.numbers:7077
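Concretely, the line in spark-env.sh would look something like this (10.0.1.88 is just borrowed from your SparkUI log line below; use your master's actual address):

# $SPARK_HOME/conf/spark-env.sh on the master node
export SPARK_MASTER_IP=10.0.1.88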

And your SparkConf setup can refer to that.
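For instance, a minimal Scala sketch (the object and app names are placeholders, and the IP is again the example one):

import org.apache.spark.{SparkConf, SparkContext}

object ConnectToMaster {
  def main(args: Array[String]): Unit = {
    // Point the driver at the master by IP, exactly as the master web UI shows it.
    val conf = new SparkConf()
      .setAppName("my-app")                 // placeholder app name
      .setMaster("spark://10.0.1.88:7077")  // same IP:port the master web UI reports
    val sc = new SparkContext(conf)
    // ... your job ...
    sc.stop()
  }
}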

Having said that, I am not familiar with your specific implementation, but I notice in the error two occurrences containing:

/private/var/folders/pd/6rxlm2js10gg6xys5wm90qpm0000gn/T/

Have you looked there to see if there's a logs directory? Is that where $DSE_HOME points? Alternatively, connect to the driver where it creates its web UI:

INFO SparkUI: Started SparkUI at http://10.0.1.88:4040

and you should see a link to an error log there somewhere.

More on the IP vs. hostname thing: this very old bug is marked as Resolved, but I have not figured out what they mean by Resolved, so I just tend toward IP addresses.
