Question
I'm getting the following exception when I'm trying to submit a Spark application to a Mesos cluster:
17/01/31 17:04:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/31 17:04:22 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'mesos://localhost:5050'
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2550)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
Answer 1:
You probably built Spark with the wrong command, e.g., one that omits the -Pmesos profile. Since Spark 2.1.0, Mesos support is an optional build profile, so you should build with ./build/mvn -Pmesos -DskipTests clean package.
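
As a rough illustration (the example class and jar path below are assumptions for a Spark 2.1.0 / Scala 2.11 build, not taken from the question), the rebuild and a subsequent submit might look like this:

# Rebuild Spark with Mesos support; without -Pmesos the Mesos cluster
# manager is not compiled in, so mesos:// master URLs cannot be parsed.
./build/mvn -Pmesos -DskipTests clean package

# After rebuilding, submitting against the Mesos master should succeed, e.g.:
./bin/spark-submit \
  --master mesos://localhost:5050 \
  --class org.apache.spark.examples.SparkPi \
  examples/jars/spark-examples_2.11-2.1.0.jar 100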
Source: https://stackoverflow.com/questions/41968484/why-does-submitting-a-spark-application-to-mesos-fail-with-could-not-parse-mast