Question
I am trying to run a job via spark-submit.
The error that results from this job is:
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2625)
    at java.lang.Class.getMethod0(Class.java:2866)
    at java.lang.Class.getMethod(Class.java:1676)
    at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 6 more
Not sure if it matters, but I am trying to run this job within a Docker container on Mesos. Spark is 1.6.1, Mesos is 0.27.1, Python is 3.5, and Docker is 1.11.2. I am running in client mode.
Here is the gist of my spark-submit statement:
export SPARK_PRINT_LAUNCH_COMMAND=true
./spark-submit \
--master mesos://mesos-blahblahblah:port \
--conf spark.mesos.executor.docker.image=docker-registry:spark-docker-image \
--conf spark.mesos.executor.home=/usr/local/spark \
--conf spark.executorEnv.MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.dylib \
--conf spark.shuffle.service.enabled=true \
--jars ~/spark/lib/slf4j-simple-1.7.21.jar \
test.py
The gist of test.py is that it loads data from parquet, sorts it by a particular column, and then writes it back to parquet.
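For context, a minimal sketch of what test.py does; the paths and the sort column here are hypothetical placeholders, and it uses the Spark 1.6-era SQLContext API:

from pyspark import SparkContext
from pyspark.sql import SQLContext

# Spark 1.6-style entry points (SparkSession only arrived in 2.0)
sc = SparkContext(appName="parquet-sort-test")
sqlContext = SQLContext(sc)

# Hypothetical input/output paths and sort column
df = sqlContext.read.parquet("/data/input.parquet")
df.sort("some_column").write.parquet("/data/output.parquet")

sc.stop()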
I added the --jars line when I kept getting that error. (The error does not appear in my driver; I navigate through the Mesos framework to look at the stderr of each Mesos task to find it.)
I also tried adding --conf spark.executor.extraClassPath=http://some.ip:port/jars/slf4j-simple-1.7.21.jar, because I noticed that when I ran the spark-submit from above it would output:
INFO SparkContext: Added JAR file:~/spark/lib/slf4j-simple-1.7.21.jar at http://some.ip:port/jars/slf4j-simple-1.7.21.jar with timestamp 1472138630497
But the error is unchanged. Any ideas?
I found this link, which makes me think it is a bug, but the person hasn't posted any solution.
Answer 1:
So the Exception is correct: org/slf4j/Logger is not present in the slf4j-simple-1.7.21 jar that was passed; it contains only the binding classes under org/slf4j/impl:
└── org
└── slf4j
└── impl
├── SimpleLogger$1.class
├── SimpleLogger.class
├── SimpleLoggerFactory.class
├── StaticLoggerBinder.class
├── StaticMDCBinder.class
└── StaticMarkerBinder.class
Include the proper jar (try slf4j-api-1.7.21.jar, which is where org.slf4j.Logger actually lives).
(Hint: you can simply check the contents of a jar file by unzipping it.)
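For instance, since a jar is just a zip archive, a quick way to list its classes is with the Python standard library; the jar path here is an assumed location:

import zipfile

# A jar is a plain zip archive; list every compiled class inside it
with zipfile.ZipFile("slf4j-simple-1.7.21.jar") as jar:
    for name in jar.namelist():
        if name.endswith(".class"):
            print(name)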
Answer 2:
I had this exact same problem and was also trying to run Mesos/Spark/Python on Docker.
The thing that finally fixed it for me was to add the hadoop classpath output to the classpath of the Spark executors using the spark.executor.extraClassPath configuration option.
The full command I ran to get it to work was:
MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so \
${SPARK_HOME}/bin/pyspark \
--conf spark.master=mesos://mesos-master:5050 \
--driver-class-path $(${HADOOP_HOME}/bin/hadoop classpath) \
--conf spark.executor.extraClassPath=$(${HADOOP_HOME}/bin/hadoop classpath)
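For completeness, the executor half of that fix can also be set programmatically from a PySpark script. This is only a sketch, assuming HADOOP_HOME is set in the environment; the driver-side classpath still has to come from the command line, since the driver JVM is already running by the time Python code executes:

import os
import subprocess

from pyspark import SparkConf, SparkContext

# Capture the output of `hadoop classpath` (assumes ${HADOOP_HOME}/bin/hadoop exists)
hadoop_cp = subprocess.check_output(
    [os.path.join(os.environ["HADOOP_HOME"], "bin", "hadoop"), "classpath"]
).decode().strip()

# Prepend the Hadoop jars to every executor's classpath
conf = (SparkConf()
        .setMaster("mesos://mesos-master:5050")
        .set("spark.executor.extraClassPath", hadoop_cp))
sc = SparkContext(conf=conf)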
Source: https://stackoverflow.com/questions/39149551/missing-slf4j-logger-on-spark-workers