Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2

Submitted anonymously (unverified) on 2019-12-03 00:59:01

Question:

While compiling the Maven project, the following error occurred:

[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes...
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found.
[ERROR] with Logging {
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.992s
[INFO] Finished at: Fri Apr 15 17:44:33 CEST 2016
[INFO] Final Memory: 25M/350M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR]

I removed the property <useZincServer>true</useZincServer> from pom.xml, but the Logging error still persists.
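Two quick checks that may help narrow this down (a sketch; the port number comes from the warning in the log above, and nc is just one way to probe it):

# from the repository root: confirm the property is really gone from every pom
grep -rn "useZincServer" --include=pom.xml .

# the earlier warning only means nothing was listening on the default Zinc port;
# the build then falls back to the plugin's normal compile, so it is not the failure itself
nc -z localhost 3030 && echo "something is listening on 3030" || echo "no Zinc server on 3030"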

[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.10 ---
[INFO] Using incremental compilation
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes...
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found.
[ERROR] with Logging {
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.814s
[INFO] Finished at: Fri Apr 15 17:41:00 CEST 2016
[INFO] Final Memory: 25M/335M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.

I checked that PATH and JAVA_HOME are defined in ~/.bashrc as follows:

export PATH=$PATH:/usr/lib/jvm/java-7-openjdk-amd64/bin
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

The only issue that I noticed is that echo $JAVA_HOME gives an empty output, though I did source ~/.bashrc.
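A quick sanity check (a sketch, reusing the JDK path from the question): define JAVA_HOME before appending to PATH in ~/.bashrc, then verify in the same shell that runs Maven:

# in ~/.bashrc: set JAVA_HOME first, then reuse it for PATH
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH="$PATH:$JAVA_HOME/bin"

# in the shell that will run Maven
source ~/.bashrc
echo "$JAVA_HOME"   # should print the JDK path, not an empty line
mvn -version        # shows which JDK Maven actually picked up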

Any help is highly appreciated.

Answer 1:

It is strange that echo $JAVA_HOME gives empty output. While compiling the Spark source, I imported the project into Eclipse after a successful mvn clean package and ran into the same problem. I found the solution here: How to solve "Plugin execution not covered by lifecycle configuration" for Spring Data Maven Builds
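The linked answer boils down to telling Eclipse's m2e which Maven plugin executions it may ignore during its incremental build. A sketch of that usual pluginManagement workaround; the filter values are filled in for scala-maven-plugin here as an illustration, not copied from that answer, so adjust them to whichever plugin Eclipse complains about:

<build>
  <pluginManagement>
    <plugins>
      <!-- m2e-only setting: ignored by command-line Maven -->
      <plugin>
        <groupId>org.eclipse.m2e</groupId>
        <artifactId>lifecycle-mapping</artifactId>
        <version>1.0.0</version>
        <configuration>
          <lifecycleMappingMetadata>
            <pluginExecutions>
              <pluginExecution>
                <pluginExecutionFilter>
                  <groupId>net.alchim31.maven</groupId>
                  <artifactId>scala-maven-plugin</artifactId>
                  <versionRange>[3.2.2,)</versionRange>
                  <goals>
                    <goal>compile</goal>
                    <goal>testCompile</goal>
                  </goals>
                </pluginExecutionFilter>
                <!-- tell m2e to skip this execution inside Eclipse -->
                <action>
                  <ignore />
                </action>
              </pluginExecution>
            </pluginExecutions>
          </lifecycleMappingMetadata>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>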



Answer 2:

I think you were compiling Spark with Scala 2.10. If so, you should do as follows.

cd /path/to/Spark
./dev/change-scala-version.sh 2.10
./build/mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package
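A rough way to confirm the switch took effect before launching the long build (the property name and artifact suffix are as they appear in the Spark 1.x poms; adjust if your tree differs):

# from the Spark source root
grep -n "scala.binary.version" pom.xml        # should now report 2.10
grep -rn "_2.11" --include=pom.xml . | head   # ideally prints nothing after the switch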

Hope this helps.



Answer 3:

The problem could be this line in the build output: [INFO] Using incremental compilation

In your pom.xml, try removing the line
<recompileMode>incremental</recompileMode>
and then try again.
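For orientation, that line normally sits in the scala-maven-plugin's <configuration> block; a minimal sketch (the version and neighbouring settings are illustrative, not copied from the Spark pom):

<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.2.2</version>
  <configuration>
    <!-- remove this line to fall back to the plugin's default (full) recompile mode -->
    <recompileMode>incremental</recompileMode>
    <useZincServer>true</useZincServer>
  </configuration>
</plugin>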


