I get an InvocationTargetException when running my project with the Spark 1.3 library via Maven in IntelliJ.
I only hit this error inside the IntelliJ IDE; after I packaged the jar and ran it via spark-submit, the error went away.
Has anyone run into the same problem before? I'd like to fix it so I can debug easily; otherwise I have to package the jar every time I want to run the code.
Details are below:
2015-04-21 09:39:13 ERROR MetricsSystem:75 - Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
2015-04-21 09:39:13 ERROR TrainingSFERunner:144 - java.lang.reflect.InvocationTargetException
2015-04-20 16:08:44 INFO BlockManagerMaster:59 - Registered BlockManager
Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:187)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:181)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:181)
    at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:98)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:390)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at spark.mllibClassifier.JavaRandomForests.run(JavaRandomForests.java:105)
    at spark.mllibClassifier.SparkMLlibMain.runMain(SparkMLlibMain.java:263)
    at spark.mllibClassifier.JavaRandomForests.main(JavaRandomForests.java:221)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.module.SimpleSerializers.<init>(Ljava/util/List;)V
    at com.codahale.metrics.json.MetricsModule.setupModule(MetricsModule.java:223)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:469)
    at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:45)

My pom file is as follows.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>projects</groupId>
    <artifactId>project1</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.lucene</groupId>
            <artifactId>lucene-core</artifactId>
            <version>5.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.lucene</groupId>
            <artifactId>lucene-analyzers-common</artifactId>
            <version>5.0.0</version>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
            <version>2.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>colt</groupId>
            <artifactId>colt</artifactId>
            <version>1.2.0</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.2</version>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
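From the "Caused by" line, my guess is that an older jackson-databind ends up first on the runtime classpath in IntelliJ, since the SimpleSerializers(List) constructor that Spark's MetricsServlet needs (via metrics-json) is missing there, while the assembled jar apparently resolves to a compatible version. I plan to check which dependency pulls it in with mvn dependency:tree -Dincludes=com.fasterxml.jackson.core, and then try pinning Jackson to one version for the whole build. Below is only a sketch of what I have in mind; 2.4.4 is my assumption for the version Spark 1.3.0 expects and may need adjusting after looking at the dependency tree.

    <!-- Sketch: force a single Jackson version across all transitive dependencies. -->
    <!-- 2.4.4 is an assumed version matching Spark 1.3.0; verify with mvn dependency:tree. -->
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>com.fasterxml.jackson.core</groupId>
                <artifactId>jackson-databind</artifactId>
                <version>2.4.4</version>
            </dependency>
            <dependency>
                <groupId>com.fasterxml.jackson.core</groupId>
                <artifactId>jackson-core</artifactId>
                <version>2.4.4</version>
            </dependency>
        </dependencies>
    </dependencyManagement>

If the dependency tree instead shows some other artifact bringing in the older jackson-databind, adding an <exclusions> entry on that dependency would be the alternative. Is this the right way to go, or is there a better fix for running inside IntelliJ?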