The following exception occurs when running a Spark unit test that requires snappy compression:
java.lang.reflect.InvocationTargetException
at sun.
Faced this problem with a clean standalone installation of Spark 1.6.1. To solve it I had to:
1) manually add libsnappyjava.jnilib (it's inside the jar) to a directory on java.library.path (which includes multiple locations; ~/Library/Java/Extensions/ is fine)
2) add snappy-java-1.1.2.4.jar to Spark's classpath (in spark-env.sh add "export SPARK_CLASSPATH=.../snappy-java-1.1.2.4.jar")
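The two steps above can be sketched in spark-env.sh like this (the jar and library paths below are assumptions for a local setup; adjust them to wherever the snappy-java jar and the extracted libsnappyjava.jnilib live on your machine):

```shell
# spark-env.sh -- example only; paths are assumptions for a local OS X setup

# step 2: put the snappy-java jar on Spark's classpath
export SPARK_CLASSPATH="$HOME/jars/snappy-java-1.1.2.4.jar"

# step 1: make sure java.library.path covers the directory holding
# libsnappyjava.jnilib (extracted from the jar, e.g. into ~/Library/Java/Extensions/)
export SPARK_SUBMIT_OPTS="-Djava.library.path=$HOME/Library/Java/Extensions"
```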
Another solution is to upgrade your version of snappy-java. The problem exists in 1.0.4.1 and was fixed in 1.0.5. Adding an exclusion to the Spark dependency like this:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.xerial.snappy</groupId>
            <artifactId>snappy-java</artifactId>
        </exclusion>
    </exclusions>
</dependency>
and then adding
<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.0.5</version>
</dependency>
did it for me.
The way to handle this is to update the IntelliJ run configuration. Add the following to the JVM parameters:
-Dorg.xerial.snappy.lib.name=libsnappyjava.jnilib -Dorg.xerial.snappy.tempdir=/tmp
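If you cannot edit the run configuration (for example, when the tests are launched by a build tool rather than IntelliJ), the same system properties can be set programmatically before any snappy-java class loads, since the native library is resolved in a static initializer. A minimal sketch; the property names are the same ones used in the JVM flags above:

```java
public class SnappyProps {
    public static void main(String[] args) {
        // Must run before org.xerial.snappy.Snappy is first touched,
        // because the native library is loaded in a static initializer.
        System.setProperty("org.xerial.snappy.lib.name", "libsnappyjava.jnilib");
        System.setProperty("org.xerial.snappy.tempdir", "/tmp");

        System.out.println(System.getProperty("org.xerial.snappy.lib.name"));
        System.out.println(System.getProperty("org.xerial.snappy.tempdir"));
    }
}
```

A convenient place for this is a static block in the test base class, so it runs before Spark initializes its compression codecs.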
I experienced the same error. The version of spark-core was 1.3.0-cdh5.4.3; once I changed it to 1.3.0, the error went away.
Note that the dependency is "provided", so it doesn't matter in production; it only affects the development machine.
Edit: I found a more reasonable solution. The problem results from a bug in the snappy-java native library on OS X, so to resolve it you can add this to your pom file:
<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.1.2</version>
    <type>jar</type>
    <scope>provided</scope>
</dependency>