Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use

野性不改 2021-01-18 10:07

I am running a Spark job on EMR and using the DataStax connector to connect to a Cassandra cluster. I am facing issues with the Guava jar; please find the details below. I am using

6 Answers
  •  佛祖请我去吃肉
    2021-01-18 10:22

    Thanks, Adrian, for your response.

    I am on a slightly different architecture than everybody else in the thread, but the Guava problem is the same. I am using Spark 2.2 with Mesosphere. In our development environment we use sbt-native-packager to produce the Docker images we pass to Mesos.

    It turns out we needed a different Guava for the spark-submit executors than for the code that runs on the driver. This is what worked for me.

    build.sbt

    ....
    libraryDependencies ++= Seq(
      // force the driver-side (fat jar) Guava to 19.0
      ("com.google.guava" % "guava" % "19.0").force(),
      "org.apache.hadoop" % "hadoop-aws" % "2.7.3" excludeAll (
        ExclusionRule(organization = "org.apache.hadoop", name = "hadoop-common"), // this one is for s3a
        ExclusionRule(organization = "com.google.guava", name = "guava")),
      "org.apache.spark" %% "spark-core" % "2.1.0" excludeAll (
        ExclusionRule(organization = "org.glassfish.jersey.bundles.repackaged", name = "jersey-guava"),
        ExclusionRule(organization = "com.google.guava", name = "guava")),
      "com.github.scopt" %% "scopt" % "3.7.0" excludeAll (
        ExclusionRule(organization = "org.glassfish.jersey.bundles.repackaged", name = "jersey-guava"),
        ExclusionRule(organization = "com.google.guava", name = "guava")),
      "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.6",
    ...
    dockerCommands ++= Seq(
    ...
      // drop the Guava 14 bundled with the Spark dist and pull in Guava 23 for the executors
      Cmd("RUN", "rm /opt/spark/dist/jars/guava-14.0.1.jar"),
      Cmd("RUN", "wget -q http://central.maven.org/maven2/com/google/guava/guava/23.0/guava-23.0.jar -O /opt/spark/dist/jars/guava-23.0.jar")
    ...
    

    When I tried to replace Guava 14 on the executors with Guava 16.0.1 or 19, it still wouldn't work; spark-submit just died. The fat jar carries the Guava my application actually uses on the driver, and I forced that to 19, but on the spark-submit executors I had to replace it with 23. I did try 16 and 19 there too, and Spark died in both cases.
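
    To see which Guava actually wins on each side, a quick classpath check helps. A minimal sketch, not from this thread: it assumes a SparkContext named sc is in scope (e.g. in spark-shell) and simply asks the JVM where the Guava classes were loaded from.

    import com.google.common.collect.ImmutableList

    // which jar did the driver load Guava from?
    val driverGuava = classOf[ImmutableList[_]].getProtectionDomain.getCodeSource.getLocation
    println(s"Driver Guava: $driverGuava")

    // ask the executors the same question: where did their Guava class come from?
    sc.parallelize(1 to 100)
      .map(_ => classOf[ImmutableList[_]].getProtectionDomain.getCodeSource.getLocation.toString)
      .distinct()
      .collect()
      .foreach(loc => println(s"Executor Guava: $loc"))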

    Sorry for digressing, but after all my Google searches this thread came up every time. I hope this helps other SBT/Mesos folks too.
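
    If juggling two Guava versions gets too painful, another route often used for exactly this kind of conflict is shading Guava inside the fat jar with sbt-assembly, so the driver code carries its own relocated copy and never collides with whatever Spark ships. A minimal sketch, not from this thread; the plugin version and the rename prefix are assumptions:

    // project/plugins.sbt
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

    // build.sbt: relocate Guava classes inside the assembly so the
    // application's Guava 19 never meets the Guava 14 in the Spark dist
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.common.**" -> "shadedguava.@1").inAll
    )

    The inAll scope rewrites both the Guava classes themselves and every bytecode reference to them across the jars that end up in the assembly, so dependencies like the Cassandra connector are repointed at the relocated copy.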
