How to resolve Guava dependency issue while submitting Uber Jar to Google Dataproc

傲寒 2020-12-12 02:25

I am using the Maven Shade plugin to build an uber jar to submit as a job to a Google Dataproc cluster. Google has installed Apache Spark 2.0.2 and Apache Hadoop 2.7.3 on their clusters.
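Before changing the build, it can help to confirm which jar actually supplies Guava on the cluster's classpath. Below is a small stdlib-only diagnostic sketch (the class name `WhichJar` and the probe choices are illustrative, not part of the original question); running its `locate` check inside a submitted job shows where a class was loaded from:

```java
public class WhichJar {
    // Returns the location a class was loaded from, or a short
    // explanation when it has no file-based code source.
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            return src == null ? "bootstrap classloader" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // On a Dataproc worker this typically points at the
        // Hadoop-provided Guava jar rather than the one you built against.
        System.out.println(locate("com.google.common.base.Preconditions"));
        System.out.println(locate("java.util.ArrayList")); // sanity check: JDK core class
    }
}
```

Seeing a Hadoop- or Spark-owned path here (instead of your own dependency) is the usual sign that the cluster's Guava is shadowing yours.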

1 Answer
    被撕碎了的回忆 · 2020-12-12 02:51

    Edited: See https://cloud.google.com/blog/products/data-analytics/managing-java-dependencies-apache-spark-applications-cloud-dataproc for a fully worked example for Maven and SBT.

    Original Answer: When I build uber jars to run on Hadoop / Spark / Dataproc, I often use whichever version of Guava suits my needs and then apply a shade relocation, which allows the different versions to co-exist without issue:

    
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <artifactSet>
              <includes>
                <include>com.google.guava:*</include>
              </includes>
            </artifactSet>
            <minimizeJar>false</minimizeJar>
            <relocations>
              <relocation>
                <pattern>com.google.common</pattern>
                <shadedPattern>repackaged.com.google.common</shadedPattern>
              </relocation>
            </relocations>
            <shadedArtifactAttached>true</shadedArtifactAttached>
          </configuration>
        </execution>
      </executions>
    </plugin>
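    After the shade build runs, you can verify the relocation took effect by listing the shaded jar's entries and checking for the `repackaged/com/google/common/` prefix. A hedged sketch using only `java.util.jar` (the class name `CheckRelocation` and the demo jar are mine; the `main` method fabricates a stand-in jar purely to demonstrate the check — point `hasRelocatedGuava` at your real shaded artifact instead):

```java
import java.io.FileOutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class CheckRelocation {
    // True if the jar contains class entries under the relocated Guava package.
    static boolean hasRelocatedGuava(Path jarPath) throws Exception {
        try (JarFile jar = new JarFile(jarPath.toFile())) {
            return jar.stream()
                .anyMatch(e -> e.getName().startsWith("repackaged/com/google/common/"));
        }
    }

    public static void main(String[] args) throws Exception {
        // Demo only: build a stand-in jar with one relocated entry, then check it.
        // In practice, pass the path of your shaded artifact from target/.
        Path demo = Files.createTempFile("shaded-demo", ".jar");
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(demo.toFile()))) {
            out.putNextEntry(new JarEntry("repackaged/com/google/common/base/Preconditions.class"));
            out.closeEntry();
        }
        System.out.println("relocated Guava present: " + hasRelocatedGuava(demo));
    }
}
```

    If the check reports the relocated prefix is present (and `com/google/common/` entries are gone), the Hadoop-provided Guava on the cluster can no longer clash with the copy inside your jar.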
