Question
I keep getting a Spark library conflict with the Cosmos DB libraries and I am unable to resolve it. Please help.
build.sbt
name := "myApp"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.3.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.3.0",
  "org.apache.spark" % "spark-streaming_2.11" % "2.3.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.3.0",
  "com.microsoft.azure" % "azure-storage" % "2.0.0",
  "org.apache.hadoop" % "hadoop-azure" % "2.7.3",
  "com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.0.0",
  "com.microsoft.azure" % "azure-documentdb" % "1.14.2",
  "com.microsoft.azure" % "azure-documentdb-rx" % "0.9.0-rc2",
  "io.reactivex" % "rxjava" % "1.3.0",
  "io.reactivex" % "rxnetty" % "0.4.20",
  "org.json" % "json" % "20140107",
  "org.jmockit" % "jmockit" % "1.34" % "test"
)
The errors I'm getting when sbt resolves the dependencies are:
[warn] Run 'evicted' to see detailed eviction warnings
[error] Modules were resolved with conflicting cross-version suffixes in
[error] org.apache.spark:spark-launcher _2.11, _2.10
[error] org.json4s:json4s-ast _2.11, _2.10
[error] org.apache.spark:spark-network-shuffle _2.11, _2.10
[error] com.twitter:chill _2.11, _2.10
[error] org.json4s:json4s-jackson _2.11, _2.10
[error] com.fasterxml.jackson.module:jackson-module-scala _2.11, _2.10
[error] org.json4s:json4s-core _2.11, _2.10
[error] org.apache.spark:spark-unsafe _2.11, _2.10
[error] org.apache.spark:spark-core _2.11, _2.10
[error] org.apache.spark:spark-network-common _2.11, _2.10
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-core, org.apache.spark:spark-network-common
Thanks
Answer 1:
I have come across this issue. You probably have to use Scala 2.10 with the above in order to avoid the conflict.
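Whichever Scala line you settle on, the underlying requirement is that every artifact on the classpath ends up with the same cross-version suffix. One general way to keep the Spark modules aligned in sbt is the %% operator, which derives the suffix from scalaVersion instead of hard-coding _2.11 into each artifact name. Below is a minimal sketch of that idea, not a verified fix: the version numbers are the ones from the question, and their mutual compatibility is assumed rather than checked.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // %% appends the cross-version suffix (_2.11 here) taken from scalaVersion,
  // so the Spark modules cannot drift onto a different suffix by hand
  "org.apache.spark" %% "spark-core"      % "2.3.0",
  "org.apache.spark" %% "spark-sql"       % "2.3.0",
  "org.apache.spark" %% "spark-streaming" % "2.3.0",
  "org.apache.spark" %% "spark-mllib"     % "2.3.0",
  // the connector artifact name encodes both a Spark version (2.2.0) and a
  // Scala version (2.11); it has to agree with scalaVersion above
  "com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.0.0"
  // ... remaining plain-Java dependencies unchanged, each with a single %
)
Running the evicted task (sbt evicted), as the warning in the log suggests, shows which dependency is dragging in the _2.10 artifacts and therefore which library needs to be replaced or excluded.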
Source: https://stackoverflow.com/questions/49353714/spark-libraries-conflect-when-cosmosdb-lib