Spark-HBase - GCP template (1/3) - How to locally package the Hortonworks connector?

Submitted by 此生再无相见时 on 2021-02-17 06:30:36

Question


I'm trying to test the Spark-HBase connector in the GCP context. I followed [1], which requires locally packaging the connector [2] with Maven (I tried Maven 3.6.3) for Spark 2.4, and ran into the following issue.

Error when building branch-2.4:

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project shc-core: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: NullPointerException -> [Help 1]

References

[1] https://github.com/GoogleCloudPlatform/cloud-bigtable-examples/tree/master/scala/bigtable-shc

[2] https://github.com/hortonworks-spark/shc/tree/branch-2.4


Answer 1:


As suggested in the comments (thanks @Ismail!), building the connector with Java 8 works:

sdk use java 8.0.275-zulu

mvn clean package -DskipTests
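Since the build only fails on newer JDKs (the scala-maven-plugin 3.2.2 NullPointerException above), a small guard before invoking Maven can save a confusing stack trace. This is an illustrative sketch; the java_major_version helper is my own, not part of the template:

```shell
# Extract the major version from a Java version string.
# Handles both the legacy "1.8.0_275" scheme (Java 8) and the
# modern "11.0.9" scheme (Java 9+).
java_major_version() {
  ver="$1"
  case "$ver" in
    1.*) echo "${ver#1.}" | cut -d. -f1 ;;  # "1.8.0_275" -> "8"
    *)   echo "$ver" | cut -d. -f1 ;;       # "11.0.9"    -> "11"
  esac
}

java_major_version "1.8.0_275"   # prints 8
java_major_version "11.0.9"      # prints 11
```

In a build script, one could parse `java -version 2>&1` with this helper and abort with a clear message unless the major version is 8, before running `mvn clean package -DskipTests`.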

One can then import the jar in Dependencies.scala of the GCP template as follows.

...
val shcCore = "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11" from "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"
...
// shcCore % (shcVersionPrefix + scalaBinaryVersion) excludeAll(
shcCore excludeAll(
...
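For context, a self-contained sketch of how such a file-based dependency with exclusions might look in Dependencies.scala. The exclusion rules shown are illustrative assumptions (artifacts typically provided by the cluster runtime), not taken from the template, and the jar path placeholder must be replaced with your actual path:

```scala
// Dependencies.scala (sketch; exclusion rules are illustrative assumptions)
import sbt._

object Dependencies {
  // Load the locally built shc-core jar directly from disk instead of a
  // remote repository.
  val shcCore = "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11" from
    "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"

  // Exclude transitive dependencies already provided by the Spark/Hadoop
  // runtime (hypothetical examples -- adjust to your environment).
  val shc = shcCore excludeAll (
    ExclusionRule(organization = "org.apache.hadoop"),
    ExclusionRule(organization = "org.apache.spark")
  )
}
```

Note that `from` only overrides where the artifact is fetched; the module coordinates still drive dependency resolution, which is why the exclusions remain useful.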


Source: https://stackoverflow.com/questions/65429730/spark-hbase-gcp-template-1-3-how-to-locally-package-the-hortonworks-connec
