EMR 5.21, Spark 2.4 - json4s dependency broken

Submitted by 霸气de小男生 on 2021-02-09 20:45:01

Issue

In EMR 5.21, the Spark-HBase integration is broken: df.write.options().format().save() fails.
The cause is json4s-jackson 3.5.3, which ships with Spark 2.4 on EMR 5.21.
The same code works fine on EMR 5.11.2 (Spark 2.2, json4s-jackson 3.2.11).
The problem is that this is EMR, so I can't rebuild Spark with a lower json4s version.
Is there any workaround?

Error

py4j.protocol.Py4JJavaError: An error occurred while calling o104.save. : java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.parse(Lorg/json4s/JsonInput;Z)Lorg/json4s/JsonAST$JValue;

Submission

spark-submit --master yarn \
--jars /usr/lib/hbase/  \
--packages com.hortonworks:shc-core:1.1.3-2.3-s_2.11 \
--repositories http://repo.hortonworks.com/content/groups/public/  \
pysparkhbase_V1.1.py s3://<bucket>/ <Namespace> <Table> <cf> <Key>

Code

import sys
from pyspark.sql.functions import concat
from pyspark import SparkContext
from pyspark.sql import SQLContext,SparkSession
spark = SparkSession.builder.master("yarn").appName("PysparkHbaseConnection").config("spark.some.config.option", "PyHbase").getOrCreate()
spark.sql("set spark.sql.parquet.compression.codec=uncompressed")
spark.conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
data_source_format = 'org.apache.spark.sql.execution.datasources.hbase'
df = spark.read.parquet(file)
df.createOrReplaceTempView("view")
...
cat = '{"table":{"namespace":"' + namespace + '", "name":"' + name + '", "tableCoder":"' + tableCoder + '", "version":"' + version + '"}, "rowkey":"' + rowkey + '", "columns":{'
...
df.write.options(catalog=cat).format(data_source_format).save()
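As a side note, the catalog string above is much easier to get right if it is built with json.dumps instead of string concatenation, which avoids quoting and escaping mistakes. A minimal sketch (build_catalog, and the namespace/table/column names below, are hypothetical placeholders, not values from the original question):

```python
import json

def build_catalog(namespace, name, rowkey, columns,
                  table_coder="PrimitiveType", version="2.0"):
    """Build an SHC catalog JSON string from a plain dict.

    Using json.dumps guarantees the result is valid JSON, unlike
    hand-assembled string concatenation.
    """
    return json.dumps({
        "table": {"namespace": namespace, "name": name,
                  "tableCoder": table_coder, "version": version},
        "rowkey": rowkey,
        "columns": columns,
    })

# Hypothetical example schema: one rowkey column and one data column.
cat = build_catalog(
    "default", "mytable", "key",
    {
        "key":  {"cf": "rowkey", "col": "key",  "type": "string"},
        "col1": {"cf": "cf",     "col": "col1", "type": "string"},
    },
)
```

The resulting cat string can be passed to df.write.options(catalog=cat) exactly as in the code above.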

Answer 1:


There's no obvious answer. A quick check of the SHC POM doesn't show a direct dependency on json4s, so you can't just change that POM and rebuild the artifact yourself.

You're going to have to talk to the EMR team to get them to build the connector and HBase in sync.

FWIW, keeping Jackson in sync is one of the stress points of releasing a big-data stack, and the AWS SDK's habit of updating its requirements on its fortnightly releases is another. Hadoop moved to the shaded AWS SDK purely to stop AWS engineering decisions from dictating the choices for everyone else.




Answer 2:


Downgrading json4s to 3.2.10 resolves it, but I think this is an SHC bug that needs to be fixed upstream.
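On EMR, where Spark can't be rebuilt, one way to attempt this downgrade is to ship the older json4s artifact with the job and ask Spark to prefer user jars over its bundled ones. This is only a sketch under assumptions: it uses the 3.2.11 build (the version the question reports working on EMR 5.11.2), and userClassPathFirst can itself cause classpath conflicts, so whether it works depends on the EMR image:

```shell
# Sketch: pull SHC plus an older json4s-jackson via --packages, and set
# userClassPathFirst so the user-supplied json4s shadows Spark's 3.5.3.
# Untested on EMR; placeholders (<bucket>, <Namespace>, ...) are from
# the original question and must be filled in.
spark-submit --master yarn \
  --packages com.hortonworks:shc-core:1.1.3-2.3-s_2.11,org.json4s:json4s-jackson_2.11:3.2.11 \
  --repositories http://repo.hortonworks.com/content/groups/public/ \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  pysparkhbase_V1.1.py s3://<bucket>/ <Namespace> <Table> <cf> <Key>
```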



Source: https://stackoverflow.com/questions/55070647/emr-5-21-spark-2-4-json4s-dependency-broken
