Class com.hadoop.compression.lzo.LzoCodec not found for Spark on CDH 5?

Asked by 别那么骄傲, 2020-12-05 20:32

I have been working on this problem for two days and still have not found a solution.

Problem: Our Spark installation, set up via the newest CDH 5, always complains that the class com.hadoop.compression.lzo.LzoCodec cannot be found.

3 Answers
  •  Answered by 一生所求, 2020-12-05 20:50

    For Hortonworks 2.3.0 with Ambari, to get Spark working with LZO you need to add custom spark-defaults properties. I added:

    • spark.driver.extraClassPath /usr/hdp/current/hadoop-client/lib/hadoop-lzo-0.6.0.{{hdp_full_version}}.jar
    • spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64

    This is based on the HDP 2.3.0 documentation page on upgrading Spark 2.2 (that page has some typos).
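
    For reference, here is a minimal sketch of how those two properties could be set directly in spark-defaults.conf, plus an equivalent spark-submit invocation. The {{hdp_full_version}} macro is expanded by Ambari to the installed HDP build; the concrete jar version, application class, and application jar in the spark-submit example are hypothetical placeholders.

        # spark-defaults.conf (sketch; Ambari expands {{hdp_full_version}} to the installed HDP build)
        spark.driver.extraClassPath    /usr/hdp/current/hadoop-client/lib/hadoop-lzo-0.6.0.{{hdp_full_version}}.jar
        spark.driver.extraLibraryPath  /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64

        # Equivalent one-off submission (jar version, app class, and app jar are illustrative placeholders)
        spark-submit \
          --conf spark.driver.extraClassPath=/usr/hdp/current/hadoop-client/lib/hadoop-lzo-0.6.0.2.3.0.0-2557.jar \
          --conf spark.driver.extraLibraryPath=/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64 \
          --class com.example.MyApp my-app.jar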
