Spark without Hadoop: Failed to Launch

温柔的废话 2020-12-10 13:03

I'm running Spark 2.1.0, Hive 2.1.1 and Hadoop 2.7.3 on Ubuntu 16.04.

I downloaded the Spark project from GitHub and built the "without hadoop" version:

1 Answer
  • 2020-12-10 14:04

    “Hadoop free” builds need to modify SPARK_DIST_CLASSPATH to include Hadoop’s package jars.

    The most convenient place to do this is by adding an entry in conf/spark-env.sh:

    export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)  
    

    See the Spark docs: https://spark.apache.org/docs/latest/hadoop-provided.html
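    As a concrete sketch, a minimal conf/spark-env.sh for a "Hadoop free" build might look like this (the /usr/local/hadoop install path is an assumption; substitute wherever your Hadoop 2.7.3 actually lives):

    ```shell
    # conf/spark-env.sh -- sketch for a "Hadoop free" Spark build.
    # /usr/local/hadoop is a hypothetical install path; adjust to your system.
    export HADOOP_HOME=/usr/local/hadoop

    # `hadoop classpath` prints the directories and jars Hadoop ships, so the
    # "without hadoop" Spark build can find the Hadoop classes it leaves out.
    export SPARK_DIST_CLASSPATH=$("${HADOOP_HOME}/bin/hadoop" classpath)
    ```

    After saving, relaunch spark-shell (or spark-submit); if startup still fails, run `echo $SPARK_DIST_CLASSPATH` to confirm the Hadoop jars really ended up on the classpath.
    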
