Hadoop HADOOP_CLASSPATH issues

南旧 2020-12-30 12:31

This question doesn't refer to distributing jars across the whole cluster for the workers to use them.

It refers to specifying a number of additional libraries on the classpath of the hadoop command-line client itself, for example when running hadoop fs.

3 Answers
  • 2020-12-30 13:09

    Try adding your jar file to the default CLASSPATH variable and also appending HADOOP_CLASSPATH to it, then execute your command:

    export CLASSPATH=/your/jar/file/myjar.jar:$CLASSPATH:$HADOOP_CLASSPATH
    /path/to/hadoop/script fs -text /path/in/HDFS/to/my/file
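
    Equivalently (a minimal variant, assuming a POSIX-style shell), the variable can be set for that single invocation only instead of being exported for the whole session:

    # a VAR=value prefix applies only to this one command's environment
    CLASSPATH=/your/jar/file/myjar.jar:$CLASSPATH:$HADOOP_CLASSPATH /path/to/hadoop/script fs -text /path/in/HDFS/to/my/file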

  • 2020-12-30 13:12

    If you want to check the Hadoop classpath, run hadoop classpath in a terminal.
    To compile against it, use: javac -cp "$(hadoop classpath):path/to/jars/*" java_file.java (the quotes keep the shell from expanding the wildcards; javac expands them as classpath wildcards).
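
    A follow-up sketch, assuming java_file.java declares a hypothetical public class named MyTool: the same $(hadoop classpath) expansion can be reused to run the compiled class.

    # MyTool is a placeholder for whatever public class java_file.java declares;
    # "." adds the current directory so the freshly compiled .class file is found
    java -cp "$(hadoop classpath):path/to/jars/*:." MyTool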

  • 2020-12-30 13:16

    If you are allowed to set HADOOP_CLASSPATH, then

    export HADOOP_CLASSPATH=/path/to/jar/myjar.jar:$HADOOP_CLASSPATH; \
        hadoop fs -text /path/in/HDFS/to/my/file
    

    will do the job. Since in your case this variable is overridden in hadoop-env.sh, consider using the -libjars option instead:

    hadoop fs -libjars /path/to/jar/myjar.jar -text /path/in/HDFS/to/my/file
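
    If you can edit hadoop-env.sh itself, another option (a minimal sketch, assuming the usual shell-script form of that file) is to make its assignment append to whatever the caller already exported instead of overwriting it:

    # in hadoop-env.sh: append to any value already exported by the caller rather than
    # replacing it, so "export HADOOP_CLASSPATH=...; hadoop fs ..." keeps working
    export HADOOP_CLASSPATH=/path/to/jar/myjar.jar:$HADOOP_CLASSPATH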
    

    Alternatively, invoke FsShell manually:

    java -cp "$HADOOP_HOME/lib/*:/path/to/jar/myjar.jar:$CLASSPATH" \
    org.apache.hadoop.fs.FsShell -conf $HADOOP_HOME/conf/core-site.xml \
    -text /path/in/HDFS/to/my/file
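
    For the HADOOP_CLASSPATH variant, a quick sanity check (assuming the standard hadoop classpath subcommand is available) is to confirm that the jar shows up in the classpath the hadoop wrapper script builds:

    # print each classpath entry on its own line and look for the jar added above
    hadoop classpath | tr ':' '\n' | grep myjar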
    