Hadoop HADOOP_CLASSPATH issues


Question


This question doesn't refer to distributing jars across the whole cluster for the workers to use.

It refers to specifying a number of additional libraries on the client machine. To be more specific: I'm trying to run the following command in order to retrieve the contents of a SequenceFile:

   /path/to/hadoop/script fs -text /path/in/HDFS/to/my/file

It throws this error: text: java.io.IOException: WritableName can't load class: util.io.DoubleArrayWritable

I have a Writable class called DoubleArrayWritable. In fact, on another computer everything works well.

I tried setting HADOOP_CLASSPATH to include the jar containing that class, but with no result. Actually, when running:

   /path/to/hadoop/script classpath 

the output doesn't contain the jar I added to HADOOP_CLASSPATH.
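For reference, one way to double-check both sides of this (the jar name myjar.jar and the util.io package path are just taken from the error above, so adjust as needed):

   # print the effective classpath one entry per line and look for the jar
   /path/to/hadoop/script classpath | tr ':' '\n' | grep myjar
   # confirm the class is actually inside the jar
   jar tf /path/to/jar/myjar.jar | grep util/io/DoubleArrayWritable.class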

The question is: how do you specify extra libraries when running hadoop ("extra" meaning libraries other than the ones the hadoop script automatically includes in the classpath)?

Some more info which might help:

  • I can't modify the hadoop.sh script (nor any associated scripts)
  • I can't copy my library to the /lib directory under the hadoop installation directory
  • In hadoop-env.sh, which is run from hadoop.sh, there is this line: export HADOOP_CLASSPATH=$HADOOP_HOME/lib, which probably explains why my HADOOP_CLASSPATH env var is ignored (see the sketch just below).
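To make the last point concrete, a minimal sketch of the difference (the second form is what hadoop-env.sh would have to contain for an externally set HADOOP_CLASSPATH to survive; I can't make that change here):

   # what hadoop-env.sh does now: it overwrites whatever was exported beforehand
   export HADOOP_CLASSPATH=$HADOOP_HOME/lib
   # what it would need to do instead: append, so a pre-set value is preserved
   export HADOOP_CLASSPATH=$HADOOP_HOME/lib:$HADOOP_CLASSPATH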

Answer 1:


If you are allowed to set HADOOP_CLASSPATH then

export HADOOP_CLASSPATH=/path/to/jar/myjar.jar:$HADOOP_CLASSPATH; \
    hadoop fs -text /path/in/HDFS/to/my/file

will do the job. Since in your case this variable is overridden in hadoop-env.sh, consider using the -libjars option instead:

hadoop fs -libjars /path/to/jar/myjar.jar -text /path/in/HDFS/to/my/file

Alternatively, invoke FsShell manually:

java -cp $HADOOP_HOME/lib/*:/path/to/jar/myjar.jar:$CLASSPATH \
org.apache.hadoop.fs.FsShell -conf $HADOOP_HOME/conf/core-site.xml \
-text /path/in/HDFS/to/my/file
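
A variant of the same idea, assuming the classpath subcommand works on your installation: let the script print its base classpath (which already includes the conf directory) and append your jar to it:

java -cp "$(hadoop classpath):/path/to/jar/myjar.jar" \
org.apache.hadoop.fs.FsShell -text /path/in/HDFS/to/my/file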



Answer 2:


If you want to check the hadoop classpath, enter hadoop classpath in a terminal.
To compile your class against it, use this: javac -cp $(hadoop classpath):path/to/jars/* java_file.java
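
A possible follow-up, with placeholder file and directory names: compile into a separate output directory and package the result into a jar you can then put on the classpath:

mkdir -p classes
javac -d classes -cp $(hadoop classpath) DoubleArrayWritable.java
jar cf myjar.jar -C classes .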




Answer 3:


Try adding your jar file to the default CLASSPATH variable and also appending HADOOP_CLASSPATH to it. Then execute your command:

export CLASSPATH=/your/jar/file/myjar.jar:$CLASSPATH:$HADOOP_CLASSPATH
/path/to/hadoop/script fs -text /path/in/HDFS/to/my/file



Source: https://stackoverflow.com/questions/12940239/hadoop-hadoop-classpath-issues
