I'm attempting to compile libhdfs (the native shared library that lets external applications interface with HDFS). It's one of the steps I have to take to mount Hadoop's HDFS using FUSE.
The compilation seems to go well for a while but finishes with "BUILD FAILED" and the following problems summary:
commons-logging#commons-logging;1.0.4: configuration not found in commons-logging#commons-logging;1.0.4: 'master'. It was required from org.apache.hadoop#Hadoop;working@btsotbal800 commons-logging
log4j#log4j;1.2.15: configuration not found in log4j#log4j;1.2.15: 'master'. It was required from org.apache.hadoop#Hadoop;working@btsotbal800 log4j
Now, I have a couple of questions about this, since the book I'm following doesn't go into any detail about what these things really are.
- Are commons-logging and log4j libraries that Hadoop uses?
- These libraries seem to live in $HADOOP_HOME/lib, but they are jar files. Should I extract them, try to change some configuration, and then repack them into jars?
- What does 'master' in the errors above mean? Are there different versions of the libraries?
Thank you in advance for ANY insight you can provide.
If you are using Cloudera Hadoop (CDH3u2), you don't need to build the fuse project yourself.
You can find the binaries (libhdfs.so*) in the directory $HADOOP_HOME/c++/lib.
Before mounting with FUSE, update $HADOOP_HOME/contrib/fuse-dfs/src/fuse_dfs_wrapper.sh as follows:
$HADOOP_HOME/contrib/fuse-dfs/src/fuse_dfs_wrapper.sh
#!/bin/bash
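# Add the Hadoop core jar(s) and every jar under lib/ to the classpath
# so that fuse_dfs/libhdfs can find the HDFS client classes.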
for f in ${HADOOP_HOME}/hadoop*.jar ; do
export CLASSPATH=$CLASSPATH:$f
done
for f in ${HADOOP_HOME}/lib/*.jar ; do
export CLASSPATH=$CLASSPATH:$f
done
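# Put fuse_dfs on the PATH and make libhdfs.so and libjvm.so visible to the
# dynamic linker; adjust the JVM directory to match your own Java installation.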
export PATH=$HADOOP_HOME/contrib/fuse-dfs:$PATH
export LD_LIBRARY_PATH=$HADOOP_HOME/c++/lib:/usr/lib/jvm/java-6-sun-1.6.0.26/jre/lib/amd64/server/
fuse_dfs "$@"
LD_LIBRARY_PATH lists the directories needed at run time: "$HADOOP_HOME/c++/lib" contains libhdfs.so, and "/usr/lib/jvm/java-6-sun-1.6.0.26/jre/lib/amd64/server/" contains libjvm.so.
Modify /usr/lib/jvm/java-6-sun-1.6.0.26/jre/lib/amd64/server/ to match your own JAVA_HOME.
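If fuse_dfs fails to start with errors about missing shared libraries, check that both files actually exist at those paths (the paths below match the example above; adjust them to your installation):
ls $HADOOP_HOME/c++/lib/libhdfs.so*
ls /usr/lib/jvm/java-6-sun-1.6.0.26/jre/lib/amd64/server/libjvm.so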
Use the following command to mount HDFS:
fuse_dfs_wrapper.sh dfs://localhost:9000/ /home/510600/mount1
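To confirm the filesystem is actually mounted (using the same example mount point as above), you can check the mount table and list the directory:
mount | grep fuse
ls /home/510600/mount1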
To unmount, use the following command:
fusermount -u /home/510600/mount1
I tested FUSE only in Hadoop pseudo-distributed mode, not in cluster mode.
Source: https://stackoverflow.com/questions/6699851/trying-to-use-fuse-to-mount-hdfs-cant-compile-libhdfs