Hadoop Mini Cluster Mock (MiniDFSCluster)

Submitted 2019-12-21 06:57:59

Question


I need your help with hadoop-minicluster.

I'm working in Scala (with sbt) and I'm trying to mock HDFS calls. I saw that hadoop-minicluster can deploy a small in-process cluster to test against.

However, when I add the sbt dependency:

libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "3.1.0" % Test

the classes are not added, and I can't import the package org.apache.hadoop.hdfs.MiniDFSCluster.

Do you know how I can solve this problem?

Thank you for your answers.


Answer 1:


Surprisingly, it's not in hadoop-minicluster. Try libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % HADOOP_VERSION classifier "tests"

You may also have to exclude some components, such as "org.apache.hadoop" % "hadoop-hdfs" % HADOOP_VERSION classifier "tests" exclude ("javax.servlet", "servlet-api")
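Put together, the suggestion above might look like this in build.sbt (the version is pinned to 3.1.0 from the question; adjust to your Hadoop release):

```scala
// build.sbt -- MiniDFSCluster lives in the test-classified hadoop-hdfs artifact,
// not in hadoop-minicluster; the old servlet-api is excluded to avoid conflicts.
val hadoopVersion = "3.1.0"

libraryDependencies +=
  "org.apache.hadoop" % "hadoop-hdfs" % hadoopVersion % Test classifier "tests" exclude ("javax.servlet", "servlet-api")
```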




Answer 2:


Thank you very much for your answer.

So to get the test classes and the regular classes (for example DistributedFileSystem), I use these lines in my sbt file:

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "3.1.0" % Test classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "3.1.0" % Test classifier "tests"

Hadoop-common was needed to compile.
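With those dependencies in place, a minimal sketch of spinning up a MiniDFSCluster in a test could look like the following (the temp-directory setup and the `/test` path are illustrative assumptions, not from the thread):

```scala
import java.nio.file.Files
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hdfs.{DistributedFileSystem, MiniDFSCluster}

// Point the mini cluster's storage at a fresh temp directory so test
// runs don't interfere with each other.
val baseDir = Files.createTempDirectory("minidfs").toFile
val conf = new Configuration()
conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath)

// Start a single-node in-process HDFS cluster.
val cluster = new MiniDFSCluster.Builder(conf).build()
try {
  val fs: DistributedFileSystem = cluster.getFileSystem
  fs.mkdirs(new Path("/test"))            // exercise the in-process HDFS
  assert(fs.exists(new Path("/test")))
} finally {
  cluster.shutdown()                       // release ports and temp files
}
```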

However, I now have another problem when I run my tests:

An exception or error caused a run to abort: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z 

java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

I saw that this is about having HADOOP_HOME in the path; I set it, but nothing changed.
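For what it's worth, this UnsatisfiedLinkError is commonly reported on Windows when Hadoop cannot find winutils.exe. One frequently suggested workaround is to set the hadoop.home.dir system property before any Hadoop class is loaded (the C:\hadoop path below is an assumption; it must contain bin\winutils.exe):

```scala
// Workaround sketch for the NativeIO$Windows.access0 UnsatisfiedLinkError:
// point hadoop.home.dir at a directory whose bin\ subfolder holds winutils.exe.
// The path is a placeholder -- adjust it to your local installation.
System.setProperty("hadoop.home.dir", "C:\\hadoop")
```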



Source: https://stackoverflow.com/questions/49895193/hadoop-mini-cluster-mock-minidfscluster
