Unable to connect to HDFS using PDI step

Submitted by 自古美人都是妖i on 2019-12-21 21:22:39

Question


I have successfully configured Hadoop 2.4 in an Ubuntu 14.04 VM from a Windows 8 system. The Hadoop installation is working absolutely fine, and I am also able to view the NameNode from my Windows browser. Attached image below:

So my hostname is ubuntu and the HDFS port is 9000 (correct me if I am wrong).

core-site.xml:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://ubuntu:9000</value>
</property>
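As a quick sanity check, you can list the HDFS root from inside the Ubuntu VM using the same URI that PDI will use (a minimal sketch, assuming the Hadoop client binaries are on your PATH):

# List the HDFS root via the NameNode URI from core-site.xml above;
# if this fails, the problem is in Hadoop itself, not in PDI.
hdfs dfs -ls hdfs://ubuntu:9000/

Note that since PDI runs on the Windows host, Windows must also be able to resolve the hostname ubuntu (for example via its hosts file) for the same URI to work from the Hadoop Copy Files step.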

The issue arises while connecting to HDFS from my Pentaho Data Integration tool. Attached image below. PDI version: 4.4.0. Step used: Hadoop Copy Files.

Please help me connect to HDFS using PDI. Do I need to install or update any JARs for this? Please let me know if you need more information.


Answer 1:


PDI 4.4, AFAIK, does not support Hadoop 2.4. In any case, there is a property you must set in order to use a particular Hadoop configuration (you may see a "Hadoop configuration" referred to as a "shim" in the forums, etc.). In the data-integration/plugins/pentaho-big-data-plugin/plugin.properties file there is a property called active.hadoop.configuration. It is set by default to "hadoop-20", which refers to an Apache Hadoop 0.20.x distribution. You will want to set it to the newest distro that ships with Pentaho, or build your own shim as described in my blog post:

http://funpdi.blogspot.com/2013/03/pentaho-data-integration-44-and-hadoop.html
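For illustration, a minimal sketch of that edit (the property name, file path, and "hadoop-20" default come from the answer above; the replacement value shown is hypothetical, since the valid names depend on which shim folders actually exist under data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations in your install):

# data-integration/plugins/pentaho-big-data-plugin/plugin.properties
#
# Default, targets Apache Hadoop 0.20.x:
#   active.hadoop.configuration=hadoop-20
#
# Change it to the folder name of a newer shim that ships with your
# PDI version (hypothetical example value):
active.hadoop.configuration=hdp13

Restart Spoon after changing this so the plugin picks up the new active shim.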

Upcoming versions (5.2+) of PDI will support vendor distributions that include Hadoop 2.4+, so keep an eye on the PDI Marketplace and on pentaho.com :)



Source: https://stackoverflow.com/questions/25043374/unable-to-connect-to-hdfs-using-pdi-step
