rhadoop

Failed to remotely execute R script which loads library “rhdfs”

Submitted by 余生颓废 on 2019-12-02 06:20:34
I'm working on a project using R-Hadoop and ran into this problem. I'm using JSch in Java to SSH into a remote Hadoop pseudo-cluster; here is the part of the Java code that creates the connection:

/* Create a connection instance */
Connection conn = new Connection(hostname);

/* Now connect */
conn.connect();

/* Authenticate */
boolean isAuthenticated = conn.authenticateWithPassword(username, password);
if (isAuthenticated == false)
    throw new IOException("Authentication failed.");

/* Create a session */
Session sess = conn.openSession();
//sess.execCommand("uname -a && date && uptime && who");
sess.execCommand(
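A common pitfall with this setup is that a command run through execCommand executes in a non-interactive shell, so environment variables such as HADOOP_CMD that rhdfs needs are often not set on the remote side. A minimal sketch of an R script that sets them explicitly before loading the library is shown below; the file name and both paths are assumptions and have to be adjusted to the actual Hadoop installation on the remote host.

# remote_rhdfs_test.R -- minimal sketch; the paths below are assumptions.
Sys.setenv(HADOOP_CMD = "/usr/bin/hadoop")                                       # hadoop launcher (assumed path)
Sys.setenv(HADOOP_STREAMING = "/usr/lib/hadoop-mapreduce/hadoop-streaming.jar")  # streaming jar (assumed path)

library(rhdfs)        # errors out if HADOOP_CMD is not set before this call
hdfs.init()           # initialise the rJava-backed connection to HDFS
print(hdfs.ls("/"))   # quick check that HDFS is reachable

The script can then be launched from the Java side with something like sess.execCommand("Rscript /path/to/remote_rhdfs_test.R") (the path is hypothetical), so that everything the script needs is defined inside the script itself rather than inherited from a login shell.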

Streaming Command Failed! in RHADOOP

Submitted by 时光毁灭记忆、已成空白 on 2019-12-01 16:44:46

I have installed RHadoop in a Hortonworks VM. When I run the MapReduce code below to verify the installation, it throws the error "Streaming Command Failed!". I am running as the user rstudio (not root, but it has sudo access). Can anybody help me understand the issue? I have not been able to work out how to solve it.

Sys.setenv(HADOOP_HOME="/usr/hdp/2.2.0.0-2041/hadoop")
Sys.setenv(HADOOP_CMD="/usr/bin/hadoop")
Sys.setenv(HADOOP_STREAMING="/usr/hdp/2.2.0.0-2041/hadoop-mapreduce/hadoop-streaming.jar")
library(rhdfs)
hdfs.init()
library(rmr2)
ints = to.dfs(1:10)
calc = mapreduce(input = ints, map = function(k, v)
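For reference, a complete minimal rmr2 job of the same shape is sketched below; the map function body and the from.dfs call are filled in purely as an illustration (the snippet in the question is cut off), and the environment paths are the ones from the question. Switching to rmr.options(backend = "local") first takes Hadoop streaming out of the picture, which helps narrow down whether the failure comes from the R code or from the streaming setup and permissions.

# Minimal rmr2 verification job -- a sketch, not the poster's exact code.
# Environment paths copied from the question; adjust them to the local HDP install.
Sys.setenv(HADOOP_CMD = "/usr/bin/hadoop")
Sys.setenv(HADOOP_STREAMING = "/usr/hdp/2.2.0.0-2041/hadoop-mapreduce/hadoop-streaming.jar")

library(rhdfs)
library(rmr2)
hdfs.init()

# Uncomment to run on the local backend first and rule out R-side errors:
# rmr.options(backend = "local")

ints <- to.dfs(1:10)                        # write the input vector to HDFS

calc <- mapreduce(
  input = ints,
  map   = function(k, v) keyval(v, v^2)     # illustrative map: emit each value with its square
)

from.dfs(calc)                              # read the job output back from HDFS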
I have installed RHADOOP in Hortonwork VM. when I am running mapreduce code to verify it is throwing an error saying I am using user as :rstudio (not root.but has access to sudoer) Streaming Command Failed! Can anybody help me understanding the issue.I am not getting much idea to solve thios issue. Sys.setenv(HADOOP_HOME="/usr/hdp/2.2.0.0-2041/hadoop") Sys.setenv(HADOOP_CMD="/usr/bin/hadoop") Sys.setenv(HADOOP_STREAMING="/usr/hdp/2.2.0.0-2041/hadoop-mapreduce/hadoop-streaming.jar") library(rhdfs) hdfs.init() library(rmr2) ints = to.dfs(1:10) calc = mapreduce(input = ints, map = function(k, v)