I'm not able to run a simple Spark job in Scala IDE (Maven Spark project) installed on Windows 7.
The Spark core dependency has been added.
I got the same problem while running unit tests. The following workaround gets rid of the message:
// Point hadoop.home.dir at the working directory and create an empty
// bin/winutils.exe there so Hadoop's binary lookup no longer fails.
File workaround = new File(".");
System.getProperties().put("hadoop.home.dir", workaround.getAbsolutePath());
new File("./bin").mkdirs();
new File("./bin/winutils.exe").createNewFile();
from: https://issues.cloudera.org/browse/DISTRO-544
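For example, if the error shows up while running JUnit tests, the workaround can be executed once before the test class. A minimal sketch, assuming JUnit 4 (the class name SparkJobTest is made up):

import java.io.File;
import java.io.IOException;
import org.junit.BeforeClass;

public class SparkJobTest {

    @BeforeClass
    public static void setUpHadoopHome() throws IOException {
        // Point hadoop.home.dir at the working directory...
        File workaround = new File(".");
        System.setProperty("hadoop.home.dir", workaround.getAbsolutePath());
        // ...and create an empty bin/winutils.exe so the lookup succeeds.
        new File("./bin").mkdirs();
        new File("./bin/winutils.exe").createNewFile();
    }

    // Spark test methods go here.
}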