Unable to read local file with HDFS API in IntelliJ Plugin

Submitted by 放荡痞女 on 2020-02-06 08:01:53

Question


I'm trying to write an IntelliJ plugin that reads local files using the Hadoop HDFS API (because I eventually want to read Parquet files, and the only way to do that is through Hadoop).

I have a minimal codebase. My Gradle build uses the plugins

plugins {
  id 'java'
  id 'org.jetbrains.intellij' version '0.4.16'
}

the dependency

dependencies {
  compile("org.apache.hadoop:hadoop-client:3.2.1")
}

and code

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// `file` is the local file handle provided by the surrounding plugin code
Configuration conf = new Configuration();
Path path = new Path(file.getPath());
try {
  FileSystem fs = path.getFileSystem(conf);
} catch (Exception e) {
  System.out.println(e);
}

My unit tests run without error, but when I build the plugin and run it within IntelliJ I get the error

org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "file"
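For context: Hadoop 3.x discovers FileSystem implementations at runtime via java.util.ServiceLoader, scanning META-INF/services/org.apache.hadoop.fs.FileSystem. If the classloader doing the scan cannot see that service file, the scheme-to-implementation map comes up empty and you get exactly this exception. A minimal pure-JDK sketch of that failure mode (Greeter and ServiceLookupDemo are made-up names, not part of Hadoop):

```java
import java.util.ServiceLoader;

public class ServiceLookupDemo {
    // A service interface with no provider registered under META-INF/services
    // visible to the classloader we query below.
    public interface Greeter { String greet(); }

    public static void main(String[] args) {
        // When the chosen classloader cannot see a provider entry, the lookup
        // silently returns nothing -- the same mechanism behind Hadoop
        // reporting no FileSystem for a scheme when the plugin classloader
        // hides META-INF/services/org.apache.hadoop.fs.FileSystem.
        ServiceLoader<Greeter> loader =
                ServiceLoader.load(Greeter.class, ClassLoader.getPlatformClassLoader());
        System.out.println("providers found: " + loader.stream().count());  // providers found: 0
    }
}
```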

If I follow the suggestion in this SO question and add

conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());

then I get

java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.LocalFileSystem not found
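Setting fs.file.impl only changes which class name Configuration tries to load; the load itself still goes through Configuration's classloader (by default the thread context classloader), and in an isolated plugin classloader that lookup can fail even though the class is on disk. A pure-JDK sketch of that failure mode (IsolatedLoaderDemo is a made-up name):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class IsolatedLoaderDemo {
    public static void main(String[] args) throws Exception {
        // A loader with no URLs and a null parent sees only bootstrap
        // classes. Our own class is invisible to it, so the lookup fails --
        // analogous to Configuration failing to resolve LocalFileSystem when
        // it asks a classloader that does not see the hadoop jars.
        try (URLClassLoader isolated = new URLClassLoader(new URL[0], null)) {
            try {
                Class.forName("IsolatedLoaderDemo", false, isolated);
                System.out.println("loaded");
            } catch (ClassNotFoundException e) {
                System.out.println("ClassNotFoundException: " + e.getMessage());
            }
        }
    }
}
```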

I have tried depending on several combinations of Hadoop jars, and I have tried manually replacing all jars in the plugin zip's libs dir with a single shadow jar (built with mergeServiceFiles() enabled), again with no luck. Everything seems to be in the right place: I can see that the META-INF/services file is correct, all the classes are present in the jar, and so on.
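Those symptoms point at classloader isolation rather than missing jars. One pattern often suggested for such environments (a sketch, not a confirmed fix for this plugin) is to pin the thread context classloader to the plugin's own classloader around the Hadoop calls; Hadoop's Configuration.setClassLoader(...) is the analogous per-Configuration knob. ContextClassLoaderPin is a made-up name:

```java
public class ContextClassLoaderPin {
    public static void main(String[] args) {
        ClassLoader previous = Thread.currentThread().getContextClassLoader();
        try {
            // Point the context classloader at the loader that actually sees
            // our classes (in a plugin, the plugin's own classloader, which
            // also sees the bundled hadoop jars and their service files).
            Thread.currentThread().setContextClassLoader(
                    ContextClassLoaderPin.class.getClassLoader());

            // ... the Hadoop calls (new Configuration(),
            // path.getFileSystem(conf), etc.) would go here ...
            System.out.println("pinned: "
                    + (Thread.currentThread().getContextClassLoader()
                       == ContextClassLoaderPin.class.getClassLoader()));
        } finally {
            // Always restore, so unrelated code on this thread is unaffected.
            Thread.currentThread().setContextClassLoader(previous);
        }
    }
}
```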

Source: https://stackoverflow.com/questions/60019230/unable-to-read-local-file-with-hdfs-api-in-intellij-plugin
