Question
We are currently using hadoop-2.8.0 on a 10-node cluster and are planning to upgrade to the latest hadoop-3.0.0.
I want to know whether there will be any issues if we use hadoop-3.0.0 with older versions of Spark and other components such as Hive, Pig, and Sqoop.
Answer 1:
The latest Hive release does not support Hadoop 3.0. It seems that Hive may be built on Spark or other execution engines in the future.
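Before upgrading, it is worth comparing the major version each component was built against with the target Hadoop line. A minimal sketch of that check is below; the version strings are hard-coded for illustration (assumptions, not readings from a real cluster), whereas in practice they would come from commands such as `hadoop version`, `hive --version`, or `sqoop version`:

```shell
#!/bin/sh
# Sketch: flag components whose major version lags the target Hadoop major.
# The versions below are illustrative placeholders, not real cluster output.

target_major=3

check_component() {
  name=$1
  version=$2
  major=${version%%.*}   # everything before the first dot, e.g. "2" from "2.8.0"
  if [ "$major" -lt "$target_major" ]; then
    echo "$name $version: built for an older Hadoop line, verify compatibility"
  else
    echo "$name $version: matches target major version $target_major"
  fi
}

check_component hadoop 2.8.0
check_component hive 2.3.2
```

This only compares major versions; a real pre-upgrade check should also consult each project's release notes for the Hadoop versions it was actually tested against.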
Source: https://stackoverflow.com/questions/47920005/how-is-hadoop-3-0-0-s-compatibility-with-older-versions-of-hive-pig-sqoop-and