I got the following error when starting spark-shell. I'm going to use Spark to process data in SQL Server. Can I ignore the error?
java.io.IOException ...
tl;dr You'd better not.
Well, it may be possible, but since you've only just started your journey into Spark, the effort would not pay off.
Windows has never been a developer-friendly OS to me, and whenever I teach Spark to people who use Windows, I take it for granted that we'll have to walk through the winutils.exe setup, and often through how to work on the command line as well.
Please install winutils.exe as follows:

1. Run cmd as administrator.
2. Put the winutils.exe binary in a directory of your choice, e.g. c:\hadoop\bin.
3. Set HADOOP_HOME to the directory that contains bin (without the bin part), e.g. set HADOOP_HOME=c:\hadoop.
4. Add %HADOOP_HOME%\bin to the PATH environment variable.
5. Create the c:\tmp\hive directory.
6. Run winutils.exe chmod -R 777 \tmp\hive.
7. Start spark-shell and run spark.range(1).show to see a one-row dataset.
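
If it helps, here are steps 3–6 collected as a single cmd session you can adapt. Note that c:\hadoop is just the example path from the steps above (use whatever directory you actually put winutils.exe into), and that set only affects the current cmd window:

```
REM assumes winutils.exe is already saved to c:\hadoop\bin (the example path above)
set HADOOP_HOME=c:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin

REM create the Hive scratch directory and make it writable
mkdir c:\tmp\hive
winutils.exe chmod -R 777 \tmp\hive

REM start the shell and verify with spark.range(1).show
spark-shell
```

If the setup is right, spark.range(1).show should print a one-row table (a single id column with value 0), and the winutils-related IOException should no longer appear at startup.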