I'm beginning with Spark, so I'm not really sure where my problem is, and I'm looking for a helpful hint here. I'm trying to run Spark (pyspark) on a Windows 7 machine as an admin b…
Briefly:
I had what should be the same problem. For me, it was that the *.cmd files in the $spark/bin directory weren't marked as executable; please try to confirm by checking whether pyspark2.cmd (and the other scripts) are marked executable, as sketched below.
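A quick way to check under Cygwin might look like this (assuming $spark points at your Spark install directory; adjust the path to yours):
# the permissions column should show rwx for the *.cmd scripts; rw- means the executable bit is missing
ls -l $spark/bin/*.cmd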
I found the workaround on another site that recommended downloading hadoop-winutils-2.6.0.zip (sorry, I don't have a link). Here is an example of the command to use (after moving to the proper directory):
t:\hadoop-winutils-2.6.0\bin\winutils.exe chmod 777 *
I did need to run the chmod 777 command to make /tmp/hive writable too.
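For reference, that step looked roughly like this (same winutils path as above; the drive letter is just where my \tmp\hive happened to be):
t:\hadoop-winutils-2.6.0\bin\winutils.exe chmod 777 \tmp\hive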
Good luck!
(I'm new here - sorry for the poor formatting.)
(Update: Matt, thanks for fixing the formatting issues!)
Root cause: the tar program I used on Windows (via tar -zxf) did not apply the proper attributes to the extracted files; in this case, the 'executable' bits weren't set. Yeah, maybe I should update my version of Cygwin.
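If you hit the same thing, re-marking the scripts executable from Cygwin should be enough (again assuming $spark is your Spark directory):
# restore the executable bit that tar dropped during extraction
chmod +x $spark/bin/*.cmd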