Running Spark on Windows Error 5 (Access Denied) even when running as Admin

Happy的楠姐 · 2021-01-07 08:19

I'm beginning with Spark, so I'm not really sure where my problem is, and I'm looking for a helpful hint here. I'm trying to run Spark (pyspark) on a Windows 7 machine as an admin b

1 Answer

  暖寄归人 · 2021-01-07 08:45

    Briefly:

    I had what should be the same problem. For me, it was that the *.cmd files in the $spark/bin directory weren't marked as executable. Please try to confirm this by:

    • right-clicking on pyspark2.cmd, then
    • opening Properties → Security and examining 'Read & execute'

    I found the workaround on another site, which recommended downloading hadoop-winutils-2.6.0.zip (sorry, I don't have a link). Here is an example of the command to use (after changing to the proper directory):

    t:\hadoop-winutils-2.6.0\bin\winutils.exe chmod 777 *
    

    I also needed to run the chmod 777 command to make /tmp/hive writable. Good luck!
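    Putting both steps together, a minimal sketch of the commands involved. The winutils path and the Hive scratch directory below are assumptions based on the answer above; adjust them to your own extraction location:

        :: Windows cmd sketch -- assumes winutils was extracted to t:\hadoop-winutils-2.6.0
        :: Step 1: mark the Spark launcher scripts in %SPARK_HOME%\bin as executable
        cd /d %SPARK_HOME%\bin
        t:\hadoop-winutils-2.6.0\bin\winutils.exe chmod 777 *

        :: Step 2: make the Hive scratch directory writable (create it first if missing)
        mkdir \tmp\hive
        t:\hadoop-winutils-2.6.0\bin\winutils.exe chmod 777 \tmp\hive

    Note that winutils.exe applies the permission change to the Hadoop-style POSIX attributes Spark checks on Windows, which is why an ordinary icacls or Explorer change may not be enough.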


    Root cause: the tar program I used on Windows (via tar -zxf) did not apply the proper attributes to the extracted files; in this case the executable bit wasn't set on the *.cmd files. Yeah, maybe I should update my version of Cygwin.
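    On a Unix-like shell it is easy to check whether your tar round-trips the executable bit; this is a minimal sketch (file names are made up) that creates an executable file, archives it, extracts it, and inspects the result:

    ```shell
    # Work in a throwaway directory
    set -e
    dir=$(mktemp -d)
    cd "$dir"

    # Create a file and mark it executable, as the Spark *.cmd scripts should be
    printf '#!/bin/sh\necho ok\n' > demo.cmd
    chmod 755 demo.cmd

    # Round-trip it through tar
    tar -czf demo.tgz demo.cmd
    mkdir out
    tar -xzf demo.tgz -C out

    # A healthy tar prints -rwxr-xr-x here; a broken one drops the x bits
    ls -l out/demo.cmd
    ```

    If the extracted copy loses its x bits, the tar in your PATH (Cygwin's, in the answerer's case) is the culprit, and the winutils chmod workaround above is the fix.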
