The SPARK_HOME env variable is set but Jupyter Notebook doesn't see it. (Windows)

Posted by 前提是你 on 2020-07-17 11:14:27

Question


I'm on Windows 10. I was trying to get Spark up and running in a Jupyter Notebook alongside Python 3.5. I installed a pre-built version of Spark and set the SPARK_HOME environment variable. I installed findspark and ran this code:

import findspark
findspark.init()

I receive a ValueError:

ValueError: Couldn't find Spark, make sure SPARK_HOME env is set or Spark is in an expected location (e.g. from homebrew installation).

However, the SPARK_HOME variable is set. Here is a screenshot showing the list of environment variables on my system.

Has anyone encountered this issue or know how to fix it? I only found an old discussion in which someone had set SPARK_HOME to the wrong folder, but I don't think that's my case.
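
A quick way to confirm whether the notebook's kernel actually inherits the variable is to print it from inside the notebook (a minimal sketch, assuming a standard Jupyter setup):

    import os

    # Prints None if the Jupyter kernel was started before SPARK_HOME
    # was set, or if it was set only for a different user or session.
    print(os.environ.get("SPARK_HOME"))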


Answer 1:


I had the same problem and solved it by installing Vagrant and VirtualBox. (Note that I use Mac OS and Python 2.7.11.)

Take a look at this tutorial, which is for the Harvard CS109 course: https://github.com/cs109/2015lab8/blob/master/installing_vagrant.pdf

After running "vagrant reload" in the terminal, I was able to run my code without errors. Note the difference in the output of os.getcwd() shown in the attached images.




Answer 2:


I had the same problem and wasted a lot of time on it. There are two solutions:

  1. Copy the downloaded Spark folder somewhere under the C: drive and pass that path explicitly, as below:

    import findspark
    findspark.init('C:/spark')
    
  2. Use findspark's find() function to locate the Spark folder automatically (see also the combined sketch after this list):

    import findspark
    findspark.find()
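
Since find() returns the detected Spark home, the two approaches can also be combined (a minimal sketch, assuming findspark can locate your installation):

    import findspark

    # find() returns the detected Spark home; passing it explicitly to
    # init() removes any ambiguity about which installation is used.
    spark_home = findspark.find()
    findspark.init(spark_home)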
    



Answer 3:


Environment variable changes are visible only to processes started after the change, so it works after restarting your system (which relaunches everything, including Jupyter).
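
If a reboot is inconvenient, the variable can also be set inside the notebook session itself before calling findspark (a minimal sketch; the path below is a placeholder for your actual Spark directory):

    import os
    import findspark

    # Placeholder path -- replace with your actual Spark installation.
    os.environ["SPARK_HOME"] = r"C:\spark"
    findspark.init()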




Answer 4:


I had the same problem when installing Spark using pip install pyspark findspark in a conda environment.

The solution was to do this:

export SPARK_HOME=/Users/pete/miniconda3/envs/cenv3/lib/python3.6/site-packages/pyspark/
jupyter notebook

You'll have to substitute the name of your conda environment for cenv3 in the command above.
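
Since the original question is about Windows, the equivalent there would set the variable in cmd before launching Jupyter (a sketch; the path is a placeholder that depends on your environment name and Python version):

    set SPARK_HOME=C:\Users\you\miniconda3\envs\cenv3\Lib\site-packages\pyspark
    jupyter notebook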




Answer 5:


Restarting the system after setting up the environment variables worked for me.




Answer 6:


I had the same problem and solved it by closing cmd and opening it again. I had forgotten that after editing an environment variable on Windows, you have to restart cmd (and anything launched from it, such as Jupyter).




Answer 7:


I got the same error. Initially, I had stored my Spark folder in the Documents directory. Later, when I moved it to the Desktop, it suddenly started recognizing all the system variables, and findspark.init() ran without any error.

Try it out once.



Source: https://stackoverflow.com/questions/38411914/the-spark-home-env-variable-is-set-but-jupyter-notebook-doesnt-see-it-windows
