Question
I have installed the following on my Windows 10 machine to use Apache Spark:
Java, Python 3.6, and Spark (spark-2.3.1-bin-hadoop2.7)
I am trying to write PySpark code in VSCode. It shows a red underline under the `from` statement with the error message:
E0401: Unable to import 'pyspark'
I have also pressed Ctrl+Shift+P and selected "Python: Update workspace Pyspark libraries". It shows the notification message:
Make sure you have SPARK_HOME environment variable set to the root path of the local spark installation!
What is wrong?
Answer 1:
You will need to install the pyspark Python package by running pip install pyspark. In fact, this is the only package you'll need for VSCode, unless you also want to run your Spark applications on the same machine.
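As for the SPARK_HOME notification, a minimal sketch of how you might check the variable from Python before pointing VSCode at a local Spark install (the Windows path below is a hypothetical example, assuming the downloaded archive was unpacked there):

```python
import os

def check_spark_home(env=None):
    """Return the configured Spark root directory, or None if SPARK_HOME is unset."""
    if env is None:
        env = os.environ
    return env.get("SPARK_HOME")

# Hypothetical example: pretend SPARK_HOME points at the unpacked archive.
example_env = {"SPARK_HOME": r"C:\spark\spark-2.3.1-bin-hadoop2.7"}
print(check_spark_home(example_env))  # → C:\spark\spark-2.3.1-bin-hadoop2.7
print(check_spark_home({}))           # → None (variable not set)
```

If `check_spark_home()` returns None in your real environment, set SPARK_HOME (e.g. via the Windows "Environment Variables" dialog) to the root of the unpacked Spark folder before relying on a local installation; with `pip install pyspark` alone, the import error in VSCode should already disappear.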
Source: https://stackoverflow.com/questions/52185767/e0401unable-to-import-pyspark-in-vscode-in-windows-10