Unable to import SparkContext
Question: I'm working on CentOS. I've set up $SPARK_HOME and added Spark's bin directory to $PATH, so I can run pyspark from anywhere. But when I create a Python file that uses this statement:

    from pyspark import SparkConf, SparkContext

it throws the following error:

    python pysparktask.py
    Traceback (most recent call last):
      File "pysparktask.py", line 1, in <module>
        from pyspark import SparkConf, SparkContext
    ModuleNotFoundError: No module named 'pyspark'

I tried to install it again using pip. pip install
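The usual cause of this error is that the `pyspark` package ships inside the Spark distribution rather than in site-packages, so plain `python` can't find it even though the `bin/pyspark` launcher can. A minimal sketch of the common workaround, assuming Spark's standard layout ($SPARK_HOME/python for pyspark and a py4j zip under python/lib; the function name is illustrative, not part of any API):

```python
import glob
import os
import sys

def add_pyspark_to_path(spark_home):
    """Append Spark's bundled Python packages to sys.path.

    Mirrors what the bin/pyspark launcher does: the pyspark package
    lives under $SPARK_HOME/python, and its py4j dependency ships as
    a zip archive under $SPARK_HOME/python/lib.
    """
    paths = [os.path.join(spark_home, "python")]
    # py4j's zip name carries a version, so match it with a glob
    paths += glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*.zip"))
    for p in paths:
        if p not in sys.path:
            sys.path.insert(0, p)
    return paths

# Call this before "from pyspark import SparkConf, SparkContext",
# e.g. add_pyspark_to_path(os.environ["SPARK_HOME"])
```

Alternatively, exporting the same directories in PYTHONPATH (or installing pyspark from PyPI) achieves the same effect without touching the script.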