How to correctly set the Python version in Spark?
My Spark version is 2.4.0, and the cluster has both Python 2.7 and Python 3.7; the default is Python 2.7. I want to submit a PySpark program that uses Python 3.7. I tried two ways, but neither of them works. First:

```
spark2-submit --master yarn \
  --conf "spark.pyspark.python=/usr/bin/python3" \
  --conf "spark.pyspark.driver.python=/usr/bin/python3" \
  pi.py
```

This fails with:

```
Cannot run program "/usr/bin/python3": error=13, Permission denied
```

But I actually do have the permission; for example, I can use