With a fresh install of Spark 2.1, I am getting an error when executing the pyspark command.
Traceback (most recent call last): File "/usr/local/spark/pytho
I saw this error on a new (2018) Mac, which came with Java 10. The fix was to set JAVA_HOME to Java 8:
export JAVA_HOME=`/usr/libexec/java_home -v 1.8`
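As a quick sanity check (assuming a Java 8 JDK is actually installed, and that you are on macOS where `/usr/libexec/java_home` exists), you can list the installed JDKs, point JAVA_HOME at Java 8, and confirm the shell now picks it up before relaunching pyspark:

```shell
# List all installed JDKs and their paths (macOS only)
/usr/libexec/java_home -V

# Point JAVA_HOME at the Java 8 JDK; note the leading slash in the path
export JAVA_HOME=`/usr/libexec/java_home -v 1.8`

# Verify: java -version should now report 1.8.x
java -version
echo "$JAVA_HOME"
```

To make the change survive new terminal sessions, append the `export` line to your shell profile (e.g. `~/.bash_profile` or `~/.zshrc`, depending on your shell).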