With a fresh install of Spark 2.1, I am getting an error when executing the pyspark command.
Traceback (most recent call last): File "/usr/local/spark/pytho
You are missing the spark-hive jar. For example, if you are running Scala 2.11 with Spark 2.1, you can use this jar:
https://mvnrepository.com/artifact/org.apache.spark/spark-hive_2.11/2.1.0
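A sketch of how you could make that jar available when launching pyspark, either by letting Spark resolve the Maven coordinates for you or by pointing at a locally downloaded copy (the local path below is an assumption; adjust it to wherever you saved the jar):

```shell
# Option 1: have Spark fetch the jar from Maven Central at launch time
# (coordinates match the Scala 2.11 / Spark 2.1.0 build linked above)
pyspark --packages org.apache.spark:spark-hive_2.11:2.1.0

# Option 2: pass a manually downloaded jar explicitly
# (hypothetical path -- replace with the actual location of the jar)
pyspark --jars /path/to/spark-hive_2.11-2.1.0.jar
```

Both `--packages` and `--jars` are also accepted by `spark-submit`, so the same flags work for submitted applications, not just the interactive shell.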