New to Spark. I downloaded everything alright, but when I run pyspark I get the following errors:
Type "help", "copyright", "credits" or "license" for more information.
I also encountered this issue on Windows 7 with pre-built Spark 2.2. Here is a possible solution for Windows users:
make sure all the environment variables are set correctly, including SPARK_HOME, HADOOP_HOME, etc. (a sketch of this setup follows after these steps)
get the version of winutils.exe that matches the Hadoop build of your Spark prebuilt package, and put it in %HADOOP_HOME%\bin
then open a cmd prompt as Administrator and run this command (a fuller sketch with a verification step follows after the note):
winutils chmod 777 C:\tmp\hive
Note: the drive might be different depending on where you invoke pyspark or spark-shell.
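
For the environment variables, here is a minimal sketch from an elevated cmd prompt. The install locations C:\spark and C:\hadoop are assumptions; substitute wherever you actually unpacked the prebuilt package and winutils.exe:

:: Assumed install paths; adjust both to your actual locations
setx SPARK_HOME "C:\spark"
setx HADOOP_HOME "C:\hadoop"
:: winutils.exe should sit under %HADOOP_HOME%\bin
:: setx only affects new windows, so open a fresh cmd prompt afterwards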
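
And a sketch of the winutils step under the same assumed paths, creating the scratch directory if Spark has not made it yet and verifying the result:

:: Create the Hive scratch directory if it does not exist yet
if not exist C:\tmp\hive mkdir C:\tmp\hive
:: Open up permissions, then confirm they show as drwxrwxrwx
C:\hadoop\bin\winutils.exe chmod 777 C:\tmp\hive
C:\hadoop\bin\winutils.exe ls C:\tmp\hive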
Credit for this solution goes to the answer by timesking.