New to Spark. Downloaded everything alright but when I run pyspark I get the following errors:
Type "help", "copyright", "credits" or "license" for more information.
If you're on a Mac and you've installed Spark (and possibly Hive) through Homebrew, the answers from @Eric Pettijohn and @user7772046 will not work: the former because Homebrew's Spark already contains the aforementioned jar file, and the latter because it is a purely Windows-based solution.
Inspired by this link and the hint about permission issues, I came up with the following simple workaround: launch pyspark using sudo. The Hive-related errors go away.
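In concrete terms, the workaround is just running the launcher with elevated privileges (a sketch; the `pyspark` command assumes Homebrew's Spark is already on your PATH, and note that running as root is a stopgap rather than a proper permissions fix):

```shell
# Launch the PySpark shell as root so Spark/Hive can write to its
# scratch directories without the permission errors described above.
# Workaround only: root-owned files created this way may cause the
# same errors later when you run pyspark as a regular user.
sudo pyspark
```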